WorldWideScience

Sample records for computer tasks quantified

  1. Eye blink frequency during different computer tasks quantified by electrooculography

    DEFF Research Database (Denmark)

    Skotte, J H; Nøjgaard, J K; Jørgensen, L V

    2007-01-01

    The purpose of the study was to evaluate electrooculography (EOG) as an automatic method to measure the human eye blink frequency (BF) during passive and interactive computer tasks performed at two screen heights. Ten healthy subjects (5 males and 5 females) participated in the study in a 23 degrees C temperature and 30-35% relative humidity controlled simulated office environment. Each test subject completed a 2 x 10 min active task of computer work and a 3 x 10 min passive task of watching a film on a video display unit (VDU). Both tasks included two viewing angles: standard (the monitors [...] counted manually from the video recordings and compared to the EOG measurements. The method showed a high validity to detect blinks during computer work: 95.4% of the blinks were retrieved by the EOG method and very few artefacts from eye movements were erroneously classified as eye blinks (2.4%). By use [...]
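    As a reading aid (not part of the record), the two validity figures quoted above can be reproduced from raw counts as in the sketch below; the counts are hypothetical, since the abstract reports only the percentages.

```python
# Hypothetical sketch: reproduce the two validity figures in the abstract, i.e. the
# share of manually counted blinks retrieved by EOG and the share of EOG detections
# that were actually eye-movement artefacts. Counts are made up for illustration.

def blink_detection_validity(manual_blinks, eog_true_detections, eog_artefact_detections):
    """Return (retrieval rate %, artefact misclassification %)."""
    retrieval_rate = 100.0 * eog_true_detections / manual_blinks
    total_detections = eog_true_detections + eog_artefact_detections
    artefact_rate = 100.0 * eog_artefact_detections / total_detections
    return retrieval_rate, artefact_rate

# Illustrative numbers only (the abstract does not report raw counts):
print(blink_detection_validity(manual_blinks=1000,
                               eog_true_detections=954,
                               eog_artefact_detections=24))
```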

  2. Computer task performance by subjects with Duchenne muscular dystrophy.

    Science.gov (United States)

    Malheiros, Silvia Regina Pinheiro; da Silva, Talita Dias; Favero, Francis Meire; de Abreu, Luiz Carlos; Fregni, Felipe; Ribeiro, Denise Cardoso; de Mello Monteiro, Carlos Bandeira

    2016-01-01

    Two specific objectives were established to quantify computer task performance among people with Duchenne muscular dystrophy (DMD). First, we compared simple computational task performance between subjects with DMD and age-matched typically developing (TD) subjects. Second, we examined correlations between the ability of subjects with DMD to learn the computational task and their motor functionality, age, and initial task performance. The study included 84 individuals (42 with DMD, mean age of 18±5.5 years, and 42 age-matched controls). They executed a computer maze task; all participants performed the acquisition (20 attempts) and retention (five attempts) phases, repeating the same maze. A different maze was used to verify transfer performance (five attempts). The Motor Function Measure Scale was applied, and the results were compared with maze task performance. In the acquisition phase, a significant decrease was found in movement time (MT) between the first and last acquisition block, but only for the DMD group. For the DMD group, MT during transfer was shorter than during the first acquisition block, indicating improvement from the first acquisition block to transfer. In addition, the TD group showed shorter MT than the DMD group across the study. DMD participants improved their performance after practicing a computational task; however, the difference in MT was present in all attempts among DMD and control subjects. Computational task improvement was positively influenced by the initial performance of individuals with DMD. In turn, the initial performance was influenced by their distal functionality but not their age or overall functionality.

  3. Computer-Related Task Performance

    DEFF Research Database (Denmark)

    Longstreet, Phil; Xiao, Xiao; Sarker, Saonee

    2016-01-01

    The existing information system (IS) literature has acknowledged computer self-efficacy (CSE) as an important factor contributing to enhancements in computer-related task performance. However, the empirical results of CSE on performance have not always been consistent, and increasing an individual's CSE is often a cumbersome process. Thus, we introduce the theoretical concept of self-prophecy (SP) and examine how this social influence strategy can be used to improve computer-related task performance. Two experiments are conducted to examine the influence of SP on task performance. Results show that SP and CSE interact to influence performance. Implications are then discussed in terms of organizations' ability to increase performance.

  4. The Composite Strain Index (COSI) and Cumulative Strain Index (CUSI): methodologies for quantifying biomechanical stressors for complex tasks and job rotation using the Revised Strain Index.

    Science.gov (United States)

    Garg, Arun; Moore, J Steven; Kapellusch, Jay M

    2017-08-01

    The Composite Strain Index (COSI) quantifies biomechanical stressors for complex tasks consisting of exertions at different force levels and/or with different exertion times. The Cumulative Strain Index (CUSI) further integrates biomechanical stressors from different tasks to quantify exposure for the entire work shift. The paper provides methodologies to compute COSI and CUSI along with examples. Complex task simulation produced 169,214 distinct tasks. Use of average, time-weighted average (TWA) and peak force and COSI classified 66.9, 28.2, 100 and 38.9% of tasks as hazardous, respectively. For job rotation the simulation produced 10,920 distinct jobs. TWA COSI, peak task COSI and CUSI classified 36.5, 78.1 and 66.6% jobs as hazardous, respectively. The results suggest that the TWA approach systematically underestimates the biomechanical stressors and peak approach overestimates biomechanical stressors, both at the task and job level. It is believed that the COSI and CUSI partially address these underestimations and overestimations of biomechanical stressors. Practitioner Summary: COSI quantifies exposure when applied hand force and/or duration of that force changes during a task cycle. CUSI integrates physical exposures from job rotation. These should be valuable tools for designing and analysing tasks and job rotation to determine risk of musculoskeletal injuries.
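    For orientation only (the abstract does not give the COSI/CUSI equations), the average, time-weighted average (TWA) and peak force summaries that the paper contrasts with COSI are typically computed from per-exertion forces and durations of a complex task, as in this sketch with hypothetical numbers.

```python
# Illustrative sketch with hypothetical data: average, time-weighted average (TWA) and
# peak force for a complex task made of exertions at different force levels/durations.

exertions = [  # (force as %MVC, exertion time in seconds) for one task cycle
    (15.0, 4.0),
    (40.0, 1.5),
    (70.0, 0.5),
]

total_time = sum(t for _, t in exertions)
twa_force = sum(f * t for f, t in exertions) / total_time   # time-weighted average force
avg_force = sum(f for f, _ in exertions) / len(exertions)   # simple average force
peak_force = max(f for f, _ in exertions)                   # peak force

print(f"TWA={twa_force:.1f} %MVC, average={avg_force:.1f} %MVC, peak={peak_force:.1f} %MVC")
```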

  5. LHCb computing tasks

    CERN Document Server

    Binko, P

    1998-01-01

    This document describes the computing tasks of the LHCb computing system. It also describes the logistics of the dataflow between the tasks and the detailed requirements for each task, in particular the data sizes and CPU power requirements. All data sizes are calculated assuming that the LHCb experiment will take data about 10^7 s per year at a frequency of 200 Hz, which gives 2 × 10^9 real events per year. The raw event size should not exceed 100 kB (200 TB per year). We will have to generate about 10^9 Monte Carlo events per year. The current Monte Carlo simulation program based on the GEANT3.21 package requires about 12 s to produce an average event (all CPU times are normalised to a 1000 MIPS processor). The size of an average Monte Carlo event will be about 200 kB (100 TB per year) of simulated data (without the hits). We will start to use the GEANT4 package in 1998. Rejection factors of 8 and 25 are required in the Level-2 and Level-3 triggers respectively, to reduce the frequency of events to 200 Hz. T...
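    A quick back-of-the-envelope check of the quoted raw-data arithmetic (assuming 10^7 s of data taking per year at 200 Hz and 100 kB per raw event, as stated above):

```python
# Back-of-the-envelope check (not from the document) of the raw-data arithmetic quoted
# above, assuming 10^7 s of data taking per year at a 200 Hz event rate.

seconds_per_year = 1e7
trigger_rate_hz = 200
raw_event_kb = 100

real_events = seconds_per_year * trigger_rate_hz      # = 2e9 events per year
raw_volume_tb = real_events * raw_event_kb / 1e9      # kB -> TB (1 TB = 10^9 kB)

print(f"{real_events:.1e} real events/year -> {raw_volume_tb:.0f} TB of raw data/year")
```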

  6. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    Science.gov (United States)

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  7. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    Science.gov (United States)

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  8. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were…

  9. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention.

    Science.gov (United States)

    Silva, Alessandro P; Frère, Annie F

    2011-08-19

    Recent studies indicate that the blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and the testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on the performance of people in a fun and interactive way, being appropriate for the target audience. An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret the hints scattered in different scenarios. Two versions of this game were implemented. In the first, hints and information boards were painted using red and green colours. In the second version, these objects were painted using blue and yellow colours. For modelling, texturing, and animating virtual characters and objects the three-dimensional computer graphics tool Blender 3D was used. The textures were created with the GIMP editor to provide visual effects increasing the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers who were divided into two subgroups (A1 and A2) and 20 volunteers with ADHD who were divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with the hints painted in green-red colors, and subgroups A2 and B2 the second version using the same hints now painted in blue-yellow. The time spent to complete each task of the game was measured. Data analyzed with two-way ANOVA and post-hoc Tukey LSD showed that the use of blue/yellow instead of green/red colors decreased the game performance of all participants. However, a greater decrease in performance was observed with ADHD participants, where tasks that require attention were most affected.

  10. Coherence and computational complexity of quantifier-free dependence logic formulas

    NARCIS (Netherlands)

    Kontinen, J.; Kontinen, J.; Väänänen, J.

    2010-01-01

    We study the computational complexity of the model checking for quantifier-free dependence logic (D) formulas. We point out three thresholds in the computational complexity: logarithmic space, non-deterministic logarithmic space and non-deterministic polynomial time.

  11. Task and Interruption Management in Activity-Centric Computing

    DEFF Research Database (Denmark)

    Jeuris, Steven

    [...] to address these not in isolation, but by fundamentally reevaluating the current computing paradigm. To this end, activity-centric computing has been brought forward as an alternative computing paradigm, addressing the increasing strain put on modern-day computing systems. Activity-centric computing follows [...] the scalability and intelligibility of current research prototypes. In this dissertation, I postulate that such issues arise due to a lack of support for the full set of practices which make up activity management. Most notably, although task and interruption management are an integral part of personal information management, they have thus far been neglected in prior activity-centric computing systems. Advancing the research agenda of activity-centric computing, I (1) implement and evaluate an activity-centric desktop computing system, incorporating support for interruptions and long-term task management...

  12. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    Science.gov (United States)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. Firstly, a cloud computing task scheduling model is established and a fitness function is defined according to the model; the improved differential evolution algorithm then optimizes this fitness function, using a generation-dependent dynamic selection strategy together with a dynamic mutation strategy to ensure both global and local search ability. The performance test experiment was carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and save user cost, achieving effective optimal scheduling of cloud computing tasks.
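    For illustration only, a generic differential evolution loop applied to a toy cloud scheduling problem with makespan as the fitness is sketched below; it does not reproduce the paper's improved dynamic selection and mutation strategies, and the task lengths and VM speeds are hypothetical.

```python
# Generic differential evolution sketch for cloud task scheduling (makespan as fitness).
# NOT the paper's improved algorithm; all problem data are hypothetical.
import random

task_len = [400, 250, 600, 150, 500, 300, 700, 200]   # task lengths (e.g. MI)
vm_speed = [1000, 1500, 2000]                          # VM speeds (e.g. MIPS)
NP, F, CR, GENERATIONS = 20, 0.5, 0.9, 200

def decode(vec):
    # map each continuous gene to a VM index
    return [int(g) % len(vm_speed) for g in vec]

def makespan(vec):
    load = [0.0] * len(vm_speed)
    for task, vm in zip(task_len, decode(vec)):
        load[vm] += task / vm_speed[vm]
    return max(load)

pop = [[random.uniform(0, len(vm_speed)) for _ in task_len] for _ in range(NP)]
for _ in range(GENERATIONS):
    for i in range(NP):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [x + F * (y - z) for x, y, z in zip(a, b, c)]
        trial = [m if random.random() < CR else x for m, x in zip(mutant, pop[i])]
        if makespan(trial) <= makespan(pop[i]):      # greedy selection
            pop[i] = trial

best = min(pop, key=makespan)
print("best assignment:", decode(best), "makespan:", round(makespan(best), 2))
```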

  13. Quantifying the Physiological Stress Response to Simulated Maritime Pilotage Tasks: The Influence of Task Complexity and Pilot Experience.

    Science.gov (United States)

    Main, Luana C; Wolkow, Alexander; Chambers, Timothy P

    2017-11-01

    The aim of this study was to quantify the stress associated with performing maritime pilotage tasks in a high-fidelity simulator. Eight trainee and 13 maritime pilots completed two simulated pilotage tasks of varying complexity. Salivary cortisol samples were collected pre- and post-simulation for both trials. Heart rate was measured continuously throughout the study. Significant changes in salivary cortisol (P = 0.000, η = 0.139), average (P = 0.006, η = 0.087), and peak heart rate (P = 0.013, η = 0.077) from pre- to postsimulation were found. Varying task complexity did partially influence stress response; average (P = 0.016, η = 0.026) and peak heart rate (P = 0.034, η = 0.020) were higher in the experimental condition. Trainees also recorded higher average (P = 0.000, η = 0.054) and peak heart rates (P = 0.027, η = 0.022). Performing simulated pilotage tasks evoked a measurable stress response in both trainee and expert maritime pilots.

  14. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention

    Directory of Open Access Journals (Sweden)

    Frère Annie F

    2011-08-01

    Full Text Available Abstract Background Recent studies indicate that the blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and the testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on the performance of people in a fun and interactive way, being appropriate for the target audience. Methods An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret the hints scattered in different scenarios. Two versions of this game were implemented. In the first, hints and information boards were painted using red and green colours. In the second version, these objects were painted using blue and yellow colours. For modelling, texturing, and animating virtual characters and objects the three-dimensional computer graphics tool Blender 3D was used. The textures were created with the GIMP editor to provide visual effects increasing the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers who were divided into two subgroups (A1 and A2) and 20 volunteers with ADHD who were divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with the hints painted in green-red colors, and subgroups A2 and B2 the second version using the same hints now painted in blue-yellow. The time spent to complete each task of the game was measured. Results Data analyzed with two-way ANOVA and post-hoc Tukey LSD showed that the use of blue/yellow instead of green/red colors decreased the game performance of all participants. However, a greater decrease in performance could be observed with ADHD participants…

  15. PARTICLE SWARM OPTIMIZATION OF TASK SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    Payal Jaglan*, Chander Diwakar

    2016-01-01

    Resource provisioning and pricing models in cloud computing make it an inevitable technology on both the developer and consumer end. Easy accessibility of software and freedom of hardware configuration increase its demand in the IT industry, as does its ability to provide a user-friendly environment, software independence, quality, a pricing index and easy accessibility of infrastructure via the internet. Task scheduling plays an important role in cloud computing systems. Task scheduling in cloud computing mea...

  16. Report of the Task Force on Computer Charging.

    Science.gov (United States)

    Computer Co-ordination Group, Ottawa (Ontario).

    The objectives of the Task Force on Computer Charging as approved by the Committee of Presidents of Universities of Ontario were: (1) to identify alternative methods of costing computing services; (2) to identify alternative methods of pricing computing services; (3) to develop guidelines for the pricing of computing services; (4) to identify…

  17. TME (Task Mapping Editor): tool for executing distributed parallel computing. TME user's manual

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro; Imamura, Toshiyuki

    2000-03-01

    At the Center for Promotion of Computational Science and Engineering, the software environment PPExe has been developed to support scientific computing on a parallel computer cluster (distributed parallel scientific computing). TME (Task Mapping Editor) is one of the components of PPExe and provides a visual programming environment for distributed parallel scientific computing. Users can specify data dependences among tasks (programs) visually as a data flow diagram and map these tasks onto computers interactively through the GUI of TME. The specified tasks are processed by other components of PPExe such as the Meta-scheduler, RIM (Resource Information Monitor), and EMS (Execution Management System) according to the execution order of these tasks determined by TME. In this report, we describe the usage of TME. (author)

  18. SU-F-J-178: A Computer Simulation Model Observer for Task-Based Image Quality Assessment in Radiation Therapy

    International Nuclear Information System (INIS)

    Dolly, S; Mutic, S; Anastasio, M; Li, H; Yu, L

    2016-01-01

    Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation
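    As a side note (not from the abstract, which relies on a previously defined TOC curve), the area under such a curve can be approximated from sampled points with the trapezoidal rule; the curve values below are made up for illustration.

```python
# Minimal sketch: trapezoidal approximation of an area under a curve, as one would do
# to summarize a sampled TOC curve by its AUTOC. The points here are hypothetical.

def area_under_curve(x, y):
    """Trapezoidal approximation of the area under y(x); x must be sorted."""
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for x1, x2, y1, y2 in zip(x[:-1], x[1:], y[:-1], y[1:]))

x = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]          # hypothetical TOC abscissa
y = [0.0, 0.55, 0.75, 0.85, 0.92, 1.0]      # hypothetical TOC ordinate
print(f"AUTOC ~ {area_under_curve(x, y):.3f}")
```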

  19. Quantified measurement of brain blood volume: comparative evaluations between the single photon emission computer tomography and the positron computer tomography

    International Nuclear Information System (INIS)

    Bouvard, G.; Fernandez, Y.; Petit-Taboue, M.C.; Derlon, J.M.; Travere, J.M.; Le Poec, C.

    1991-01-01

    The quantified measurement of cerebral blood volume is interesting for brain blood circulation studies. This measurement is often used in positron computed tomography. It is more difficult in single photon emission computed tomography: there are physical problems with the limited resolution of the detector, the Compton effect and the photon attenuation. The objective of this study is to compare the results between these two techniques. The quantified measurement of brain blood volume is possible with single photon emission computed tomography. However, there is a loss of contrast [fr]

  20. Optimal usage of computing grid network in the fields of nuclear fusion computing task

    International Nuclear Information System (INIS)

    Tenev, D.

    2006-01-01

    Nowadays nuclear power is becoming a main source of energy. To make its usage more efficient, scientists have created complicated simulation models, which require powerful computers. Grid computing is the answer to powerful and accessible computing resources. The article examines and estimates the optimal configuration of the grid environment for complicated nuclear fusion computing tasks. (author)

  1. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  2. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    Science.gov (United States)

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.

  3. Task conflict and proactive control: A computational theory of the Stroop task.

    Science.gov (United States)

    Kalanthroff, Eyal; Davelaar, Eddy J; Henik, Avishai; Goldfarb, Liat; Usher, Marius

    2018-01-01

    The Stroop task is a central experimental paradigm used to probe cognitive control by measuring the ability of participants to selectively attend to task-relevant information and inhibit automatic task-irrelevant responses. Research has revealed variability in both experimental manipulations and individual differences. Here, we focus on a particular source of Stroop variability, the reverse-facilitation (RF; faster responses to nonword neutral stimuli than to congruent stimuli), which has recently been suggested as a signature of task conflict. We first review the literature that shows RF variability in the Stroop task, both with regard to experimental manipulations and to individual differences. We suggest that task conflict variability can be understood as resulting from the degree of proactive control that subjects recruit in advance of the Stroop stimulus. When the proactive control is high, task conflict does not arise (or is resolved very quickly), resulting in regular Stroop facilitation. When proactive control is low, task conflict emerges, leading to a slow-down in congruent and incongruent (but not in neutral) trials and thus to Stroop RF. To support this suggestion, we present a computational model of the Stroop task, which includes the resolution of task conflict and its modulation by proactive control. Results show that our model (a) accounts for the variability in Stroop-RF reported in the experimental literature, and (b) solves a challenge to previous Stroop models-their ability to account for reaction time distributional properties. Finally, we discuss theoretical implications to Stroop measures and control deficits observed in some psychopathologies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Computational tasks in robotics and factory automation

    NARCIS (Netherlands)

    Biemans, Frank P.; Vissers, C.A.

    1988-01-01

    The design of Manufacturing Planning and Control Systems (MPCSs), systems that negotiate with Customers and Suppliers to exchange products in return for money in order to generate profit, is discussed. The computational tasks of MPCS components are systematically specified as a starting point for…

  5. The importance of task appropriateness in computer-supported collaborative learning

    Directory of Open Access Journals (Sweden)

    Kathy Buckner

    1999-12-01

    Full Text Available The study of learning in collaborative electronic environments is becoming established as Computer Supported Collaborative Learning (CSCL), an emergent sub-discipline of the more established Computer Supported Co-operative Work (CSCW) discipline (Webb, 1995). Using computers for the development of shared understanding through collaboration has been explored by Crook, who suggests that success may depend partly on having a clearly specified purpose or goal (Crook, 1994). It is our view that the appropriateness of the task given to the student is central to the success or otherwise of the learning experience. However, the tasks that are given to facilitate collaborative learning in face-to-face situations are not always suitable for direct transfer to the electronic medium. It may be necessary to consider redesigning these tasks in relation to the medium in which they are to be undertaken and the functionality of the electronic conferencing software used.

  6. Sort-Mid tasks scheduling algorithm in grid computing.

    Science.gov (United States)

    Reda, Naglaa M; Tawfik, A; Marzok, Mohamed A; Khamis, Soheir M

    2015-11-01

    Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. The main aim of several researchers is to develop variant scheduling algorithms for achieving optimality, and these have shown good performance for task scheduling regarding resource selection. However, using the full power of resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and to minimize the makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The base step is to get the average value of the sorted list of completion times of each task. Then, the maximum average is obtained. Finally, the task with the maximum average is allocated to the machine that has the minimum completion time. The allocated task is deleted and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms most other algorithms in terms of resource utilization and makespan.
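    One possible reading of the described scheduling loop is sketched below with hypothetical expected-time-to-compute values; completion time is taken as machine ready time plus execution time, as in the Min-min/Max-min family. It is not the authors' reference implementation.

```python
# Sketch of the Sort-Mid-style loop as described in the abstract (one possible reading,
# not the authors' code). etc[i][j] is a hypothetical expected time of task i on machine j.

etc = [
    [14.0, 16.0, 9.0],
    [13.0, 19.0, 18.0],
    [11.0, 13.0, 19.0],
    [13.0, 8.0, 17.0],
    [12.0, 13.0, 10.0],
]
ready = [0.0, 0.0, 0.0]          # current ready (availability) time of each machine
unassigned = set(range(len(etc)))
schedule = {}

while unassigned:
    # completion times of every unassigned task on every machine
    completion = {t: [ready[m] + etc[t][m] for m in range(len(ready))]
                  for t in unassigned}
    # task whose average completion time (over machines) is largest
    task = max(unassigned, key=lambda t: sum(completion[t]) / len(completion[t]))
    # allocate it to the machine giving its minimum completion time
    machine = min(range(len(ready)), key=lambda m: completion[task][m])
    ready[machine] = completion[task][machine]
    schedule[task] = machine
    unassigned.remove(task)

print("assignment:", schedule, "makespan:", max(ready))
```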

  7. The ability of non-computer tasks to increase biomechanical exposure variability in computer-intensive office work.

    Science.gov (United States)

    Barbieri, Dechristian França; Srinivasan, Divya; Mathiassen, Svend Erik; Nogueira, Helen Cristina; Oliveira, Ana Beatriz

    2015-01-01

    Postures and muscle activity in the upper body were recorded from 50 academic office workers during 2 hours of normal work, categorised by observation into computer work (CW) and three non-computer (NC) tasks (NC seated work, NC standing/walking work and breaks). NC tasks differed significantly in exposures from CW, with standing/walking NC tasks representing the largest contrasts for most of the exposure variables. For the majority of workers, exposure variability was larger in their present job than in CW alone, as measured by the job variance ratio (JVR), i.e. the ratio between minute-to-minute variabilities in the job and in CW. Calculations of JVRs for simulated jobs containing different proportions of CW showed that variability could, indeed, be increased by redistributing available tasks, but that substantial increases could only be achieved by introducing more vigorous tasks into the job, in this case illustrated by cleaning.
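    A minimal sketch of the job variance ratio as described above (my reading, not the authors' code), taking minute-to-minute variability as the standard deviation of per-minute exposure means; the exposure values are hypothetical.

```python
# Job variance ratio (JVR) sketch: variability in the whole job relative to computer
# work (CW) alone, using the standard deviation of per-minute exposure means.
# All exposure values are hypothetical.
import statistics

cw_minutes = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2]              # per-minute exposure, CW only
nc_minutes = [7.5, 9.2, 2.1, 8.8, 1.5]                   # per-minute exposure, NC tasks
job_minutes = cw_minutes + nc_minutes                     # the job = CW + NC tasks

jvr = statistics.stdev(job_minutes) / statistics.stdev(cw_minutes)
print(f"JVR = {jvr:.2f}  (>1 means the job is more variable than CW alone)")
```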

  8. Task-induced frequency modulation features for brain-computer interfacing.

    Science.gov (United States)

    Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz

    2017-10-01

    Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects' intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects' intents with an accuracy comparable to task-induced amplitude modulation. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. The frequency features alone do not significantly outperform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing in healthy subjects the joint space significantly outperforms either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.

  9. Task-induced frequency modulation features for brain-computer interfacing

    Science.gov (United States)

    Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz

    2017-10-01

    Objective. Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects’ intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects’ intents with an accuracy comparable to task-induced amplitude modulation. Approach. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. Main results. The frequency features alone do not significantly outperform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing in healthy subjects the joint space significantly outperforms either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Significance. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.

  10. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2010-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  11. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2009-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  12. Quantitative analysis of task selection for brain-computer interfaces

    Science.gov (United States)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
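    A schematic sketch of subject-wise task-pair selection by cross-validated classification accuracy follows; the features are synthetic stand-ins, not the study's EEG pipeline, and the task names are placeholders.

```python
# Schematic sketch (synthetic data): pick the task-pair with the highest cross-validated
# classification accuracy for a given subject. Task names and features are placeholders.
from itertools import combinations
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
tasks = ["left hand", "right hand", "feet"]
# synthetic per-task feature matrices (trials x features) standing in for EEG features
features = {t: rng.normal(loc=i, scale=2.0, size=(40, 6)) for i, t in enumerate(tasks)}

def pair_accuracy(task_a, task_b):
    X = np.vstack([features[task_a], features[task_b]])
    y = np.array([0] * len(features[task_a]) + [1] * len(features[task_b]))
    return cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()

best_pair = max(combinations(tasks, 2), key=lambda p: pair_accuracy(*p))
print("subject-dependent optimal task-pair:", best_pair)
```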

  13. Sort-Mid tasks scheduling algorithm in grid computing

    Directory of Open Access Journals (Sweden)

    Naglaa M. Reda

    2015-11-01

    Full Text Available Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. The main aim of several researchers is to develop variant scheduling algorithms for achieving optimality, and these have shown good performance for task scheduling regarding resource selection. However, using the full power of resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and to minimize the makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The base step is to get the average value of the sorted list of completion times of each task. Then, the maximum average is obtained. Finally, the task with the maximum average is allocated to the machine that has the minimum completion time. The allocated task is deleted and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms most other algorithms in terms of resource utilization and makespan.

  14. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

    Full Text Available Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, under given response time guarantees from the servers, to decide which tasks should be offloaded so that the results are obtained in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the error made by the algorithm against its complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.
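    A simplified feasibility check in the spirit of the problem statement is sketched below; the timing model (local tasks occupy the processor for their execution time, offloaded tasks only for their transmission time, after which the server's guaranteed response time must elapse) and all numbers are assumptions of mine, not the paper's algorithm.

```python
# Simplified, assumption-laden feasibility check for a candidate offloading decision,
# given a fixed execution order and a common relative deadline. Not the paper's method.

def feasible(order, offloaded, exec_time, send_time, server_response, deadline):
    cpu_time = 0.0          # when the local processor becomes free
    latest_result = 0.0     # when the last result becomes available
    for task in order:
        if task in offloaded:
            cpu_time += send_time[task]                       # transmit the input
            latest_result = max(latest_result, cpu_time + server_response)
        else:
            cpu_time += exec_time[task]                       # run locally
            latest_result = max(latest_result, cpu_time)
    return latest_result <= deadline

# hypothetical numbers
exec_time = {"t1": 4.0, "t2": 6.0, "t3": 3.0}
send_time = {"t1": 1.0, "t2": 1.5, "t3": 0.5}
print(feasible(order=["t1", "t2", "t3"], offloaded={"t2"},
               exec_time=exec_time, send_time=send_time,
               server_response=5.0, deadline=12.0))
```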

  15. Task-and-role-based access-control model for computational grid

    Institute of Scientific and Technical Information of China (English)

    LONG Tao; HONG Fan; WU Chi; SUN Ling-li

    2007-01-01

    Access control in a grid environment is a challenging issue because the heterogeneous nature and independent administration of geographically dispersed resources in a grid require access control to use fine-grained policies. We established a task-and-role-based access-control model for the computational grid (CG-TRBAC model), integrating the concepts of role-based access control (RBAC) and task-based access control (TBAC). In this model, condition restrictions are defined, and concepts specifically tailored to workflow management systems are simplified or omitted, so that role assignment and security administration fit the computational grid better than traditional models; permissions are mutable with the task status and system variables, and can be dynamically controlled. The CG-TRBAC model is shown to be flexible and extensible. It can implement different control policies. It embodies the security principle of least privilege and executes active dynamic authorization. A task attribute can be extended to satisfy different requirements in a real grid system.
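    A loose illustration (my own construction, not the CG-TRBAC specification) of permissions that depend on the role, the task status and system variables, so that authorization is evaluated dynamically:

```python
# Toy dynamic-authorization check: the permission must be allowed by the role AND by
# condition restrictions on the task status and system variables. Names are hypothetical.

def authorized(role, permission, task_status, system_load):
    # static part: what the role may do at all
    role_permissions = {
        "job_submitter": {"submit_task", "read_result"},
        "resource_admin": {"submit_task", "read_result", "reconfigure_node"},
    }
    if permission not in role_permissions.get(role, set()):
        return False
    # dynamic part: condition restrictions on task status and system variables
    if permission == "read_result" and task_status != "completed":
        return False
    if permission == "submit_task" and system_load > 0.9:
        return False
    return True

print(authorized("job_submitter", "read_result", task_status="running", system_load=0.4))    # False
print(authorized("job_submitter", "read_result", task_status="completed", system_load=0.4))  # True
```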

  16. A Computational Approach to Quantifiers as an Explanation for Some Language Impairments in Schizophrenia

    Science.gov (United States)

    Zajenkowski, Marcin; Styla, Rafal; Szymanik, Jakub

    2011-01-01

    We compared the processing of natural language quantifiers in a group of patients with schizophrenia and a healthy control group. In both groups, the difficulty of the quantifiers was consistent with computational predictions, and patients with schizophrenia took more time to solve the problems. However, they were significantly less accurate only…

  17. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    Science.gov (United States)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-01-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…

  18. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505
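    For reference, one of the listed heuristics, Min-min, can be sketched as follows on a hypothetical expected-time-to-compute (ETC) matrix: repeatedly pick the task whose earliest possible completion time is smallest and assign it to the machine achieving that time.

```python
# Min-min scheduling heuristic on a hypothetical ETC matrix.
# etc[i][j]: expected time to compute task i on machine j.

etc = [
    [14.0, 16.0, 9.0],
    [13.0, 19.0, 18.0],
    [11.0, 13.0, 19.0],
    [13.0, 8.0, 17.0],
]
ready = [0.0, 0.0, 0.0]          # machine availability times
unscheduled = set(range(len(etc)))
assignment = {}

while unscheduled:
    # smallest achievable completion time over all (task, machine) pairs
    finish, task, machine = min((ready[m] + etc[t][m], t, m)
                                for t in unscheduled for m in range(len(ready)))
    assignment[task] = machine
    ready[machine] = finish
    unscheduled.remove(task)

print("Min-min assignment:", assignment, "makespan:", max(ready))
```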

  19. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.

  20. Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment

    CERN Document Server

    Grigoryeva, Maria; The ATLAS collaboration

    2018-01-01

    Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as the LHC, RHIC and KEK. As the accelerators are modernized (energy and luminosity increased), data volumes grow rapidly and have reached the exabyte scale, which also increases the number of analysis and data processing tasks that continuously compete for computational resources. The growing number of processing tasks requires raising the performance of the computing environment through the involvement of high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors caused by software and hardware failures occur while executing data analysis and processing tasks. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...

  1. Cloud Computing Task Scheduling Based on Cultural Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Li Jian-Wen

    2016-01-01

    Full Text Available A task scheduling strategy based on a cultural genetic algorithm (CGA) is proposed in order to improve the efficiency of task scheduling in the cloud computing platform, targeting the minimization of the total time and cost of task scheduling. The improved genetic algorithm is used to construct the main population space and the knowledge space under a cultural framework; the two spaces evolve independently in parallel, forming a mechanism of mutual promotion to dispatch cloud tasks. Simultaneously, in order to prevent the genetic algorithm's tendency to fall into local optima, a non-uniform mutation operator is introduced to improve the search performance of the algorithm. The experimental results show that CGA reduces the total time and lowers the cost of the scheduling, making it an effective algorithm for cloud task scheduling.

  2. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with an adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and the non-adaptive genetic algorithm in terms of solution quality.

  3. The task-to-task communication between computers

    International Nuclear Information System (INIS)

    Lin Shuzi; Zhang Bingyun; Zhao Weiren

    1992-01-01

    Task-to-task communication is used at the Institute of High Energy Physics. The BES (Beijing Spectrometer) uses this communication mode to periodically fetch some of the BEPC (Beijing Electron Positron Collider) running parameters needed by the BES experiments. The authors describe the principle of transparent task-to-task communication and how it is used in the BES on-line data acquisition system.

  4. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. The continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The developments and applications of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation for a single dust storm event may take hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with other geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor that may impact the feasibility of the parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with the evenly distributed allocation method. Specifically, 1) In order to get optimized solutions, a...

  5. Exploring methodological frameworks for a mental task-based near-infrared spectroscopy brain-computer interface.

    Science.gov (United States)

    Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom

    2015-10-30

    Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Ergonomic assessment for the task of repairing computers in a manufacturing company: A case study.

    Science.gov (United States)

    Maldonado-Macías, Aidé; Realyvásquez, Arturo; Hernández, Juan Luis; García-Alcaraz, Jorge

    2015-01-01

    Manufacturing industry workers who repair computers may be exposed to ergonomic risk factors. This project analyzes the tasks involved in the computer repair process to (1) find the risk level for musculoskeletal disorders (MSDs) and (2) propose ergonomic interventions to address any ergonomic issues. Work procedures and main body postures were video recorded and analyzed using task analysis, the Rapid Entire Body Assessment (REBA) postural method, and biomechanical analysis. High risk for MSDs was found in every subtask using REBA. Although biomechanical analysis found an acceptable mass center displacement during tasks, a hazardous level of compression on the lower back while transporting computers was detected. This assessment found ergonomic risks mainly in the trunk, arm/forearm, and legs; the neck and hand/wrist were also compromised. Opportunities for ergonomic analyses and interventions in the design and execution of computer repair tasks are discussed.

  7. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    Science.gov (United States)

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  8. Is Neural Activity Detected by ERP-Based Brain-Computer Interfaces Task Specific?

    Directory of Open Access Journals (Sweden)

    Markus A Wenzel

    Full Text Available Brain-computer interfaces (BCIs) that are based on event-related potentials (ERPs) can estimate to which stimulus a user pays particular attention. In typical BCIs, the user silently counts the selected stimulus (which is repeatedly presented among other stimuli) in order to focus the attention. The stimulus of interest is then inferred from the electroencephalogram (EEG). Detecting attention allocation implicitly could also be beneficial for human-computer interaction (HCI), because it would allow software to adapt to the user's interest. However, a counting task would be inappropriate for the envisaged implicit application in HCI. Therefore, the question was addressed whether the detectable neural activity is specific to silent counting, or whether it can also be evoked by other tasks that direct the attention to certain stimuli. Thirteen people performed a silent counting, an arithmetic and a memory task. The tasks required the subjects to pay particular attention to target stimuli of a random color. The stimulus presentation was the same in all three tasks, which allowed a direct comparison of the experimental conditions. Classifiers that were trained to detect the targets in one task, according to patterns present in the EEG signal, could detect targets in all other tasks (irrespective of some task-related differences in the EEG). The neural activity detected by the classifiers is not strictly task specific but can be generalized over tasks and is presumably a result of the attention allocation or of the augmented workload. The results may hold promise for the transfer of classification algorithms from BCI research to implicit relevance detection in HCI.

  9. Sympathetic nervous system activity measured by skin conductance quantifies the challenge of walking adaptability tasks after stroke.

    Science.gov (United States)

    Clark, David J; Chatterjee, Sudeshna A; McGuirk, Theresa E; Porges, Eric C; Fox, Emily J; Balasubramanian, Chitralakshmi K

    2018-02-01

    Walking adaptability tasks are challenging for people with motor impairments. The construct of perceived challenge is typically measured by self-report assessments, which are susceptible to subjective measurement error. The development of an objective physiologically-based measure of challenge may help to improve the ability to assess this important aspect of mobility function. The objective of this study was to investigate the use of sympathetic nervous system (SNS) activity measured by skin conductance to gauge the physiological stress response to challenging walking adaptability tasks in people post-stroke. Thirty adults with chronic post-stroke hemiparesis performed a battery of seventeen walking adaptability tasks. SNS activity was measured by skin conductance from the palmar surface of each hand. The primary outcome variable was the percent change in skin conductance level (ΔSCL) between the baseline resting and walking phases of each task. Task difficulty was measured by performance speed and by physical therapist scoring of performance. Walking function and balance confidence were measured by preferred walking speed and the Activities-specific Balance Confidence Scale, respectively. There was a statistically significant negative association between ΔSCL and task performance speed and between ΔSCL and clinical score, indicating that tasks with greater SNS activity had slower performance speed and poorer clinical scores. ΔSCL was significantly greater for low functioning participants versus high functioning participants, particularly during the most challenging walking adaptability tasks. This study supports the use of SNS activity measured by skin conductance as a valuable approach for objectively quantifying the perceived challenge of walking adaptability tasks in people post-stroke. Published by Elsevier B.V.
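
    A minimal sketch of the primary outcome described above, assuming skin conductance level (SCL) samples have already been segmented into baseline and walking phases; the numbers are illustrative, not study data.

    ```python
    # Percent change in skin conductance level (ΔSCL) from resting baseline to walking phase.
    import numpy as np

    baseline_scl = np.array([4.1, 4.0, 4.2, 4.1])   # microsiemens, resting phase (illustrative)
    walking_scl  = np.array([5.0, 5.3, 5.1, 5.2])   # microsiemens, task phase (illustrative)

    delta_scl = 100.0 * (walking_scl.mean() - baseline_scl.mean()) / baseline_scl.mean()
    print(f"ΔSCL = {delta_scl:.1f}%")   # larger values -> greater sympathetic response
    ```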

  10. Software designs of image processing tasks with incremental refinement of computation.

    Science.gov (United States)

    Anastasia, Davide; Andreopoulos, Yiannis

    2010-08-01

    Software realizations of computationally-demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycle budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent nonincremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
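
    The incremental-refinement idea can be illustrated for 2-D convolution: because convolution is linear, processing the input bitplane by bitplane (most significant first) refines the output monotonically and can be stopped at any budget. This sketch is a simplified stand-in for the paper's incremental packing framework; the box filter and image are arbitrary.

    ```python
    # Bitplane-based incremental computation of a 2-D convolution.
    import numpy as np
    from scipy.signal import convolve2d

    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, size=(64, 64)).astype(np.int64)   # 8-bit input
    kernel = np.ones((3, 3)) / 9.0                                  # box filter

    exact = convolve2d(image, kernel, mode="same")
    partial = np.zeros_like(exact)
    for bit in range(7, -1, -1):                      # MSB first: biggest refinement earliest
        plane = (image >> bit) & 1
        partial += (1 << bit) * convolve2d(plane, kernel, mode="same")
        err = np.abs(partial - exact).max()
        print(f"after bitplane {bit}: max abs error = {err:.1f}")   # shrinks monotonically
    ```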

  11. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    OpenAIRE

    Devi, D. Chitra; Uthariaraj, V. Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of the nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint and hence it has to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks and they may execute the independent tasks in multiple VMs or in the same VM’s mul...

  12. An Adaptive Procedure for Task Scheduling Optimization in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Pham Phuoc Hung

    2015-01-01

    Full Text Available Nowadays, mobile cloud computing (MCC) has emerged as a new paradigm which enables offloading computation-intensive, resource-consuming tasks to a powerful computing platform in the cloud, leaving only simple jobs to the capacity-limited thin client devices such as smartphones, tablets, Apple’s iWatch, and Google Glass. However, it still faces many challenges due to inherent problems of thin clients, especially the slow processing and low network connectivity. So far, a number of research studies have been carried out, trying to eliminate these problems, yet few have been found efficient. In this paper, we present an enhanced architecture, taking advantage of the collaboration of thin clients and conventional desktop or laptop computers, known as thick clients, particularly aiming at improving cloud access. Additionally, we introduce an innovative genetic approach for task scheduling such that the processing time is minimized, while considering network contention and cloud cost. Our simulation shows that the proposed approach is more cost-effective and achieves better performance compared with others.
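
    A hedged sketch of a genetic approach to task-to-machine assignment in the spirit of the scheduler described above; the objective (processing time plus a weighted cloud-cost term), the GA operators and all parameters are illustrative assumptions rather than the authors' formulation.

    ```python
    # Toy genetic algorithm for assigning tasks to machines, minimizing makespan + weighted cost.
    import numpy as np

    rng = np.random.default_rng(1)
    n_tasks, n_machines = 20, 4
    task_len = rng.uniform(1, 10, n_tasks)          # abstract task lengths (assumed)
    speed = rng.uniform(1, 3, n_machines)           # machine speeds (assumed)
    price = rng.uniform(0.5, 2.0, n_machines)       # cost per unit of work (assumed)

    def objective(assign):
        finish = np.zeros(n_machines)
        for t, m in enumerate(assign):
            finish[m] += task_len[t] / speed[m]
        makespan = finish.max()
        cost = float(np.sum(task_len * price[assign]))
        return makespan + 0.1 * cost                # assumed weighting of time vs. cost

    pop = rng.integers(0, n_machines, size=(30, n_tasks))
    for _ in range(200):
        scores = np.array([objective(ind) for ind in pop])
        parents = pop[np.argsort(scores)[:10]]                       # selection
        children = []
        while len(children) < len(pop):
            a, b = parents[rng.integers(0, 10, 2)]
            cut = rng.integers(1, n_tasks)
            child = np.concatenate([a[:cut], b[cut:]])               # one-point crossover
            if rng.random() < 0.2:                                   # mutation
                child[rng.integers(n_tasks)] = rng.integers(n_machines)
            children.append(child)
        pop = np.array(children)

    best = min(pop, key=objective)
    print("best objective:", round(objective(best), 2))
    ```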

  13. A Hybrid Scheduler for Many Task Computing in Big Data Systems

    Directory of Open Access Journals (Sweden)

    Vasiliu Laura

    2017-06-01

    Full Text Available With the rapid evolution of the distributed computing world in the last few years, the amount of data created and processed has increased rapidly, to the petabyte or even exabyte scale. Such huge data sets need data-intensive computing applications and impose performance requirements on the infrastructures that support them, such as high scalability, storage, and fault tolerance, but also efficient scheduling algorithms. This paper focuses on providing a hybrid scheduling algorithm for many task computing that addresses big data environments with few penalties, taking into consideration the deadlines and satisfying a data dependent task model. The hybrid solution consists of several heuristics and algorithms (min-min, min-max and earliest deadline first) combined in order to provide a scheduling algorithm that matches our problem. The experiments were conducted by simulation and show that the proposed hybrid algorithm behaves very well in terms of meeting deadlines.
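
    For reference, a minimal sketch of the min-min heuristic, one of the heuristics combined in the hybrid above: repeatedly pick the task whose earliest possible completion time is smallest and commit it to that resource. Execution-time values are illustrative.

    ```python
    # Min-min scheduling heuristic over an execution-time matrix exec_time[task, machine].
    import numpy as np

    rng = np.random.default_rng(0)
    exec_time = rng.uniform(1, 10, size=(8, 3))     # illustrative task/machine execution times
    ready = np.zeros(3)                             # next free time per machine
    unscheduled = set(range(exec_time.shape[0]))
    schedule = {}

    while unscheduled:
        best_task, best_machine, best_ct = None, None, np.inf
        for t in unscheduled:
            ct = ready + exec_time[t]               # completion time on every machine
            m = int(ct.argmin())
            if ct[m] < best_ct:
                best_task, best_machine, best_ct = t, m, ct[m]
        schedule[best_task] = best_machine          # commit the globally smallest completion
        ready[best_machine] = best_ct
        unscheduled.remove(best_task)

    print("makespan:", round(ready.max(), 2))
    ```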

  14. Autonomic Modulation in Duchenne Muscular Dystrophy during a Computer Task: A Prospective Control Trial.

    Directory of Open Access Journals (Sweden)

    Mayra Priscila Boscolo Alvarez

    Full Text Available Duchenne Muscular Dystrophy (DMD) is characterized by progressive muscle weakness that can lead to disability. Owing to functional difficulties faced by individuals with DMD, the use of assistive technology is essential to provide or facilitate functional abilities. In DMD, cardiac autonomic dysfunction has been reported in addition to musculoskeletal impairment. Consequently, the objective was to investigate acute cardiac autonomic responses, by Heart Rate Variability (HRV), during computer tasks in subjects with DMD. HRV was assessed by linear and nonlinear methods, using the Polar RS800CX heart rate monitor, a chest-strap electrocardiographic measuring device. Then, 45 subjects were included in the group with DMD and 45 in the healthy Typical Development (TD) control group. They were assessed for twenty minutes at rest while seated, and for five minutes while undergoing a task on the computer. Individuals with DMD had significantly lower parasympathetic cardiac modulation at rest when compared to the control group, which further declined when undergoing the tasks on the computer. DMD patients presented decreased HRV and exhibited greater intensity of cardiac autonomic responses during computer tasks, characterized by vagal withdrawal, when compared to the healthy TD control subjects.
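
    As background on how HRV indices of this kind are derived, a minimal sketch of two common time-domain measures (SDNN and RMSSD) computed from an RR-interval series; the values are synthetic and the indices shown are generic examples, not necessarily those used in the study.

    ```python
    # Time-domain HRV indices from RR intervals; RMSSD is commonly read as a vagal (parasympathetic) index.
    import numpy as np

    rr_ms = np.array([810, 790, 805, 820, 800, 795, 815, 808], dtype=float)  # RR intervals (ms), illustrative

    sdnn = rr_ms.std(ddof=1)                                  # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))             # short-term (beat-to-beat) variability
    print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
    ```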

  15. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task

    Science.gov (United States)

    Revechkis, Boris; Aflalo, Tyson NS; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A.

    2014-12-01

    Objective. To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. Approach. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like ‘Face in a Crowd’ task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the ‘Crowd’) using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a ‘Crowd Off’ condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Main results. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Significance. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet

  16. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task.

    Science.gov (United States)

    Revechkis, Boris; Aflalo, Tyson N S; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A

    2014-12-01

    To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like 'Face in a Crowd' task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the 'Crowd') using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a 'Crowd Off' condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet computers.

  17. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks.

    Science.gov (United States)

    Devi, D Chitra; Uthariaraj, V Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irrecoverable constraint, and hence each task has to be assigned to the most appropriate VM at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks, and they may execute the independent tasks in multiple VMs or in the same VM's multiple cores. Also, the jobs arrive during the run time of the server in varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling to make cloud computing more efficient and thus improve user satisfaction. The objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. The performance of the proposed algorithm is studied by comparison with the existing methods.

  18. APPLICATION OF COMPUTER-AIDED TOMOGRAPHY TO VISUALIZE AND QUANTIFY BIOGENIC STRUCTURES IN MARINE SEDIMENTS

    Science.gov (United States)

    We used computer-aided tomography (CT) for 3D visualization and 2D analysis of marine sediment cores from 3 stations (at 10, 75 and 118 m depths) with different environmental impact. Biogenic structures such as tubes and burrows were quantified and compared among st...

  19. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic nature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
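
    A hedged sketch of a fitness function of the kind described above, combining makespan with the degree of imbalance among VMs; the definition DI = (Tmax − Tmin)/Tavg and the equal weighting are common conventions assumed here, not necessarily the paper's exact formulation.

    ```python
    # Fitness combining makespan and degree of imbalance for a task-to-VM assignment.
    import numpy as np

    def fitness(assign, length, mips):
        """assign[i] = VM index of task i; length = task lengths; mips = VM speeds (all illustrative)."""
        load = np.zeros(len(mips))
        for task, vm in enumerate(assign):
            load[vm] += length[task] / mips[vm]       # execution time contributed to that VM
        makespan = load.max()
        degree_of_imbalance = (load.max() - load.min()) / load.mean()
        return makespan + degree_of_imbalance         # lower is better (assumed equal weights)

    length = np.array([400., 250., 600., 300., 500.])
    mips = np.array([100., 200.])
    print(round(fitness([0, 1, 1, 0, 1], length, mips), 2))
    ```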

  20. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Directory of Open Access Journals (Sweden)

    Mohammed Abdullahi

    Full Text Available Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic nature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.

  1. Effect of aging on performance, muscle activation and perceived stress during mentally demanding computer tasks

    DEFF Research Database (Denmark)

    Alkjaer, Tine; Pilegaard, Marianne; Bakke, Merete

    2005-01-01

    OBJECTIVES: This study examined the effects of age on performance, muscle activation, and perceived stress during computer tasks with different levels of mental demand. METHODS: Fifteen young and thirteen elderly women performed two computer tasks [color word test and reference task] with different levels of mental demand but similar physical demands. The performance (clicking frequency, percentage of correct answers, and response time for correct answers) and electromyography from the forearm, shoulder, and neck muscles were recorded. Visual analogue scales were used to measure the participants' perception of the stress and difficulty related to the tasks. RESULTS: Performance decreased significantly in both groups during the color word test in comparison with performance on the reference task. However, the performance reduction was more pronounced in the elderly group than in the young group...

  2. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    Directory of Open Access Journals (Sweden)

    D. Chitra Devi

    2016-01-01

    Full Text Available Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irrecoverable constraint, and hence each task has to be assigned to the most appropriate VM at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks, and they may execute the independent tasks in multiple VMs or in the same VM’s multiple cores. Also, the jobs arrive during the run time of the server in varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling to make cloud computing more efficient and thus improve user satisfaction. The objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. The performance of the proposed algorithm is studied by comparison with the existing methods.

  3. 29 CFR 778.313 - Computing overtime pay under the Act for employees compensated on task basis.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Computing overtime pay under the Act for employees compensated on task basis. 778.313 Section 778.313 Labor Regulations Relating to Labor (Continued) WAGE AND... TO REGULATIONS OVERTIME COMPENSATION Special Problems "task" Basis of Payment § 778.313 Computing...

  4. A study on optimal task decomposition of networked parallel computing using PVM

    International Nuclear Information System (INIS)

    Seong, Kwan Jae; Kim, Han Gyoo

    1998-01-01

    A numerical study is performed to investigate the effect of task decomposition on networked parallel processes using Parallel Virtual Machine (PVM). In our study, a PVM program distributed over a network of workstations is used in solving a finite difference version of a one dimensional heat equation, where the natural choice of PVM programming structure would be the master-slave paradigm, with the aim of finding an optimal configuration resulting in the least computing time including communication overhead among machines. Given a set of PVM tasks comprising one master and five slave programs, it is found that there exists a pseudo-optimal number of machines, which does not necessarily coincide with the number of tasks, that yields the best performance when the network is under light usage. Increasing the number of machines beyond this optimal one does not improve computing performance, since the increase in communication overhead among the excess machines offsets the decrease in CPU time obtained by distributing the PVM tasks among these machines. However, when the network traffic is heavy, the results exhibit a more random characteristic that is explained by the random nature of data transfer time.
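
    The reported tradeoff can be illustrated with a toy model in which distributing a fixed workload over n machines cuts CPU time roughly as W/n while communication overhead grows with n; the linear overhead term and the constants are assumptions for illustration only, not the study's measured model.

    ```python
    # Toy compute/communication tradeoff: find the n that minimizes total time.
    import numpy as np

    W = 120.0        # total sequential compute time (s), assumed
    c = 2.5          # per-machine communication overhead (s), light network load assumed

    n = np.arange(1, 13)
    total = W / n + c * (n - 1)
    n_opt = n[total.argmin()]
    print(f"pseudo-optimal number of machines: {n_opt} (total time {total.min():.1f} s)")
    ```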

  5. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Full Text Available Quantitative exposure data is important for evaluating toxicity risk and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject’s true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have a significant impact on determining partitioning, with species-dependent differences arising from physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance for xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in humans.
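
    As a simplified illustration of pH- and protein-binding-dependent partitioning, the sketch below uses the classical Henderson-Hasselbalch form of the saliva-to-plasma ratio for a weak acid; this is a stand-in for intuition, not the tissue-composition-based Schmitt algorithm referenced above, and all parameter values are assumed.

    ```python
    # Classical pH-partition estimate of the saliva:plasma ratio for a weak acid.
    def saliva_plasma_ratio_acid(pka, ph_saliva=6.8, ph_plasma=7.4, fu_plasma=0.1, fu_saliva=1.0):
        """S/P ratio for a weak acid; fu_* are unbound fractions (assumed illustrative values)."""
        num = 1.0 + 10.0 ** (ph_saliva - pka)
        den = 1.0 + 10.0 ** (ph_plasma - pka)
        return (num / den) * (fu_plasma / fu_saliva)

    # An acidic, highly protein-bound xenobiotic partitions poorly into saliva.
    print(f"S/P ~ {saliva_plasma_ratio_acid(pka=4.5):.3f}")
    ```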

  6. Mental workload during n-back task-quantified in the prefrontal cortex using fNIRS.

    Science.gov (United States)

    Herff, Christian; Heger, Dominic; Fortmann, Ole; Hennrich, Johannes; Putze, Felix; Schultz, Tanja

    2013-01-01

    When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g., interacting with the car navigation system while driving) it is desired to not distract the users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, a robust single trial classification is a crucial prerequisite for the purpose of dynamically adapting HCIs to the workload of its user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two, or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS, can be used to robustly quantify and classify mental workload. Single trial analysis is still a young field that suffers from a general lack of standards. To increase comparability of fNIRS methods and results, the data corpus for this study is made available online.

  7. Mental workload during n-back task - quantified in the prefrontal cortex using fNIRS

    Directory of Open Access Journals (Sweden)

    Christian eHerff

    2014-01-01

    Full Text Available When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g., interacting with the car navigation system while driving) it is desired to not distract the users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, a robust single trial classification is a crucial prerequisite for the purpose of dynamically adapting HCIs to the workload of its user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS can be used to robustly quantify and classify mental workload. Single trial analysis is still a young field that suffers from a general lack of standards. To increase comparability of fNIRS methods and results, the data corpus for this study is made available online.

  8. Quantifying cross-linguistic influence with a computational model : A study of case-marking comprehension

    NARCIS (Netherlands)

    Matusevych, Yevgen; Alishahi, Afra; Backus, Albert

    2017-01-01

    Cross-linguistic influence (CLI) is one of the key phenomena in bilingual and second language learning. We propose a method for quantifying CLI in the use of linguistic constructions with the help of a computational model, which acquires constructions in two languages from bilingual input. We focus

  9. Optimizing the Number of Cooperating Terminals for Energy Aware Task Computing in Wireless Networks

    DEFF Research Database (Denmark)

    Olsen, Anders Brødløs; Fitzek, Frank H. P.; Koch, Peter

    2005-01-01

    It is generally accepted that energy consumption is a significant design constraint for mobile handheld systems; therefore, motivations for methods optimizing the energy consumption and making better use of the restricted battery resources are evident. A novel concept of distributed task computing was previously proposed (D2VS), in which the overall idea is the selective distribution of tasks among terminals. In this paper the optimal number of terminals for cooperative task computing in a wireless network will be investigated. The paper presents an energy model for the proposed scheme. Energy consumption of the terminals with respect to their workload and the overhead of distributing tasks among terminals are taken into account. The paper shows that the number of cooperating terminals is in general limited to a few, though it varies with the various system parameters.

  10. Quantifying tasks and roles in insect societies

    African Journals Online (AJOL)

    1991-05-15

    May 15, 1991 ... ergonomic selection, and was associated with the evolution of increasing behavioural ... The sequence in which tasks are performed by workers, .... space, providing a pictorial representation of the association between the ...

  11. Brain-computer interface analysis of a dynamic visuo-motor task.

    Science.gov (United States)

    Logar, Vito; Belič, Aleš

    2011-01-01

    The area of brain-computer interfaces (BCIs) represents one of the more interesting fields in neurophysiological research, since it investigates the development of machines that transform the brain's "thoughts" into certain pre-defined actions. Experimental studies have reported some successful implementations of BCIs; however, much of the field still remains unexplored. According to some recent reports the phase coding of informational content is an important mechanism in the brain's function and cognition, and has the potential to explain various mechanisms of the brain's data transfer, but it has yet to be scrutinized in the context of brain-computer interfaces. Therefore, if the mechanism of phase coding is plausible, one should be able to extract the phase-coded content, carried by brain signals, using appropriate signal-processing methods. In our previous studies we have shown that by using a phase-demodulation-based signal-processing approach it is possible to decode some relevant information on the current motor action in the brain from electroencephalographic (EEG) data. In this paper the authors present a continuation of their previous work on the brain-information-decoding analysis of visuo-motor (VM) tasks. The present study shows that EEG data measured during more complex, dynamic visuo-motor (dVM) tasks carry enough information about the currently performed motor action to be successfully extracted by using the appropriate signal-processing and identification methods. The aim of this paper is therefore to present a mathematical model that, using the EEG measurements as its inputs, predicts the course of the wrist movements applied by each subject during the task in simulated or real time (BCI analysis). However, several modifications to the existing methodology are needed to achieve optimal decoding results and a real-time data-processing ability. The information extracted from the EEG could
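
    As an illustration of the kind of phase information a phase-demodulation decoder works with, the sketch below band-pass filters a synthetic EEG-like signal and extracts its instantaneous phase via the Hilbert transform; the band, carrier frequency and signal are assumptions for illustration, not the authors' model.

    ```python
    # Instantaneous phase extraction from a narrow-band signal (a building block of phase demodulation).
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    fs = 256.0                                   # sampling rate (Hz), assumed
    rng = np.random.default_rng(0)
    t = np.arange(0, 4, 1 / fs)
    eeg = np.sin(2 * np.pi * 10 * t + 0.5 * np.sin(2 * np.pi * 0.5 * t)) + 0.3 * rng.normal(size=t.size)

    b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")   # alpha-band filter (assumed band)
    narrow = filtfilt(b, a, eeg)
    phase = np.unwrap(np.angle(hilbert(narrow)))                    # instantaneous phase
    modulation = phase - 2 * np.pi * 10 * t                         # demodulate the 10 Hz carrier
    print(modulation[:5])                                           # slowly varying phase-coded component
    ```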

  12. A Multilevel Modeling Approach to Examining Individual Differences in Skill Acquisition for a Computer-Based Task

    OpenAIRE

    Nair, Sankaran N.; Czaja, Sara J.; Sharit, Joseph

    2007-01-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50–80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performan...

  13. Effects of four types of non-obtrusive feedback on computer behaviour, task performance and comfort

    NARCIS (Netherlands)

    Korte, E.M.; Huijsmans, M.A.; de Jong, A.M.; van de Ven, J.G.M.; Ruijsendaal, M.

    2012-01-01

    This study investigated the effects of non-obtrusive feedback on continuous lifted hand/finger behaviour, task performance and comfort. In an experiment with 24 participants the effects of two visual and two tactile feedback signals were compared to a no-feedback condition in a computer task.

  14. Optimization of Task Scheduling Algorithm through QoS Parameters for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Monika

    2016-01-01

    Full Text Available Cloud computing is an emerging technology that has spread widely among researchers. It furnishes clients with infrastructure, platform, and software as services that are easily accessible via the web. A cloud is a kind of parallel and distributed system consisting of a collection of virtualized computers that are used to execute a variety of tasks so as to achieve good execution time, meet deadlines, and make good use of its resources. The scheduling issue can be seen as finding an optimal assignment of tasks over the available set of resources so that the desired goals for the tasks are achieved. This paper presents an optimal algorithm for scheduling tasks that uses their waiting time as a QoS parameter. The algorithm is simulated using the CloudSim simulator, and experiments are carried out to help clients identify the bottleneck of utilizing a number of virtual machines in parallel.

  15. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species-dependent differences based upon physiological variance between

  16. Impact of number of repeated scans on model observer performance for a low-contrast detection task in computed tomography.

    Science.gov (United States)

    Ma, Chi; Yu, Lifeng; Chen, Baiyu; Favazza, Christopher; Leng, Shuai; McCollough, Cynthia

    2016-04-01

    Channelized Hotelling observer (CHO) models have been shown to correlate well with human observers for several phantom-based detection/classification tasks in clinical computed tomography (CT). A large number of repeated scans were used to achieve an accurate estimate of the model's template. The purpose of this study is to investigate how the experimental and CHO model parameters affect the minimum required number of repeated scans. A phantom containing 21 low-contrast objects was scanned on a 128-slice CT scanner at three dose levels. Each scan was repeated 100 times. For each experimental configuration, the low-contrast detectability, quantified as the area under the receiver operating characteristic curve (AUC), was calculated using a previously validated CHO with randomly selected subsets of scans, ranging from 10 to 100. Using the AUC from the 100 scans as the reference, the accuracy from a smaller number of scans was determined. Our results demonstrated that the minimum number of repeated scans increased when the radiation dose level decreased, object size and contrast level decreased, and the number of channels increased. As a general trend, it increased as the low-contrast detectability decreased. This study provides a basis for the experimental design of task-based image quality assessment in clinical CT using CHO.
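
    A hedged sketch of the core CHO computation on channelized scan data: estimate the Hotelling template from signal-present and signal-absent samples and summarize detectability as AUC. The channel outputs here are synthetic, so the effect of using fewer repeated scans can only be probed qualitatively.

    ```python
    # Channelized Hotelling observer template and AUC from repeated-scan channel outputs.
    import numpy as np

    rng = np.random.default_rng(0)
    n_scans, n_channels = 100, 10
    signal = rng.normal(0.4, 1.0, size=(n_scans, n_channels))   # channel outputs, signal present (synthetic)
    noise  = rng.normal(0.0, 1.0, size=(n_scans, n_channels))   # channel outputs, signal absent (synthetic)

    def cho_auc(sig, noi):
        s_mean, n_mean = sig.mean(0), noi.mean(0)
        cov = 0.5 * (np.cov(sig, rowvar=False) + np.cov(noi, rowvar=False))
        template = np.linalg.solve(cov, s_mean - n_mean)         # Hotelling template
        t_sig, t_noi = sig @ template, noi @ template
        return np.mean(t_sig[:, None] > t_noi[None, :])          # AUC via the Mann-Whitney statistic

    print("AUC with all 100 scans:", round(cho_auc(signal, noise), 3))
    print("AUC with 20 scans:     ", round(cho_auc(signal[:20], noise[:20]), 3))
    ```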

  17. Visualizing stressful aspects of repetitive motion tasks and opportunities for ergonomic improvements using computer vision.

    Science.gov (United States)

    Greene, Runyu L; Azari, David P; Hu, Yu Hen; Radwin, Robert G

    2017-11-01

    Patterns of physical stress exposure are often difficult to measure, and the metrics of variation and techniques for identifying them are underdeveloped in the practice of occupational ergonomics. Computer vision has previously been used for evaluating repetitive motion tasks for hand activity level (HAL) utilizing conventional 2D videos. The approach was made practical by relaxing the need for high precision, and by adopting a semi-automatic approach for measuring spatiotemporal characteristics of the repetitive task. In this paper, a new method for visualizing task factors, using this computer vision approach, is demonstrated. After videos are made, the analyst selects a region of interest on the hand to track, and the hand location and its associated kinematics are measured for every frame. The visualization method spatially deconstructs and displays the frequency, speed and duty cycle components of tasks that are part of the threshold limit value for hand activity, for the purpose of identifying patterns of exposure associated with specific job factors, as well as for suggesting task improvements. The localized variables are plotted as a heat map superimposed over the video, and displayed in the context of the task being performed. Based on the intensity of the specific variables used to calculate HAL, we can determine which task factors contribute most to HAL, and readily identify those work elements in the task that contribute most to increased risk for an injury. Work simulations and actual industrial examples are described. This method should help practitioners more readily measure and interpret temporal exposure patterns and identify potential task improvements. Copyright © 2017. Published by Elsevier Ltd.
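
    A minimal sketch of deriving per-frame hand speed and an exertion duty cycle from a tracked region of interest; the random-walk trajectory and the speed threshold used to mark "active" frames are hypothetical simplifications, not the calibrated procedure behind the HAL method described above.

    ```python
    # Per-frame hand speed and a simple duty-cycle estimate from tracked positions.
    import numpy as np

    fps = 30.0
    rng = np.random.default_rng(0)
    xy = np.cumsum(rng.normal(0, 2.0, size=(300, 2)), axis=0)    # tracked hand position per frame (pixels)

    speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fps    # pixels per second, per frame
    active = speed > 40.0                                        # assumed activity threshold
    duty_cycle = 100.0 * active.mean()
    print(f"median speed {np.median(speed):.0f} px/s, duty cycle {duty_cycle:.0f}%")
    ```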

  18. Nonoccurrence of Negotiation of Meaning in Task-Based Synchronous Computer-Mediated Communication

    Science.gov (United States)

    Van Der Zwaard, Rose; Bannink, Anne

    2016-01-01

    This empirical study investigated the occurrence of meaning negotiation in an interactive synchronous computer-mediated second language (L2) environment. Sixteen dyads (N = 32) consisting of nonnative speakers (NNSs) and native speakers (NSs) of English performed 2 different tasks using videoconferencing and written chat. The data were coded and…

  19. Metaheuristic Based Scheduling Meta-Tasks in Distributed Heterogeneous Computing Systems

    Directory of Open Access Journals (Sweden)

    Hesam Izakian

    2009-07-01

    Full Text Available Scheduling is a key problem in distributed heterogeneous computing systems in order to benefit from the large computing capacity of such systems, and it is an NP-complete problem. In this paper, we present a metaheuristic technique, namely the Particle Swarm Optimization (PSO) algorithm, for this problem. PSO is a population-based search algorithm based on the simulation of the social behavior of bird flocking and fish schooling. Particles fly in the problem search space to find optimal or near-optimal solutions. The scheduler aims at minimizing makespan, which is the completion time of the latest task. Experimental studies show that the proposed method is more efficient and surpasses reported PSO and GA approaches for this problem.
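
    A hedged sketch of PSO applied to assigning meta-tasks to heterogeneous machines to minimize makespan; continuous particle positions are decoded to machine indices by rounding, and the inertia/acceleration constants are conventional defaults rather than the paper's settings.

    ```python
    # Particle swarm optimization over task-to-machine assignments, minimizing makespan.
    import numpy as np

    rng = np.random.default_rng(2)
    n_tasks, n_machines, n_particles = 15, 4, 20
    etc = rng.uniform(1, 10, size=(n_tasks, n_machines))         # expected time to compute (illustrative)

    def makespan(position):
        assign = np.clip(np.rint(position), 0, n_machines - 1).astype(int)
        load = np.zeros(n_machines)
        for t, m in enumerate(assign):
            load[m] += etc[t, m]
        return load.max()

    pos = rng.uniform(0, n_machines - 1, size=(n_particles, n_tasks))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([makespan(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(300):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)   # inertia + cognitive + social
        pos = np.clip(pos + vel, 0, n_machines - 1)
        vals = np.array([makespan(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()

    print("best makespan:", round(pbest_val.min(), 2))
    ```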

  20. Cloud computing task scheduling strategy based on differential evolution and ant colony optimization

    Science.gov (United States)

    Ge, Junwei; Cai, Yu; Fang, Yiqiu

    2018-05-01

    This paper proposes DEACO, a task scheduling strategy based on the combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). To address the single-objective limitation of existing cloud computing task scheduling, it jointly considers the shortest task completion time, cost, and load balancing. DEACO uses the solution of DE to initialize the pheromone of ACO, which reduces the time spent collecting pheromone in the early stage of ACO, and improves the pheromone updating rule through the load factor. The proposed algorithm is simulated on CloudSim and compared with min-min and ACO. The experimental results show that DEACO is superior in terms of time, cost, and load.

  1. Correlations between Motor Symptoms across Different Motor Tasks, Quantified via Random Forest Feature Classification in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Andreas Kuhner

    2017-11-01

    Full Text Available Background: Objective assessments of Parkinson's disease (PD) patients' motor state using motion capture techniques are still rarely used in clinical practice, even though they may improve clinical management. One major obstacle relates to the large dimensionality of motor abnormalities in PD. We aimed to extract global motor performance measures covering different everyday motor tasks, as a function of a clinical intervention, i.e., deep brain stimulation (DBS) of the subthalamic nucleus. Methods: We followed a data-driven, machine-learning approach and propose performance measures that employ Random Forests with probability distributions. We applied this method to 14 PD patients with DBS switched off or on, and 26 healthy control subjects performing the Timed Up and Go Test (TUG), the Functional Reach Test (FRT), a hand coordination task, walking 10 m straight, and a 90° curve. Results: For each motor task, a Random Forest identified a specific set of metrics that optimally separated PD off DBS from healthy subjects. We noted the highest accuracy (94.6%) for standing up. This corresponded to a sensitivity of 91.5% to detect a PD patient off DBS, and a specificity of 97.2% representing the rate of correctly identified healthy subjects. We then calculated performance measures based on these sets of metrics and applied those results to characterize symptom severity in different motor tasks. Task-specific symptom severity measures correlated significantly with each other and with the Unified Parkinson's Disease Rating Scale (UPDRS, part III; correlation of r² = 0.79). Agreement rates between different measures ranged from 79.8 to 89.3%. Conclusion: The close correlation of PD patients' various motor abnormalities quantified by different, task-specific severity measures suggests that these abnormalities are only facets of the underlying one-dimensional severity of motor deficits. The identification and characterization of this underlying motor deficit
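
    A minimal sketch of the data-driven step described above: a Random Forest separating patient and control recordings from task metrics and returning class probabilities that can serve as a severity-like score. The feature values are synthetic placeholders, not study data.

    ```python
    # Random Forest classification with probability outputs on task-derived motion metrics.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_pd, n_hc, n_metrics = 14, 26, 12                            # group sizes as in the study; metrics assumed
    X = np.vstack([rng.normal(0.8, 1.0, (n_pd, n_metrics)),       # PD off DBS (synthetically shifted)
                   rng.normal(0.0, 1.0, (n_hc, n_metrics))])      # healthy controls
    y = np.array([1] * n_pd + [0] * n_hc)

    forest = RandomForestClassifier(n_estimators=500, random_state=0)
    acc = cross_val_score(forest, X, y, cv=5).mean()
    forest.fit(X, y)
    severity = forest.predict_proba(X)[:, 1]                      # probability-based severity-like measure
    print(f"cross-validated accuracy: {acc:.2f}; first five severity scores: {severity[:5].round(2)}")
    ```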

  2. A language for data-parallel and task parallel programming dedicated to multi-SIMD computers. Contributions to hydrodynamic simulation with lattice gases

    International Nuclear Information System (INIS)

    Pic, Marc Michel

    1995-01-01

    Parallel programming covers task-parallelism and data-parallelism. Many problems need both parallelisms. Multi-SIMD computers allow a hierarchical approach to these parallelisms. The T++ language, based on C++, is dedicated to exploiting Multi-SIMD computers using a programming paradigm which is an extension of array programming to task management. Our language introduces arrays of independent tasks to be executed separately (MIMD) on subsets of processors of identical behaviour (SIMD), in order to translate the hierarchical inclusion of data-parallelism in task-parallelism. To manipulate tasks and data in a symmetrical way, we propose meta-operations which have the same behaviour on task arrays and on data arrays. We explain how to implement this language on our parallel computer SYMPHONIE in order to profit from the locally-shared memory, the hardware virtualization, and the multiplicity of communication networks. We simultaneously analyse a typical application for such an architecture. Finite-element schemes for fluid mechanics need powerful parallel computers and require substantial floating-point capability. Lattice gases are an alternative to such simulations. Boolean lattice gases are simple, stable and modular, and need no floating-point computation, but include numerical noise. Boltzmann lattice gases offer high computational precision, but need floating point and are only locally stable. We propose a new scheme, called multi-bit, which keeps the advantages of each Boolean model to which it is applied, with high numerical precision and reduced noise. Experiments on viscosity, physical behaviour, noise reduction and spurious invariants are shown, and implementation techniques for parallel Multi-SIMD computers are detailed. (author) [fr]

  3. Student Computer Use in Selected Undergraduate Agriculture Courses: An Examination of Required Tasks.

    Science.gov (United States)

    Johnson, Donald M.; Ferguson, James A.; Vokins, Nancy W.; Lester, Melissa L.

    2000-01-01

    Over 50% of faculty teaching undergraduate agriculture courses (n=58) required use of word processing, Internet, and electronic mail; less than 50% required spreadsheets, databases, graphics, or specialized software. They planned to maintain or increase required computer tasks in their courses. (SK)

  4. Respiratory sinus arrhythmia responses to cognitive tasks: effects of task factors and RSA indices.

    Science.gov (United States)

    Overbeek, Thérèse J M; van Boxtel, Anton; Westerink, Joyce H D M

    2014-05-01

    Many studies show that respiratory sinus arrhythmia (RSA) decreases while performing cognitive tasks. However, there is uncertainty about the role of contaminating factors such as physical activity and stress-inducing task variables. Different methods to quantify RSA may also contribute to variable results. In 83 healthy subjects, we studied RSA responses to a working memory task requiring varying levels of cognitive control and a perceptual attention task not requiring strong cognitive control. RSA responses were quantified in the time and frequency domain and were additionally corrected for differences in mean interbeat interval and respiration rate, resulting in eight different RSA indices. The two tasks were clearly differentiated by heart rate and facial EMG reference measures. Cognitive control induced inhibition of RSA whereas perceptual attention generally did not. However, the results show several differences between different RSA indices, emphasizing the importance of methodological variables. Age and sex did not influence the results. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Quantification of baseline pupillary response and task-evoked pupillary response during constant and incremental task load.

    Science.gov (United States)

    Mosaly, Prithima R; Mazur, Lukasz M; Marks, Lawrence B

    2017-10-01

    The methods employed to quantify the baseline pupil size and task-evoked pupillary response (TEPR) may affect the overall study results. To test this hypothesis, the objective of this study was to assess variability in baseline pupil size and TEPR during two basic working memory tasks: constant load of 3-letter memorisation-recall (10 trials), and incremental load memorisation-recall (two trials of each load level), using two commonly used methods: (1) change from trial/load-specific baseline, and (2) change from constant baseline. Results indicated that there was a significant shift in baseline between the trials for constant load, and between the load levels for incremental load. The TEPR was independent of shifts in baseline using method 1 only for constant load, and using method 2 only for higher levels of the incremental load condition. These important findings suggest that the assessment of both the baseline and the methods used to quantify TEPR are critical in ergonomics applications, especially in studies with a small number of trials per subject per condition. Practitioner Summary: Quantification of TEPR can be affected by shifts in baseline pupil size that are most likely affected by non-cognitive factors when other external factors are kept constant. Therefore, the quantification methods employed to compute both baseline and TEPR are critical in understanding the information processing of humans in practical ergonomics settings.
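
    The two quantification methods compared above can be written down directly; in this minimal sketch the pupil-diameter values are illustrative and show how a drifting baseline inflates the constant-baseline estimate.

    ```python
    # TEPR computed two ways: trial-specific baseline vs. a single constant baseline.
    import numpy as np

    # Per-trial mean pupil diameter (mm) during baseline and during task, for 5 trials (illustrative).
    baseline = np.array([3.10, 3.20, 3.35, 3.40, 3.55])    # note the drifting baseline
    task     = np.array([3.45, 3.55, 3.70, 3.80, 3.90])

    tepr_trial_baseline = task - baseline                   # method 1: trial/load-specific baseline
    tepr_constant_baseline = task - baseline[0]             # method 2: constant (first) baseline
    print("method 1:", tepr_trial_baseline.round(2))
    print("method 2:", tepr_constant_baseline.round(2))     # inflated by the baseline shift
    ```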

  6. Development and evaluation of a computer-based medical work assessment programme

    Directory of Open Access Journals (Sweden)

    Spallek Michael

    2008-12-01

    Full Text Available Background: There are several ways to conduct a job task analysis in medical work environments, including pencil-paper observations, interviews and questionnaires. However, these methods entail bias problems such as high inter-individual deviations and risks of misjudgement. Computer-based observation helps to reduce these problems. The aim of this paper is to give an overview of the development process of a computer-based job task analysis instrument for real-time observations to quantify the job tasks performed by physicians working in different medical settings. In addition, reliability and validity data of this instrument will be demonstrated. Methods: This instrument was developed in consecutive steps. First, lists comprising tasks performed by physicians in different care settings were classified. Afterwards, the content validity of the task lists was verified. After establishing the final task categories, computer software was programmed and implemented on a mobile personal computer. Finally, inter-observer reliability was evaluated. Two trained observers simultaneously recorded the tasks of the same physician. Results: Content validity of the task lists was confirmed by observations and experienced specialists of each medical area. The development process of the job task analysis instrument was completed successfully. Simultaneous records showed adequate inter-rater reliability. Conclusion: Initial results of this analysis supported the validity and reliability of this developed method for assessing physicians' working routines as well as organizational context factors. Based on results using this method, possible improvements for health professionals' work organisation can be identified.

  7. Thinking processes used by high-performing students in a computer programming task

    Directory of Open Access Journals (Sweden)

    Marietjie Havenga

    2011-07-01

    Full Text Available Computer programmers must be able to understand programming source code and write programs that execute complex tasks to solve real-world problems. This article is a transdisciplinary study at the intersection of computer programming, education and psychology. It outlines the role of mental processes in the process of programming and indicates how successful thinking processes can support computer science students in writing correct and well-defined programs. A mixed methods approach was used to better understand the thinking activities and programming processes of participating students. Data collection involved both computer programs and students’ reflective thinking processes recorded in their journals. This enabled analysis of psychological dimensions of participants’ thinking processes and their problem-solving activities as they considered a programming problem. Findings indicate that the cognitive, reflective and psychological processes used by high-performing programmers contributed to their success in solving a complex programming problem. Based on the thinking processes of high performers, we propose a model of integrated thinking processes, which can support computer programming students. Keywords: Computer programming, education, mixed methods research, thinking processes.  Disciplines: Computer programming, education, psychology

  8. Respiratory sinus arrhythmia responses to cognitive tasks : effects of task factors and RSA indices

    NARCIS (Netherlands)

    Overbeek, T.; Boxtel, van Anton; Westerink, J.H.D.M.

    2014-01-01

    Many studies show that respiratory sinus arrhythmia (RSA) decreases while performing cognitive tasks. However, there is uncertainty about the role of contaminating factors such as physical activity and stress-inducing task variables. Different methods to quantify RSA may also contribute to variable

  9. Respiratory sinus arrhythmia responses to cognitive tasks: Effects of task factors and RSA indices

    NARCIS (Netherlands)

    Overbeek, T.J.M.; van Boxtel, A.; Westerink, J.H.D.M.

    2014-01-01

    Many studies show that respiratory sinus arrhythmia (RSA) decreases while performing cognitive tasks. However, there is uncertainty about the role of contaminating factors such as physical activity and stress-inducing task variables. Different methods to quantify RSA may also contribute to variable

  10. The employment of a spoken language computer applied to an air traffic control task.

    Science.gov (United States)

    Laveson, J. I.; Silver, C. A.

    1972-01-01

    Assessment of the merits of a limited spoken language (56 words) computer in a simulated air traffic control (ATC) task. An airport zone approximately 60 miles in diameter with a traffic flow simulation ranging from single-engine to commercial jet aircraft provided the workload for the controllers. This research determined that, under the circumstances of the experiments carried out, the use of a spoken-language computer would not improve the controller performance.

  11. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    Science.gov (United States)

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  12. Beyond Behavioral Inhibition: A Computer Avatar Task Designed to Assess Behavioral Inhibition Extends to Harm Avoidance

    Directory of Open Access Journals (Sweden)

    Michael Todd Allen

    2017-09-01

    Full Text Available Personality factors such as behavioral inhibition (BI), a temperamental tendency for avoidance in the face of unfamiliar situations, have been identified as risk factors for anxiety disorders. Personality factors are generally identified through self-report inventories. However, this tendency to avoid may affect the accuracy of these self-report inventories. Previously, a computer-based task was developed in which the participant guides an on-screen “avatar” through a series of onscreen events; performance on the task could accurately predict participants’ BI, measured by a standard paper and pencil questionnaire (the Adult Measure of Behavioral Inhibition, or AMBI). Here, we sought to replicate this finding as well as compare performance on the avatar task to another measure related to BI, the harm avoidance (HA) scale of the Tridimensional Personality Questionnaire (TPQ). The TPQ includes HA scales as well as scales assessing reward dependence (RD), novelty seeking (NS) and persistence. One hundred and one undergraduates voluntarily completed the avatar task and the paper and pencil inventories in a counter-balanced order. Scores on the avatar task were strongly correlated with BI assessed via the AMBI questionnaire, which replicates prior findings. Females exhibited higher HA scores than males, but did not differ in scores on the avatar task. There was a strong positive relationship between scores on the avatar task and HA scores. One aspect of HA, fear of uncertainty, was found to moderately mediate the relationship between AMBI scores and avatar scores. NS had a strong negative relationship with scores on the avatar task, but there was no significant relationship between RD and scores on the avatar task. These findings indicate the effectiveness of the avatar task as a behavioral alternative to self-report measures for assessing avoidance. In addition, computer-based behavioral tasks are a viable alternative to paper and pencil self-report measures.

  13. Service task partition and distribution in star topology computer grid subject to data security constraints

    Energy Technology Data Exchange (ETDEWEB)

    Xiang Yanping [Collaborative Autonomic Computing Laboratory, School of Computer Science, University of Electronic Science and Technology of China (China); Levitin, Gregory, E-mail: levitin@iec.co.il [Collaborative Autonomic Computing Laboratory, School of Computer Science, University of Electronic Science and Technology of China (China); Israel electric corporation, P. O. Box 10, Haifa 31000 (Israel)

    2011-11-15

    The paper considers grid computing systems in which the resource management systems (RMS) can divide service tasks into execution blocks (EBs) and send these blocks to different resources. In order to provide a desired level of service reliability the RMS can assign the same blocks to several independent resources for parallel execution. The data security is a crucial issue in distributed computing that affects the execution policy. By the optimal service task partition into the EBs and their distribution among resources, one can achieve the greatest possible service reliability and/or expected performance subject to data security constraints. The paper suggests an algorithm for solving this optimization problem. The algorithm is based on the universal generating function technique and on the evolutionary optimization approach. Illustrative examples are presented. - Highlights: > Grid service with star topology is considered. > An algorithm for evaluating service reliability and data security is presented. > A tradeoff between the service reliability and data security is analyzed. > A procedure for optimal service task partition and distribution is suggested.
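
    The abstract names the universal generating function (UGF) technique but does not spell it out; the sketch below is a toy illustration of the idea in Python, not the paper's actual reliability/security model. Each resource's execution outcome is represented as a u-function mapping a performance value (here, completion time, with None standing for failure) to its probability, and two copies of the same execution block assigned to independent resources are composed in parallel. All numbers are made up.

```python
from itertools import product

# Toy u-functions: performance value -> probability (None = the copy fails).
resource_a = {2.0: 0.8, None: 0.2}
resource_b = {3.0: 0.9, None: 0.1}

def parallel(u1, u2):
    """Compose two u-functions for the same block run in parallel on two
    independent resources: the block finishes when the first successful
    copy finishes, and fails only if every copy fails."""
    out = {}
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        times = [g for g in (g1, g2) if g is not None]
        g = min(times) if times else None
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

print(parallel(resource_a, resource_b))
# {2.0: 0.8, 3.0: 0.18, None: 0.02} -> block reliability 0.98
```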

  14. Service task partition and distribution in star topology computer grid subject to data security constraints

    International Nuclear Information System (INIS)

    Xiang Yanping; Levitin, Gregory

    2011-01-01

    The paper considers grid computing systems in which the resource management systems (RMS) can divide service tasks into execution blocks (EBs) and send these blocks to different resources. In order to provide a desired level of service reliability the RMS can assign the same blocks to several independent resources for parallel execution. The data security is a crucial issue in distributed computing that affects the execution policy. By the optimal service task partition into the EBs and their distribution among resources, one can achieve the greatest possible service reliability and/or expected performance subject to data security constraints. The paper suggests an algorithm for solving this optimization problem. The algorithm is based on the universal generating function technique and on the evolutionary optimization approach. Illustrative examples are presented. - Highlights: → Grid service with star topology is considered. → An algorithm for evaluating service reliability and data security is presented. → A tradeoff between the service reliability and data security is analyzed. → A procedure for optimal service task partition and distribution is suggested.

  15. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L. [Oak Ridge National Lab., TN (United States); Sartori, E. [OCDE/OECD NEA Data Bank, Issy-les-Moulineaux (France); Viedma, L.G. de [Consejo de Seguridad Nuclear, Madrid (Spain)

    1997-06-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management.

  16. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    International Nuclear Information System (INIS)

    Kirk, B.L.; Sartori, E.; Viedma, L.G. de

    1997-01-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management

  17. One Task, Divergent Solutions: High- versus Low-Status Sources and Social Comparison Guide Adaptation in a Computer-Supported Socio-Cognitive Conflict Task

    Science.gov (United States)

    Baumeister, Antonia E.; Engelmann, Tanja; Hesse, Friedrich W.

    2017-01-01

    This experimental study extends conflict elaboration theory (1) by revealing social influence dynamics for a knowledge-rich computer-supported socio-cognitive conflict task not investigated in the context of this theory before and (2) by showing the impact of individual differences in social comparison orientation. Students in two conditions…

  18. Unsupervised segmentation of task activated regions in fmRI

    DEFF Research Database (Denmark)

    Røge, Rasmus; Madsen, Kristoffer Hougaard; Schmidt, Mikkel Nørgaard

    2015-01-01

    Functional Magnetic Resonance Imaging has become a central measuring modality to quantify functional activation of the brain in both task and rest. Most analyses used to quantify functional activation require supervised approaches as employed in statistical parametric mapping (SPM) to extract... framework for the analysis of task fMRI and resting-state data in general where strong knowledge of how the task induces a BOLD response is missing....

  19. Use of Computer-Aided Tomography (CT) Imaging for Quantifying Coarse Roots, Rhizomes, Peat, and Particle Densities in Marsh Soils

    Science.gov (United States)

    Computer-aided Tomography (CT) imaging was utilized to quantify wet mass of coarse roots, rhizomes, and peat in cores collected from organic-rich (Jamaica Bay, NY) and mineral (North Inlet, SC) Spartina alterniflora soils. Calibration rods composed of materials with standard dens...

  20. From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task

    Science.gov (United States)

    Öman, Anne

    2017-01-01

    Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…

  1. Evaluation of a modified Fitts law brain-computer interface target acquisition task in able and motor disabled individuals

    Science.gov (United States)

    Felton, E. A.; Radwin, R. G.; Wilson, J. A.; Williams, J. C.

    2009-10-01

    A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals.
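
    The abstract does not state which formulation of Fitts' law was used; a minimal sketch, assuming the Shannon variant (index of difficulty ID = log2(A/W + 1) and information transfer rate ID/MT in bits per second), is given below. The amplitude, width, and acquisition time are hypothetical values, not data from the study.

```python
import math

def index_of_difficulty(amplitude: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(amplitude / width + 1)

def information_transfer_rate(amplitude: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s: task difficulty divided by acquisition time."""
    return index_of_difficulty(amplitude, width) / movement_time

# Example: the cursor travels 400 px to a 50 px target, acquired in 1.8 s.
print(information_transfer_rate(400, 50, 1.8))  # ~1.76 bits/s
```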

  2. A computational model of the fetal circulation to quantify blood redistribution in intrauterine growth restriction.

    Directory of Open Access Journals (Sweden)

    Patricia Garcia-Canadilla

    2014-06-01

    Full Text Available Intrauterine growth restriction (IUGR) due to placental insufficiency is associated with blood flow redistribution in order to maintain delivery of oxygenated blood to the brain. Given that, in the fetus, the aortic isthmus (AoI) is a key arterial connection between the cerebral and placental circulations, quantifying AoI blood flow has been proposed to assess this brain sparing effect in clinical practice. While numerous clinical studies have examined this parameter, fundamental understanding of its determinant factors and its quantitative relation with other aspects of haemodynamic remodeling has been limited. Computational models of the cardiovascular circulation have been proposed for exactly this purpose since they allow both for studying the contributions from isolated parameters as well as estimating properties that cannot be directly assessed from clinical measurements. Therefore, a computational model of the fetal circulation was developed, including the key elements related to fetal blood redistribution and using measured cardiac outflow profiles to allow personalization. The model was first calibrated using patient-specific Doppler data from a healthy fetus. Next, in order to understand the contributions of the main parameters determining blood redistribution, AoI and middle cerebral artery (MCA) flow changes were studied by variation of cerebral and peripheral-placental resistances. Finally, to study how this affects an individual fetus, the model was fitted to three IUGR cases with different degrees of severity. In conclusion, the proposed computational model provides a good approximation to assess blood flow changes in the fetal circulation. The results support that while MCA flow is mainly determined by a fall in brain resistance, the AoI flow is influenced by a balance between increased peripheral-placental and decreased cerebral resistances. Personalizing the model allows for quantifying the balance between cerebral and peripheral-placental resistances in an individual fetus.

  3. Microsyntactic Annotation of Corpora and its Use in Computational Linguistics Tasks

    Directory of Open Access Journals (Sweden)

    Iomdin Leonid

    2017-12-01

    Full Text Available Microsyntax is a linguistic discipline dealing with idiomatic elements whose important properties are strongly related to syntax. In a way, these elements may be viewed as transitional entities between the lexicon and the grammar, which explains why they are often underrepresented in both of these resource types: the lexicographer fails to see such elements as full-fledged lexical units, while the grammarian finds them too specific to justify the creation of individual well-developed rules. As a result, such elements are poorly covered by linguistic models used in advanced modern computational linguistic tasks like high-quality machine translation or deep semantic analysis. A possible way to mend the situation and improve the coverage and adequate treatment of microsyntactic units in linguistic resources is to develop corpora with microsyntactic annotation, closely linked to specially designed lexicons. The paper shows how this task is solved in the deeply annotated corpus of Russian, SynTagRus.

  4. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations

    Directory of Open Access Journals (Sweden)

    Andrea Stocco

    2018-04-01

    Full Text Available This article describes the data analyzed in the paper “Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model” (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  5. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    Science.gov (United States)

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  6. The effect of dynamic workstations on the performance of various computer and office-based tasks

    NARCIS (Netherlands)

    Burford, E.M.; Botter, J.; Commissaris, D.; Könemann, R.; Hiemstra-Van Mastrigt, S.; Ellegast, R.P.

    2013-01-01

    The effect of different workstations, conventional and dynamic, on different types of performance measures for several different office and computer based task was investigated in this research paper. The two dynamic workstations assessed were the Lifespan Treadmill Desk and the RightAngle

  7. The impact of goal-oriented task design on neurofeedback learning for brain-computer interface control.

    Science.gov (United States)

    McWhinney, S R; Tremblay, A; Boe, S G; Bardouille, T

    2018-02-01

    Neurofeedback training teaches individuals to modulate brain activity by providing real-time feedback and can be used for brain-computer interface control. The present study aimed to optimize training by maximizing engagement through goal-oriented task design. Participants were shown either a visual display or a robot, where each was manipulated using motor imagery (MI)-related electroencephalography signals. Those with the robot were instructed to quickly navigate grid spaces, as the potential for goal-oriented design to strengthen learning was central to our investigation. Both groups were hypothesized to show increased magnitude of these signals across 10 sessions, with the greatest gains being seen in those navigating the robot due to increased engagement. Participants demonstrated the predicted increase in magnitude, with no differentiation between hemispheres. Participants navigating the robot showed stronger left-hand MI increases than those with the computer display. This is likely due to success being reliant on maintaining strong MI-related signals. While older participants showed stronger signals in early sessions, this trend later reversed, suggesting greater natural proficiency but reduced flexibility. These results demonstrate capacity for modulating neurofeedback using MI over a series of training sessions, using tasks of varied design. Importantly, the more goal-oriented robot control task resulted in greater improvements.

  8. Motivation and performance within a collaborative computer-based modeling task: Relations between students' achievement goal orientation, self-efficacy, cognitive processing and achievement.

    NARCIS (Netherlands)

    Sins, Patrick H.M.; van Joolingen, Wouter; Savelsbergh, Elwin R.; van Hout-Wolters, Bernadette

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of mastery-approach goal orientation, performance-avoidance goal orientation, self-efficacy, and achievement were employed.

  9. Motivation and performance within a collaborative computer-based modeling task: Relations between students' achievement goal orientation, self-efficacy, cognitive processing and achievement

    NARCIS (Netherlands)

    Sins, P.H.M.; van Joolingen, W.R.; Savelsbergh, E.R.; van Hout-Wolters, B.H.A.M.

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of mastery-approach goal orientation, performance-avoidance goal orientation, self-efficacy, and achievement were employed.

  10. Selecting Tasks for Evaluating Human Performance as a Function of Gravity

    Science.gov (United States)

    Norcross, Jason R.; Gernhardt, Michael L.

    2011-01-01

    A challenge in understanding human performance as a function of gravity is determining which tasks to research. Initial studies began with treadmill walking, which was easy to quantify and control. However, with the development of pressurized rovers, it is less important to optimize human performance for ambulation as pressurized rovers will likely perform gross translation for them. Future crews are likely to spend much of their extravehicular activity (EVA) performing geology, construction, and maintenance type tasks. With these types of tasks, people have different performance strategies, and it is often difficult to quantify the task and measure steady-state metabolic rates or perform biomechanical analysis. For many of these types of tasks, subjective feedback may be the only data that can be collected. However, subjective data may not fully support a rigorous scientific comparison of human performance across different gravity levels and suit factors. NASA would benefit from having a wide variety of quantifiable tasks that allow human performance comparison across different conditions. In order to determine which tasks will effectively support scientific studies, many different tasks and data analysis techniques will need to be employed. Many of these tasks and techniques will not be effective, but some will produce quantifiable results that are sensitive enough to show performance differences. One of the primary concerns related to EVA performance is metabolic rate. The higher the metabolic rate, the faster the astronaut will exhaust consumables. The focus of this poster will be on how different tasks affect metabolic rate across different gravity levels.

  11. A multilevel modeling approach to examining individual differences in skill acquisition for a computer-based task.

    Science.gov (United States)

    Nair, Sankaran N; Czaja, Sara J; Sharit, Joseph

    2007-06-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies.
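
    The abstract does not give the model specification; a minimal sketch of one plausible multilevel (mixed-effects) model, with a random intercept and a random slope for practice day per participant and age and fluid intelligence as person-level predictors, is shown below. The column names, synthetic data, and effect sizes are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: one row per participant per practice day.
rng = np.random.default_rng(0)
n_subj, n_days = 52, 4
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_days),
    "day": np.tile(np.arange(1, n_days + 1), n_subj),
    "age": np.repeat(rng.integers(50, 81, n_subj), n_days),
    "fluid": np.repeat(rng.normal(0.0, 1.0, n_subj), n_days),
})
# Toy generative story: performance improves with practice, declines with age.
df["quality"] = (50 - 0.3 * df["age"] + 5 * df["fluid"]
                 + 3 * df["day"] + rng.normal(0, 2, len(df)))

# Random intercept and slope for day capture each participant's initial level
# and rate of change; fixed effects test age and fluid intelligence.
model = smf.mixedlm("quality ~ day + age + fluid", data=df,
                    groups=df["subject"], re_formula="~day")
print(model.fit().summary())
```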

  12. A novel task-oriented optimal design for P300-based brain-computer interfaces.

    Science.gov (United States)

    Zhou, Zongtan; Yin, Erwei; Liu, Yang; Jiang, Jun; Hu, Dewen

    2014-10-01

    Objective. The number of items of a P300-based brain-computer interface (BCI) should be adjustable in accordance with the requirements of the specific tasks. To address this issue, we propose a novel task-oriented optimal approach aimed at increasing the performance of general P300 BCIs with different numbers of items. Approach. First, we proposed a stimulus presentation with variable dimensions (VD) paradigm as a generalization of the conventional single-character (SC) and row-column (RC) stimulus paradigms. Furthermore, an embedding design approach was employed for any given number of items. Finally, based on the score-P model of each subject, the VD flash pattern was selected by a linear interpolation approach for a certain task. Main results. The results indicate that the optimal BCI design consistently outperforms the conventional approaches, i.e., the SC and RC paradigms. Specifically, there is significant improvement in the practical information transfer rate for a large number of items. Significance. The results suggest that the proposed optimal approach would provide useful guidance in the practical design of general P300-based BCIs.

  13. The effects of stimulus modality and task integrality: Predicting dual-task performance and workload from single-task levels

    Science.gov (United States)

    Hart, S. G.; Shively, R. J.; Vidulich, M. A.; Miller, R. C.

    1986-01-01

    The influence of stimulus modality and task difficulty on workload and performance was investigated. The goal was to quantify the cost (in terms of response time and experienced workload) incurred when essentially serial task components shared common elements (e.g., the response to one initiated the other) which could be accomplished in parallel. The experimental tasks were based on the Fittsberg paradigm; the solution to a Sternberg-type memory task determines which of two identical Fitts targets is acquired. Previous research suggested that such functionally integrated dual tasks are performed with substantially less workload and faster response times than would be predicted by summing single-task components when both are presented in the same stimulus modality (visual). The physical integration of task elements was varied (although their functional relationship remained the same) to determine whether dual-task facilitation would persist if task components were presented in different sensory modalities. Again, it was found that the cost of performing the two-stage task was considerably less than the sum of component single-task levels when both were presented visually. Less facilitation was found when task elements were presented in different sensory modalities. These results suggest the importance of distinguishing between concurrent tasks that compete for limited resources and those that beneficially share common resources when selecting the stimulus modalities for information displays.

  14. Improving our understanding of multi-tasking in healthcare: Drawing together the cognitive psychology and healthcare literature.

    Science.gov (United States)

    Douglas, Heather E; Raban, Magdalena Z; Walter, Scott R; Westbrook, Johanna I

    2017-03-01

    Multi-tasking is an important skill for clinical work which has received limited research attention. Its impacts on clinical work are poorly understood. In contrast, there is substantial multi-tasking research in cognitive psychology, driver distraction, and human-computer interaction. This review synthesises evidence of the extent and impacts of multi-tasking on efficiency and task performance from healthcare and non-healthcare literature, to compare and contrast approaches, identify implications for clinical work, and to develop an evidence-informed framework for guiding the measurement of multi-tasking in future healthcare studies. The results showed healthcare studies using direct observation have focused on descriptive studies to quantify concurrent multi-tasking and its frequency in different contexts, with limited study of impact. In comparison, non-healthcare studies have applied predominantly experimental and simulation designs, focusing on interleaved and concurrent multi-tasking, and testing theories of the mechanisms by which multi-tasking impacts task efficiency and performance. We propose a framework to guide the measurement of multi-tasking in clinical settings that draws together lessons from these siloed research efforts. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    task (Bahrami et al 2010, Fusaroli et al. 2012) we extend to linguistic coordination dynamical measures of recurrence employed in the analysis of sensorimotor coordination (such as heart-rate (Konvalinka et al 2011), postural sway (Shockley 2005) and eye-movements (Dale, Richardson and Kirkham 2012......). We employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities...

  16. Connected Car: Quantified Self becomes Quantified Car

    Directory of Open Access Journals (Sweden)

    Melanie Swan

    2015-02-01

    Full Text Available The automotive industry could be facing a situation of profound change and opportunity in the coming decades. There are a number of influencing factors such as increasing urban and aging populations, self-driving cars, 3D parts printing, energy innovation, and new models of transportation service delivery (Zipcar, Uber). The connected car means that vehicles are now part of the connected world, continuously Internet-connected, generating and transmitting data, which on the one hand can be helpfully integrated into applications, like real-time traffic alerts broadcast to smartwatches, but also raises security and privacy concerns. This paper explores the automotive connected world, and describes five killer QS (Quantified Self)-auto sensor applications that link quantified-self sensors (sensors that measure the personal biometrics of individuals like heart rate) and automotive sensors (sensors that measure driver and passenger biometrics or quantitative automotive performance metrics like speed and braking activity). The applications are fatigue detection, real-time assistance for parking and accidents, anger management and stress reduction, keyless authentication and digital identity verification, and DIY diagnostics. These kinds of applications help to demonstrate the benefit of connected world data streams in the automotive industry and beyond where, more fundamentally for human progress, the automation of both physical and now cognitive tasks is underway.

  17. Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track

    Science.gov (United States)

    2015-11-20

    Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track. Paul N. Bennett, Microsoft Research, Redmond, USA. ...anchor text graph has proven useful in the general realm of query reformulation [2], we sought to quantify the value of extracting key phrases from...anchor text in the broader setting of the task understanding track. Given a query, our approach considers a simple method for identifying a relevant

  18. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability. That means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  19. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the likelihood of human error in both quantitative and qualitative terms, has gradually increased because of the effects of human errors on system safety. HRA requires a task analysis as a prerequisite step, but existing task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analysts perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in the CASIA's database, HRA analysts can freely share and readily disseminate their analysis experiences, thereby improving the quality of HRA. 35 refs., 38 figs., 25 tabs. (Author)

  20. Task-Driven Optimization of Fluence Field and Regularization for Model-Based Iterative Reconstruction in Computed Tomography.

    Science.gov (United States)

    Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster

    2017-12-01

    This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective design of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributed fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all strategies, the task-driven FFM strategy not only improved the minimum detectability index by at least 17.8%, but also yielded higher detectability over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on the computed detectability index. The improvements in detectability show the potential of the task-driven imaging framework to improve imaging performance at a fixed dose, or, equivalently, to provide a similar level of performance at reduced dose.

  1. Human factors with nonhumans - Factors that affect computer-task performance

    Science.gov (United States)

    Washburn, David A.

    1992-01-01

    There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.

  2. Computer-mediated communication and time pressure induce higher cardiovascular responses in the preparatory and execution phases of cooperative tasks.

    Science.gov (United States)

    Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia

    2010-11-01

    The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and with different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.

  3. Quality assurance for computed-tomography simulators and the computed-tomography-simulation process: Report of the AAPM Radiation Therapy Committee Task Group No. 66

    International Nuclear Information System (INIS)

    Mutic, Sasa; Palta, Jatinder R.; Butker, Elizabeth K.; Das, Indra J.; Huq, M. Saiful; Loo, Leh-Nien Dick; Salter, Bill J.; McCollough, Cynthia H.; Van Dyk, Jacob

    2003-01-01

    This document presents recommendations of the American Association of Physicists in Medicine (AAPM) for quality assurance of computed-tomography (CT) simulators and the CT-simulation process. This report was prepared by Task Group No. 66 of the AAPM Radiation Therapy Committee. It was approved by the Radiation Therapy Committee and by the AAPM Science Council.

  4. Computer-mediated communication: task performance and satisfaction.

    Science.gov (United States)

    Simon, Andrew F

    2006-06-01

    The author assessed satisfaction and performance on 3 tasks (idea generation, intellective, judgment) among 75 dyads (N = 150) working through 1 of 3 modes of communication (instant messaging, videoconferencing, face to face). The author based predictions on the Media Naturalness Theory (N. Kock, 2001, 2002) and on findings from past researchers (e.g., D. M. DeRosa, C. Smith, & D. A. Hantula, in press) of the interaction between tasks and media. The present author did not identify task performance differences, although satisfaction with the medium was lower among those dyads communicating through an instant-messaging system than among those interacting face to face or through videoconferencing. The findings support the Media Naturalness Theory. The author discussed them in relation to the participants' frequent use of instant messaging and their familiarity with new communication media.

  5. Objective threshold for distinguishing complicated tasks

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    Estimating the likelihood of human error in a reliable manner is critically important for enhancing the safety of a large process control system such as a Nuclear Power Plant (NPP). In this regard, from the point of view of Probabilistic Safety Assessment (PSA), various kinds of Human Reliability Analysis (HRA) methods have been used for several decades in order to systematically evaluate the effect of human error on the safety of NPPs. However, one of the recurring issues is how to determine the level of an important Performance Shaping Factor (PSF) in a clear and objective manner with respect to the context of a given task. Unfortunately, there is no such criterion for a certain PSF such as the complexity of a task. For this reason, in this study, an objective criterion that is helpful for identifying a complicated task is suggested based on the Task Complexity (TACOM) measure. To this end, subjective difficulty scores rated by high-speed train drivers were collected. After that, the subjective difficulty scores were compared with the associated TACOM scores quantified for the tasks conducted by the high-speed train drivers. As a result, it is expected that high-speed train drivers feel a significant difficulty when they are faced with tasks whose TACOM scores are greater than 4.2. Since the TACOM measure is a general tool to quantify the complexity of tasks to be done by human operators, it is reasonable to conclude that this value can be regarded as a common threshold representing what a complicated task is.

  6. Job Management and Task Bundling

    Directory of Open Access Journals (Sweden)

    Berkowitz Evan

    2018-01-01

    Full Text Available High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users’ current workflows or executables.

  7. Job Management and Task Bundling

    Science.gov (United States)

    Berkowitz, Evan; Jansen, Gustav R.; McElvain, Kenneth; Walker-Loud, André

    2018-03-01

    High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.

  8. Task Uncertainty Can Account for Mixing and Switch Costs in Task-Switching

    Science.gov (United States)

    Rennie, Jaime L.

    2015-01-01

    Cognitive control is required in situations that involve uncertainty or change, such as when resolving conflict, selecting responses and switching tasks. Recently, it has been suggested that cognitive control can be conceptualised as a mechanism which prioritises goal-relevant information to deal with uncertainty. This hypothesis has been supported using a paradigm that requires conflict resolution. In this study, we examine whether cognitive control during task switching is also consistent with this notion. We used information theory to quantify the level of uncertainty in different trial types during a cued task-switching paradigm. We test the hypothesis that differences in uncertainty between task repeat and task switch trials can account for typical behavioural effects in task-switching. Increasing uncertainty was associated with less efficient performance (i.e., slower and less accurate), particularly on switch trials and trials that afford little opportunity for advance preparation. Interestingly, both mixing and switch costs were associated with a common episodic control process. These results support the notion that cognitive control may be conceptualised as an information processor that serves to resolve uncertainty in the environment. PMID:26107646
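
    The abstract does not reproduce the information-theoretic calculation; a minimal sketch of how Shannon entropy could quantify trial-type uncertainty in a cued task-switching design is given below. The trial-type probabilities are hypothetical.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical design: on a repeat trial the upcoming task is fully
# predictable, on a switch trial either of two remaining tasks may be cued.
repeat_trial = [1.0]
switch_trial = [0.5, 0.5]

print(entropy(repeat_trial))  # 0.0 bits -> low uncertainty
print(entropy(switch_trial))  # 1.0 bit  -> higher uncertainty
```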

  9. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
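
    The fitting procedure is not detailed in the abstract; a minimal sketch, assuming daily count data and maximum-likelihood fits from SciPy, of fitting a log-normal distribution to vulnerability arrivals and an exponential distribution to deletions is shown below. The synthetic series merely stand in for the operational "wild" data described in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Placeholder daily counts standing in for operational vulnerability feeds.
arrivals = rng.lognormal(mean=2.0, sigma=0.7, size=365)   # always > 0
deletions = rng.exponential(scale=5.0, size=365)

# Maximum-likelihood fits; the fitted parameters could serve as priors
# for a later Bayesian analysis, as the abstract suggests.
shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
loc_e, scale_e = stats.expon.fit(deletions, floc=0)

# Goodness of fit via Kolmogorov-Smirnov tests.
print(stats.kstest(arrivals, "lognorm", args=(shape, loc, scale)))
print(stats.kstest(deletions, "expon", args=(loc_e, scale_e)))
```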

  10. Quantifying the visual appearance of sunscreens applied to the skin using indirect computer image colorimetry.

    Science.gov (United States)

    Richer, Vincent; Kharazmi, Pegah; Lee, Tim K; Kalia, Sunil; Lui, Harvey

    2018-03-01

    There is no accepted method to objectively assess the visual appearance of sunscreens on the skin. We present a method for sunscreen application, digital photography, and computer analysis to quantify the appearance of the skin after sunscreen application. Four sunscreen lotions were applied randomly at densities of 0.5, 1.0, 1.5, and 2.0 mg/cm² to areas of the back of 29 subjects. Each application site had a matched contralateral control area. High-resolution standardized photographs including a color card were taken after sunscreen application. After color balance correction, CIE L*a*b* color values were extracted from paired sites. Differences in skin appearance attributed to sunscreen were represented by ΔE, which in turn was calculated as the Euclidean distance within the L*a*b* color space between the paired sites. Sunscreen visibility as measured by median ΔE varied across different products and application densities and ranged between 1.2 and 12.1. The visibility of sunscreens varied according to product SPF, composition (organic vs inorganic), presence of tint, and baseline b* of skin. Indirect computer image colorimetry represents a potential method to objectively quantify the visibility of sunscreen on the skin. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
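
    A minimal sketch of the colour-difference computation described in the abstract (ΔE as the Euclidean distance between paired L*a*b* measurements, i.e., the CIE76 formula) is shown below; the sample colour values are hypothetical.

```python
import numpy as np

def delta_e(lab_treated, lab_control):
    """CIE76 colour difference: Euclidean distance in CIE L*a*b* space."""
    return float(np.linalg.norm(np.asarray(lab_treated) - np.asarray(lab_control)))

# Hypothetical paired measurements: a sunscreen site and its contralateral control.
sunscreen_site = (68.2, 12.1, 18.9)   # L*, a*, b*
control_site = (63.5, 14.0, 21.4)

print(delta_e(sunscreen_site, control_site))  # ~5.65: a visible difference
```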

  11. Quantifying the distribution of paste-void spacing of hardened cement paste using X-ray computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Tae Sup, E-mail: taesup@yonsei.ac.kr [School of Civil and Environmental Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul, 120-749 (Korea, Republic of); Kim, Kwang Yeom, E-mail: kimky@kict.re.kr [Korea Institute of Construction Technology, 283 Goyangdae-ro, Ilsanseo-gu, Goyang, 411-712 (Korea, Republic of); Choo, Jinhyun, E-mail: jinhyun@stanford.edu [Department of Civil and Environmental Engineering, Stanford University, Stanford, CA 94305 (United States); Kang, Dong Hun, E-mail: timeriver@naver.com [School of Civil and Environmental Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul, 120-749 (Korea, Republic of)

    2012-11-15

    The distribution of paste-void spacing in cement-based materials is an important feature related to the freeze-thaw durability of these materials, but its reliable estimation remains an unresolved problem. Herein, we evaluate the capability of X-ray computed tomography (CT) for reliable quantification of the distribution of paste-void spacing. Using X-ray CT images of three mortar specimens having different air-entrainment characteristics, we calculate the distributions of paste-void spacing of the specimens by applying previously suggested methods for deriving the exact spacing of air-void systems. This methodology is assessed by comparing the 95th percentile of the cumulative distribution function of the paste-void spacing with spacing factors computed by applying the linear-traverse method to the 3D air-void system and reconstructing the equivalent air-void distribution in 3D. Results show that the distributions of equivalent void diameter and paste-void spacing follow lognormal and normal distributions, respectively, and the ratios between the 95th percentile paste-void spacing value and the spacing factors reside within the ranges reported by previous numerical studies. This experimental finding indicates that the distribution of paste-void spacing quantified using X-ray CT has the potential to be the basis for a statistical assessment of the freeze-thaw durability of cement-based materials. - Highlights: ► The paste-void spacing in 3D can be quantified by X-ray CT. ► The distribution of the paste-void spacing follows normal distribution. ► The spacing factor and 95th percentile of CDF of paste-void spacing are correlated.
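
    The abstract compares the 95th percentile of the paste-void spacing distribution with the conventional spacing factor; a minimal sketch of that percentile calculation, assuming a normal fit to spacing values extracted from the CT air-void map, could look like the following. The spacing values are synthetic placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Placeholder paste-void spacings (micrometres) extracted from a 3D CT air-void map.
spacings = rng.normal(loc=250.0, scale=60.0, size=2000)

# The abstract reports that spacings follow a normal distribution: fit it and
# take the 95th percentile of the cumulative distribution function.
mu, sigma = stats.norm.fit(spacings)
p95 = stats.norm.ppf(0.95, loc=mu, scale=sigma)

# This value would then be compared against the linear-traverse spacing factor.
print(f"95th percentile paste-void spacing: {p95:.1f} um")
```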

  12. Motivation and performance within a collaborative computer-based modeling task: Relations between students' achievement goal orientation, self-efficacy, cognitive processing and achievement

    OpenAIRE

    Sins, P.H.M.; van Joolingen, W.R.; Savelsbergh, E.R.; van Hout-Wolters, B.H.A.M.

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of mastery-approach goal orientation, performance-avoidance goal orientation, self-efficacy, and achievement were employed. Students’ cognitive processing was a...

  13. Model Checking Quantified Computation Tree Logic

    NARCIS (Netherlands)

    Rensink, Arend; Baier, C; Hermanns, H.

    2006-01-01

    Propositional temporal logic is not suitable for expressing properties on the evolution of dynamically allocated entities over time. In particular, it is not possible to trace such entities through computation steps, since this requires the ability to freely mix quantification and temporal operators.

  14. Waiting is the hardest part: comparison of two computational strategies for performing a compelled-response task

    Directory of Open Access Journals (Sweden)

    Emilio Salinas

    2010-12-01

    Full Text Available The neural basis of choice behavior is commonly investigated with tasks in which a subject analyzes a stimulus and reports his or her perceptual experience with an appropriate motor action. We recently developed a novel task, the compelled-saccade task, with which the influence of the sensory information on the subject's choice can be tracked through time with millisecond resolution, thus providing a new tool for correlating neuronal activity and behavior. This paradigm has a crucial feature: the signal that instructs the subject to make an eye movement is given before the cue that indicates which of two possible choices is the correct one. Previously, we found that psychophysical performance in this task could be accurately replicated by a model in which two developing oculomotor plans race to a threshold and the incoming perceptual information differentially accelerates their trajectories toward it. However, the task design suggests an alternative mechanism: instead of modifying an ongoing oculomotor plan on the fly as the sensory information becomes available, the subject could try to wait, withholding the oculomotor response until the sensory cue is revealed. Here, we use computer simulations to explore and compare the performance of these two types of models. We find that both reproduce the main features of the psychophysical data in the compelled-saccade task, but they give rise to distinct behavioral and neurophysiological predictions. Although, superficially, the waiting model is intuitively appealing, it is ultimately inconsistent with experimental results from this and other tasks.

  15. Validating the appropriateness of TACOM measure: Comparing TACOM scores with subjective workload scores quantified by NASA-TLX technique

    International Nuclear Information System (INIS)

    Park, J.; Jung, W.

    2006-01-01

    In this study, the appropriateness of the task complexity (TACOM) measure that can quantify the complexity of emergency tasks was investigated by comparing subjective workload scores with the associated TACOM scores. To this end, based on the NASA-TLX (task load index) technique, 18 operators were asked to subjectively estimate perceived workload for 23 emergency tasks that were specified in the emergency operating procedures of the reference nuclear power plants. As the result of comparisons, it was observed that subjective workload scores increase in proportion to the increase of TACOM scores. Therefore, it is expected that the TACOM measure can be used as a serviceable method to quantify the complexity of emergency tasks. (authors)

  16. Validating the appropriateness of TACOM measure: Comparing TACOM scores with subjective workload scores quantified by NASA-TLX technique

    Energy Technology Data Exchange (ETDEWEB)

    Park, J.; Jung, W. [Integrated Safety Assessment Div., Korea Atomic Energy Research Inst., P.O.Box 105, Duckjin-Dong, Yusong-Ku, Taejon, 305-600 (Korea, Republic of)

    2006-07-01

    In this study, the appropriateness of the task complexity (TACOM) measure that can quantify the complexity of emergency tasks was investigated by comparing subjective workload scores with the associated TACOM scores. To this end, based on the NASA-TLX (task load index) technique, 18 operators were asked to subjectively estimate perceived workload for 23 emergency tasks that were specified in the emergency operating procedures of the reference nuclear power plants. As the result of comparisons, it was observed that subjective workload scores increase in proportion to the increase of TACOM scores. Therefore, it is expected that the TACOM measure can be used as a serviceable method to quantify the complexity of emergency tasks. (authors)

  17. Quantifier spreading in child eye movements: A case of the Russian quantifier kazhdyj ‘every'

    Directory of Open Access Journals (Sweden)

    Irina A. Sekerina

    2017-07-01

    Full Text Available Extensive cross-linguistic work has documented that children up to the age of 9–10 make errors when performing a sentence-picture verification task that pairs spoken sentences with the universal quantifier 'every' and pictures with entities in partial one-to-one correspondence. These errors stem from children’s difficulties in restricting the domain of a universal quantifier to the appropriate noun phrase and are referred to in the literature as quantifier-spreading (q-spreading). We adapted the task to be performed in conjunction with eye-movement recordings using the Visual World Paradigm. Russian-speaking 5-to-6-year-old children (N = 31) listened to sentences like 'Kazhdyj alligator lezhit v vanne' ‘Every alligator is lying in a bathtub’ and viewed pictures with three alligators, each in a bathtub, and two extra empty bathtubs. Non-spreader children (N = 12) were adult-like in their accuracy whereas q-spreading ones (N = 19) were only 43% correct in interpreting such sentences compared to the control sentences. Eye movements of q-spreading children revealed that more looks to the extra containers (two empty bathtubs) correlated with higher error rates, reflecting the processing pattern of q-spreading. In contrast, more looks to the distractors in control sentences did not lead to errors in interpretation. We argue that q-spreading errors are caused by interference from the extra entities in the visual context, and our results support the processing difficulty account of acquisition of quantification. Interference results in cognitive overload as children have to integrate multiple sources of information, i.e., visual context with salient extra entities and the spoken sentence in which these entities are mentioned, in real-time processing. This article is part of the special collection: Acquisition of Quantification

  18. Learning Style and Task Performance in Synchronous Computer-Mediated Communication: A Case Study of Iranian EFL Learners

    Science.gov (United States)

    Hedayati, Mohsen; Foomani, Elham Mohammadi

    2015-01-01

    The study reported here explores whether English as a Foreign Language (EFL) learners' preferred ways of learning (i.e., learning styles) affect their task performance in computer-mediated communication (CMC). As Ellis (2010) points out, while the increasing use of different sorts of technology is witnessed in language learning contexts, it is…

  19. Task uncertainty can account for mixing and switch costs in task-switching.

    Directory of Open Access Journals (Sweden)

    Patrick S Cooper

    Full Text Available Cognitive control is required in situations that involve uncertainty or change, such as when resolving conflict, selecting responses and switching tasks. Recently, it has been suggested that cognitive control can be conceptualised as a mechanism which prioritises goal-relevant information to deal with uncertainty. This hypothesis has been supported using a paradigm that requires conflict resolution. In this study, we examine whether cognitive control during task switching is also consistent with this notion. We used information theory to quantify the level of uncertainty in different trial types during a cued task-switching paradigm. We tested the hypothesis that differences in uncertainty between task-repeat and task-switch trials can account for typical behavioural effects in task-switching. Increasing uncertainty was associated with less efficient performance (i.e., slower and less accurate), particularly on switch trials and on trials that afford little opportunity for advance preparation. Interestingly, both mixing and switch costs were associated with a common episodic control process. These results support the notion that cognitive control may be conceptualised as an information processor that serves to resolve uncertainty in the environment.
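
    The information-theoretic quantification mentioned above can be illustrated with Shannon entropy: the uncertainty about the upcoming task is lower on repeat trials than on switch trials. A minimal sketch, with hypothetical outcome probabilities rather than the study's actual trial statistics:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2 p) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical outcome distributions (illustrative only): on a repeat
# trial the relevant task is more predictable than on a switch trial
# in a two-task cued paradigm with unequal cue-task contingencies.
repeat_trial = [0.8, 0.2]   # likely task vs unlikely task
switch_trial = [0.5, 0.5]   # maximal uncertainty about the upcoming task

print("Repeat-trial uncertainty:", shannon_entropy(repeat_trial), "bits")
print("Switch-trial uncertainty:", shannon_entropy(switch_trial), "bits")
```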

  20. Quantum tasks in Minkowski space

    International Nuclear Information System (INIS)

    Kent, Adrian

    2012-01-01

    The fundamental properties of quantum information and its applications to computing and cryptography have been greatly illuminated by considering information-theoretic tasks that are provably possible or impossible within non-relativistic quantum mechanics. I describe here a general framework for defining tasks within (special) relativistic quantum theory and illustrate it with examples from relativistic quantum cryptography and relativistic distributed quantum computation. The framework gives a unified description of all tasks previously considered and also defines a large class of new questions about the properties of quantum information in relation to Minkowski causality. It offers a way of exploring interesting new fundamental tasks and applications, and also highlights the scope for a more systematic understanding of the fundamental information-theoretic properties of relativistic quantum theory. (paper)

  1. Adaptive Cost-Based Task Scheduling in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Mohammed A. S. Mosleh

    2016-01-01

    Full Text Available Task execution in cloud computing requires obtaining stored data from remote data centers. Though this storage process reduces the memory constraints of the user's computer, the time deadline is a serious concern. In this paper, Adaptive Cost-based Task Scheduling (ACTS) is proposed to provide data access to the virtual machines (VMs) within the deadline without increasing the cost. ACTS considers the data access completion time when selecting the most cost-effective path to access the data. To allocate data access paths, the data access completion time is computed by considering the mean and variance of the network service time and the arrival rate of network input/output requests. Task priorities are then assigned based on the data access time. Finally, the costs of the data paths are analyzed and paths are allocated based on task priority: minimum-cost paths are allocated to low-priority tasks and fast access paths are allocated to high-priority tasks so as to meet the time deadline. Thus, efficient task scheduling can be achieved by using ACTS. Experimental results in terms of execution time, computation cost, communication cost, bandwidth, and CPU utilization show that the proposed algorithm provides better performance than state-of-the-art methods.
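
    The allocation idea can be sketched roughly: estimate each task's data access completion time from the mean and variance of the network service time and the request arrival rate, then give faster paths to higher-priority tasks and cheaper paths to lower-priority ones. The queueing estimate, names and numbers below are illustrative assumptions, not the paper's ACTS algorithm.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    mean_service_time: float   # mean network service time (s)
    service_variance: float    # variance of network service time
    arrival_rate: float        # network I/O request arrival rate (req/s)

@dataclass
class Path:
    name: str
    access_time: float  # expected data access time (s)
    cost: float         # monetary cost of using the path

def estimated_access_time(task: Task) -> float:
    # Illustrative M/G/1-style estimate: waiting time grows with the
    # arrival rate and the second moment of the service time.
    rho = task.arrival_rate * task.mean_service_time
    second_moment = task.service_variance + task.mean_service_time ** 2
    wait = task.arrival_rate * second_moment / (2 * max(1e-9, 1 - rho))
    return task.mean_service_time + wait

def assign_paths(tasks, paths):
    # Higher estimated access time -> higher priority -> faster path;
    # lower-priority tasks get the cheaper (slower) paths.
    tasks_by_priority = sorted(tasks, key=estimated_access_time, reverse=True)
    paths_fast_first = sorted(paths, key=lambda p: p.access_time)
    return {t.name: p.name for t, p in zip(tasks_by_priority, paths_fast_first)}

tasks = [Task("t1", 0.10, 0.01, 4.0), Task("t2", 0.05, 0.002, 2.0)]
paths = [Path("cheap", 0.50, 1.0), Path("fast", 0.10, 3.0)]
print(assign_paths(tasks, paths))
```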

  2. Teaching Sustainable Process Design Using 12 Systematic Computer-Aided Tasks

    DEFF Research Database (Denmark)

    Babi, Deenesh K.

    2015-01-01

    In this paper a task-based approach for teaching (sustainable) process design to students pursuing a degree in chemical and biochemical engineering is presented. In tasks 1-3 the student makes design decisions for product and process selection, followed by simple and rigorous model simulations (tasks 4-7) and then sizing, costing and economic analysis of the designed process (tasks 8-9). This produces a base case design. In tasks 10-12, the student explores opportunities for heat and/or mass integration, followed by a sustainability analysis, in order to evaluate the base case design and set targets for further improvement. Finally, a process optimization problem is formulated and solved to obtain the more sustainable process design. The 12 tasks are explained in terms of the input and output of each task, and examples of the application of this approach in an MSc-level course are reported.

  3. Computing the Quartet Distance Between Evolutionary Trees in Time O(n log n)

    DEFF Research Database (Denmark)

    Brodal, Gerth Sølfting; Fagerberg, Rolf; Pedersen, Christian Nørgaard Storm

    2003-01-01

    Evolutionary trees describing the relationship for a set of species are central in evolutionary biology, and quantifying differences between evolutionary trees is therefore an important task. The quartet distance is a distance measure between trees previously proposed by Estabrook, McMorris, and Meacham. The quartet distance between two unrooted evolutionary trees is the number of quartet topology differences between the two trees, where a quartet topology is the topological subtree induced by four species. In this paper we present an algorithm for computing the quartet distance between two unrooted evolutionary trees of n species, where all internal nodes have degree three, in time O(n log n). The previous best algorithm for the problem uses time O(n^2).
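
    For small trees the quartet distance can be computed by brute force using the four-point condition on path lengths: the induced topology of a quartet is the pairing with the smallest sum of path lengths. The paper's O(n log n) algorithm is far more involved; the sketch below only illustrates the definition on two hand-made 5-leaf trees.

```python
from itertools import combinations
from collections import deque

def bfs_distances(adj, source):
    """Unit-length shortest-path distances from source in an unrooted tree."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def quartet_topology(dist, a, b, c, d):
    """Pairing of {a,b,c,d} induced by the tree (four-point condition):
    the pairing with the smallest sum of path lengths is the induced topology."""
    sums = {
        frozenset([frozenset([a, b]), frozenset([c, d])]): dist[a][b] + dist[c][d],
        frozenset([frozenset([a, c]), frozenset([b, d])]): dist[a][c] + dist[b][d],
        frozenset([frozenset([a, d]), frozenset([b, c])]): dist[a][d] + dist[b][c],
    }
    return min(sums, key=sums.get)

def quartet_distance(adj1, adj2, leaves):
    dist1 = {x: bfs_distances(adj1, x) for x in leaves}
    dist2 = {x: bfs_distances(adj2, x) for x in leaves}
    return sum(
        quartet_topology(dist1, *q) != quartet_topology(dist2, *q)
        for q in combinations(leaves, 4)
    )

# Two 5-leaf binary trees that differ by swapping leaves b and c
# (leaves a-e, internal nodes x, y, z); the quartet distance is 2.
t1 = {"a": ["x"], "b": ["x"], "x": ["a", "b", "y"],
      "c": ["y"], "y": ["x", "c", "z"], "d": ["z"], "e": ["z"], "z": ["y", "d", "e"]}
t2 = {"a": ["x"], "c": ["x"], "x": ["a", "c", "y"],
      "b": ["y"], "y": ["x", "b", "z"], "d": ["z"], "e": ["z"], "z": ["y", "d", "e"]}
print(quartet_distance(t1, t2, ["a", "b", "c", "d", "e"]))
```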

  4. Efficient task assignment in spatial crowdsourcing with worker and task privacy protection

    KAUST Repository

    Liu, An

    2017-08-01

    Spatial crowdsourcing (SC) outsources tasks to a set of workers who are required to physically move to specified locations and accomplish tasks. Recently, it has emerged as a promising tool for emergency management, as it enables efficient and cost-effective collection of critical information in emergencies such as earthquakes, when searching for and rescuing survivors in potential areas is required. However, in current SC systems, task locations and worker locations are all exposed in public without any privacy protection. SC systems, if attacked, thus run a potential risk of privacy leakage. In this paper, we propose a protocol for protecting the privacy of both workers and task requesters while maintaining the functionality of SC systems. The proposed protocol is built on partially homomorphic encryption schemes, and can efficiently realize the complex operations required during task assignment over encrypted data through a well-designed computation strategy. We prove that the proposed protocol is privacy-preserving against semi-honest adversaries. Simulation on two real-world datasets shows that the proposed protocol is more effective than existing solutions and can achieve mutual privacy preservation with acceptable computation and communication cost.

  5. Computational modelling and analysis of hippocampal-prefrontal information coding during a spatial decision-making task

    Directory of Open Access Journals (Sweden)

    Thomas eJahans-Price

    2014-03-01

    Full Text Available We introduce a computational model describing rat behaviour and the interactions of neural populations processing spatial and mnemonic information during a maze-based, decision-making task. The model integrates sensory input and implements a working memory to inform decisions at a choice point, reproducing rat behavioural data and predicting the occurrence of turn- and memory-dependent activity in neuronal networks supporting task performance. We tested these model predictions using a new software toolbox (Maze Query Language, MQL) to analyse activity of medial prefrontal cortical (mPFC) and dorsal hippocampal (dCA1) neurons recorded from 6 adult rats during task performance. The firing rates of dCA1 neurons discriminated context (i.e. the direction of the previous turn), whilst a subset of mPFC neurons was selective for current turn direction or context, with some conjunctively encoding both. mPFC turn-selective neurons displayed a ramping of activity on approach to the decision turn, and turn-selectivity in mPFC was significantly reduced during error trials. These analyses complement data from neurophysiological recordings in non-human primates indicating that firing rates of cortical neurons correlate with integration of sensory evidence used to inform decision-making.

  6. Quantifying the uncertainty in heritability.

    Science.gov (United States)

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.
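
    The Bayesian route described above can be illustrated with a toy grid posterior for the variance-ratio parameterisation y ~ N(0, h2*K + (1-h2)*I), where K is a genetic relatedness matrix. The simulated data and flat prior below are purely illustrative; the paper develops a computationally efficient version of this general idea.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Simulate a small cohort with a known heritability (illustrative only).
rng = np.random.default_rng(1)
n = 120
G = rng.standard_normal((n, 400))                 # simulated genotypes
G = (G - G.mean(0)) / G.std(0)
K = G @ G.T / G.shape[1]                          # relatedness matrix

true_h2 = 0.5
cov_true = true_h2 * K + (1 - true_h2) * np.eye(n)
y = rng.multivariate_normal(np.zeros(n), cov_true)

# Flat prior on h2, posterior evaluated on a grid (total variance fixed at 1).
grid = np.linspace(0.01, 0.99, 99)
log_post = np.array([
    multivariate_normal.logpdf(y, mean=np.zeros(n), cov=h2 * K + (1 - h2) * np.eye(n))
    for h2 in grid
])
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Posterior mean and spread quantify the uncertainty in the heritability estimate.
mean_h2 = float((grid * post).sum())
sd_h2 = float(np.sqrt(((grid - mean_h2) ** 2 * post).sum()))
print(f"posterior mean h2 ~ {mean_h2:.2f}, posterior sd ~ {sd_h2:.2f}")
```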

  7. Independent tasks scheduling in cloud computing via improved estimation of distribution algorithm

    Science.gov (United States)

    Sun, Haisheng; Xu, Rui; Chen, Huaping

    2018-04-01

    To minimize makespan for scheduling independent tasks in cloud computing, an improved estimation of distribution algorithm (IEDA) is proposed to tackle the investigated problem in this paper. Considering that the problem is a multi-dimensional discrete problem, an improved population-based incremental learning (PBIL) algorithm is applied, in which the parameter for each component is independent of the other components. In order to improve the performance of PBIL, on the one hand, an integer encoding scheme is used and the probability calculation of PBIL is improved by using the average task processing time; on the other hand, an effective adaptive learning rate function related to the number of iterations is constructed to trade off the exploration and exploitation of IEDA. In addition, enhanced Max-Min and Min-Min algorithms are introduced to form two initial individuals. In the proposed IEDA, an improved genetic algorithm (IGA) is applied to generate part of the initial population by evolving the two initial individuals, and the rest of the initial individuals are generated at random. Finally, the sampling process is divided into two parts: sampling by the probabilistic model and by the IGA, respectively. The experimental results show that the proposed IEDA not only obtains better solutions, but also has a faster convergence speed.
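
    The PBIL idea with integer encoding can be illustrated with a minimal sketch: keep one categorical distribution per task over machines, sample schedules, and shift each distribution toward the best sampled schedule with an iteration-dependent learning rate. This is a simplified stand-in, without the Max-Min/Min-Min seeding or the embedded genetic algorithm of the proposed IEDA.

```python
import random

def makespan(assignment, proc_time):
    """Completion time of the busiest machine for a task->machine assignment."""
    n_machines = len(proc_time[0])
    load = [0.0] * n_machines
    for task, machine in enumerate(assignment):
        load[machine] += proc_time[task][machine]
    return max(load)

def pbil_schedule(proc_time, n_machines, iterations=200, samples=30):
    n_tasks = len(proc_time)
    # One independent categorical distribution per task (integer encoding).
    prob = [[1.0 / n_machines] * n_machines for _ in range(n_tasks)]
    best, best_cost = None, float("inf")
    for it in range(iterations):
        # Adaptive learning rate: explore early, exploit later.
        lr = 0.05 + 0.25 * it / iterations
        population = [
            [random.choices(range(n_machines), weights=prob[t])[0] for t in range(n_tasks)]
            for _ in range(samples)
        ]
        population.sort(key=lambda a: makespan(a, proc_time))
        if makespan(population[0], proc_time) < best_cost:
            best, best_cost = population[0][:], makespan(population[0], proc_time)
        # Shift each task's distribution toward the best sampled assignment.
        for t in range(n_tasks):
            for m in range(n_machines):
                target = 1.0 if population[0][t] == m else 0.0
                prob[t][m] = (1 - lr) * prob[t][m] + lr * target
    return best, best_cost

# proc_time[i][j]: processing time of task i on (heterogeneous) machine j.
proc_time = [[4, 7, 3], [2, 5, 8], [6, 3, 4], [5, 5, 2], [3, 6, 7]]
print(pbil_schedule(proc_time, n_machines=3))
```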

  8. Bridges to Swaziland: Using Task-Based Learning and Computer-Mediated Instruction to Improve English Language Teaching and Learning

    Science.gov (United States)

    Pierson, Susan Jacques

    2015-01-01

    One way to provide high quality instruction for underserved English Language Learners around the world is to combine Task-Based English Language Learning with Computer-Assisted Instruction. As part of an ongoing project, "Bridges to Swaziland," these approaches have been implemented in a determined effort to improve the ESL program for…

  9. Reasoning Abilities in Primary School: A Pilot Study on Poor Achievers vs. Normal Achievers in Computer Game Tasks

    Science.gov (United States)

    Dagnino, Francesca Maria; Ballauri, Margherita; Benigno, Vincenza; Caponetto, Ilaria; Pesenti, Elia

    2013-01-01

    This paper presents the results of preliminary research on the assessment of reasoning abilities in primary school poor achievers vs. normal achievers using computer game tasks. Subjects were evaluated by means of cognitive assessment on logical abilities and academic skills. The aim of this study is to better understand the relationship between…

  10. Quantifying light pollution

    International Nuclear Information System (INIS)

    Cinzano, P.; Falchi, F.

    2014-01-01

    In this paper we review new available indicators useful to quantify and monitor light pollution, defined as the alteration of the natural quantity of light in the night environment due to introduction of manmade light. With the introduction of recent radiative transfer methods for the computation of light pollution propagation, several new indicators become available. These indicators represent a primary step in light pollution quantification, beyond the bare evaluation of the night sky brightness, which is an observational effect integrated along the line of sight and thus lacking the three-dimensional information. - Highlights: • We review new available indicators useful to quantify and monitor light pollution. • These indicators are a primary step in light pollution quantification. • These indicators allow to improve light pollution mapping from a 2D to a 3D grid. • These indicators allow carrying out a tomography of light pollution. • We show an application of this technique to an Italian region

  11. Quantifying multiscale porosity and fracture aperture distribution in granite cores using computed tomography

    Science.gov (United States)

    Wenning, Quinn; Madonna, Claudio; Joss, Lisa; Pini, Ronny

    2017-04-01

    Knowledge of porosity and fracture (aperture) distribution is key towards a sound description of fluid transport in low-permeability rocks. In the context of geothermal energy development, the ability to quantify the transport properties of fractures is needed to in turn quantify the rate of heat transfer, and, accordingly, to optimize the engineering design of the operation. In this context, core-flooding experiments coupled with non-invasive imaging techniques (e.g., X-Ray Computed Tomography - X-Ray CT) represent a powerful tool for making direct observations of these properties under representative geologic conditions. This study focuses on quantifying porosity and fracture aperture distribution in a fractured westerly granite core by using two recently developed experimental protocols. The latter include the use of a highly attenuating gas [Vega et al., 2014] and the application of the so-called missing CT attenuation method [Huo et al., 2016] to produce multidimensional maps of the pore space and of the fractures. Prior to the imaging experiments, the westerly granite core (diameter: 5 cm, length: 10 cm) was thermally shocked to induce micro-fractured pore space; this was followed by the application of the so-called Brazilian method to induce a macroscopic fracture along the length of the core. The sample was then mounted in a high-pressure aluminum core-holder, exposed to a confining pressure and placed inside a medical CT scanner for imaging. An initial compressive pressure cycle was performed to remove weak asperities and reduce the hysteretic behavior of the fracture with respect to effective pressure. The CT scans were acquired at room temperature and 0.5, 5, 7, and 10 MPa effective pressure under loading and unloading conditions. During scanning the pore fluid pressure was undrained and constant, and the confining pressure was regulated at the desired pressure with a high precision pump. Highly transmissible krypton and helium gases were used as

  12. Single-Task and Dual-Task Gait Among Collegiate Athletes of Different Sport Classifications: Implications for Concussion Management.

    Science.gov (United States)

    Howell, David R; Oldham, Jessie R; DiFabio, Melissa; Vallabhajosula, Srikant; Hall, Eric E; Ketcham, Caroline J; Meehan, William P; Buckley, Thomas A

    2017-02-01

    Gait impairments have been documented following sport-related concussion. Whether preexisting gait pattern differences exist among athletes who participate in different sport classifications, however, remains unclear. Dual-task gait examinations probe the simultaneous performance of everyday tasks (ie, walking and thinking), and can quantify gait performance using inertial sensors. The purpose of this study was to compare the single-task and dual-task gait performance of collision/contact and noncontact athletes. A group of collegiate athletes (n = 265) were tested before their season at 3 institutions (mean age= 19.1 ± 1.1 years). All participants stood still (single-task standing) and walked while simultaneously completing a cognitive test (dual-task gait), and completed walking trials without the cognitive test (single-task gait). Spatial-temporal gait parameters were compared between collision/contact and noncontact athletes using MANCOVAs; cognitive task performance was compared using ANCOVAs. No significant single-task or dual-task gait differences were found between collision/contact and noncontact athletes. Noncontact athletes demonstrated higher cognitive task accuracy during single-task standing (P = .001) and dual-task gait conditions (P = .02) than collision/contact athletes. These data demonstrate the utility of a dual-task gait assessment outside of a laboratory and suggest that preinjury cognitive task performance during dual-tasks may differ between athletes of different sport classifications.

  13. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    Science.gov (United States)

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image-interpretive task and image-interpretive task workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording the frequency and duration of tasks performed. Tasks were categorized into separate image-interpretive and non-image-interpretive workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant responsible for non-image-interpretive tasks. Pre- and post-intervention data were compared. Following separation of image-interpretive and non-image-interpretive workflows, time spent on image-interpretive tasks by the primary fellow increased from 53.8% to 73.2%, while time spent on non-image-interpretive tasks decreased from 20.4% to 4.4%. The mean duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific non-image-interpretive tasks, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The consult assistant experienced 29.4 task-switching events per hour. Rates of specific non-image-interpretive tasks for the consult assistant were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into non-image-interpretive and image-interpretive workflows substantially increased image interpretation time and decreased task-switching events for the primary fellow. Consolidation of non-image-interpretive tasks into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Relating UMLS semantic types and task-based ontology to computer-interpretable clinical practice guidelines.

    Science.gov (United States)

    Kumar, Anand; Ciccarese, Paolo; Quaglini, Silvana; Stefanelli, Mario; Caffi, Ezio; Boiocchi, Lorenzo

    2003-01-01

    Medical knowledge in clinical practice guideline (GL) texts is the source of task-based computer-interpretable clinical guideline models (CIGMs). We have used Unified Medical Language System (UMLS) semantic types (STs) to understand the percentage of GL text which belongs to a particular ST. We also use the UMLS semantic network together with the CIGM-specific ontology to derive a semantic meaning behind the GL text. In order to achieve this objective, we took nine GL texts from the National Guideline Clearinghouse (NGC) and marked up the text dealing with a particular ST. The STs we took into consideration were restricted, taking into account the requirements of a task-based CIGM. We used DARPA Agent Markup Language and Ontology Inference Layer (DAML + OIL) to create the UMLS and CIGM-specific semantic network. For the latter, as a bench test, we used the 1999 WHO-International Society of Hypertension Guidelines for the Management of Hypertension. We took into consideration the UMLS STs closest to the clinical tasks. The percentage of the GL text dealing with the ST "Health Care Activity" and subtypes "Laboratory Procedure", "Diagnostic Procedure" and "Therapeutic or Preventive Procedure" was measured. The parts of text belonging to other STs or comments were separated. A mapping of terms belonging to other STs was done to the STs under "HCA" for representation in DAML + OIL. As a result, we found that the three STs under "HCA" were the predominant STs present in the GL text. In cases where the terms of related STs existed, they were mapped into one of the three STs. The DAML + OIL representation was able to describe the hierarchy in task-based CIGMs. To conclude, we understood that the three STs could be used to represent the semantic network of the task-based CIGMs. We identified some mapping operators which could be used for the mapping of other STs into these.

  15. The influences of task repetition, napping, time of day, and instruction on the Sustained Attention to Response Task

    NARCIS (Netherlands)

    Schie, M.K.M. van; Alblas, E.E.; Thijs, R.D.; Fronczek, R.; Lammers, G.J.; Dijk, J.G. van

    2014-01-01

    Introduction: The Sustained Attention to Response Task (SART) helps to quantify vigilance impairments. Previous studies, in which five SART sessions on one day were administered, demonstrated worse performance during the first session than during the others. The present study comprises two

  16. Analysis of the priority of anatomic structures according to the diagnostic task in cone-beam computed tomographic images

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jin Woo [Dept. of Oral and Maxillofacial Radiology, Dankook University College of Dentistry, Chunan (Korea, Republic of)

    2016-12-15

    This study was designed to evaluate differences in the required visibility of anatomic structures according to the diagnostic tasks of implant planning and periapical diagnosis. Images of a real skull phantom were acquired under 24 combinations of different exposure conditions in a cone-beam computed tomography scanner (60, 70, 80, 90, 100, and 110 kV and 4, 6, 8, and 10 mA). Five radiologists evaluated the visibility of anatomic structures and the image quality for diagnostic tasks using a 6-point scale. The visibility of the periodontal ligament space showed the closest association with the ability to use an image for periapical diagnosis in both jaws. The visibility of the sinus floor and canal wall showed the closest association with the ability to use an image for implant planning. Variations in tube voltage were associated with significant differences in image quality for all diagnostic tasks. However, tube current did not show significant associations with the ability to use an image for implant planning. The required visibility of anatomic structures varied depending on the diagnostic task. Tube voltage was a more important exposure parameter for image quality than tube current. Different settings should be used for optimization and image quality evaluation depending on the diagnostic task.

  17. Impact of task design on task performance and injury risk: case study of a simulated drilling task.

    Science.gov (United States)

    Alabdulkarim, Saad; Nussbaum, Maury A; Rashedi, Ehsan; Kim, Sunwook; Agnew, Michael; Gardner, Richard

    2017-06-01

    Existing evidence is limited regarding the influence of task design on performance and ergonomic risk, or the association between these two outcomes. In a controlled experiment, we constructed a mock fuselage to simulate a drilling task common in aircraft manufacturing, and examined the effect of three levels of workstation adjustability on performance as measured by productivity (e.g. fuselage completion time) and quality (e.g. fuselage defective holes), and ergonomic risk as quantified using two common methods (rapid upper limb assessment and the strain index). The primary finding was that both productivity and quality significantly improved with increased adjustability, yet this occurred only when that adjustability succeeded in reducing ergonomic risk. Supporting the inverse association between ergonomic risk and performance, the condition with highest adjustability created the lowest ergonomic risk and the best performance while there was not a substantial difference in ergonomic risk between the other two conditions, in which performance was also comparable. Practitioner Summary: Findings of this study supported a causal relationship between task design and both ergonomic risk and performance, and that ergonomic risk and performance are inversely associated. While future work is needed under more realistic conditions and a broader population, these results may be useful for task (re)design and to help cost-justify some ergonomic interventions.

  18. Crowdsourcing for quantifying transcripts: An exploratory study.

    Science.gov (United States)

    Azzam, Tarek; Harman, Elena

    2016-02-01

    This exploratory study attempts to demonstrate the potential utility of crowdsourcing as a supplemental technique for quantifying transcribed interviews. Crowdsourcing is the harnessing of the abilities of many people to complete a specific task or a set of tasks. In this study multiple samples of crowdsourced individuals were asked to rate and select supporting quotes from two different transcripts. The findings indicate that the different crowdsourced samples produced nearly identical ratings of the transcripts, and were able to consistently select the same supporting text from the transcripts. These findings suggest that crowdsourcing, with further development, can potentially be used as a mixed method tool to offer a supplemental perspective on transcribed interviews. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Positron computed tomography studies of cerebral metabolic responses to complex motor tasks

    International Nuclear Information System (INIS)

    Phelps, M.E.; Mazziotta, J.C.

    1984-01-01

    Human motor system organization was explored in 8 right-handed male subjects using ¹⁸F-fluorodeoxyglucose and positron computed tomography to measure cerebral glucose metabolism. Five subjects had triple studies (eyes closed) including: control (hold pen in right hand without moving), normal size writing (subject repeatedly writes name) and large (10-15 X normal) name writing. In these studies normal and large size writing had a similar distribution of metabolic responses when compared to control studies. Activations (percent change from control) were in the range of 12-20% and occurred in the striatum bilaterally > contralateral Rolandic cortex > contralateral thalamus. No significant activations were observed in the ipsilateral thalamus, Rolandic cortex or cerebellum (supplementary motor cortex was not examined). The magnitude of the metabolic response in the striatum was greater with the large versus normal sized writing. This differential response may be due to an increased number and topographic distribution of neurons responding with the same average activity between tasks or an increase in the functional activity of the same neuronal population between the two tasks (present spatial resolution inadequate to differentiate). When subjects (N=3) performed novel sequential finger movements, the maximal metabolic response was in the contralateral Rolandic cortex > striatum. Such studies provide a means of exploring human motor system organization, motor learning and provide a basis for examining patients with motor system disorders

  20. Quantifying Distributional Model Risk via Optimal Transport

    OpenAIRE

    Blanchet, Jose; Murthy, Karthyek R. A.

    2016-01-01

    This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality, in particular, we provide examples involving path dependent expectations of stochastic processes. Our approach consists in computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms ...
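
    One simple instance of such a bound, for a Lipschitz functional and a 1-Wasserstein neighbourhood of the baseline model, follows from Kantorovich-Rubinstein duality: E_Q[f] <= E_P[f] + L*delta for every Q within distance delta of P. The sketch below covers only this special case, not the paper's far more general methodology; the loss and tolerance are illustrative assumptions.

```python
import numpy as np

# Baseline model P: a standard normal (illustrative choice).
rng = np.random.default_rng(0)
baseline_samples = rng.normal(loc=0.0, scale=1.0, size=100_000)

def loss(x):
    return np.abs(x)          # 1-Lipschitz loss (L = 1)

lipschitz_constant = 1.0
delta = 0.25                  # tolerated model misspecification (Wasserstein radius)

# Worst-case expectation over all Q with W1(P, Q) <= delta,
# bounded via Kantorovich-Rubinstein duality.
baseline_expectation = loss(baseline_samples).mean()
worst_case_bound = baseline_expectation + lipschitz_constant * delta
print(f"E_P[f] ~ {baseline_expectation:.3f}, worst-case bound ~ {worst_case_bound:.3f}")
```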

  1. Optimal design method for a digital human–computer interface based on human reliability in a nuclear power plant. Part 3: Optimization method for interface task layout

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Wang, Yiqun; Zhang, Li; Xie, Tian; Li, Min; Peng, Yuyuan; Wu, Daqing; Li, Peiyao; Ma, Congmin; Shen, Mengxu; Wu, Xing; Weng, Mengyun; Wang, Shiwei; Xie, Cen

    2016-01-01

    Highlights: • The authors present an optimization algorithm for interface task layout. • The performing process of the proposed algorithm is depicted. • The performance evaluation method adopts a neural network. • The optimized layouts of an event's interface tasks were obtained by experiments. - Abstract: This is the last in a series of papers describing the optimal design of a digital human–computer interface of a nuclear power plant (NPP) from three different perspectives based on human reliability. The purpose of this series is to propose different optimization methods from varying perspectives to decrease human factor events that arise from defects of a human–computer interface. The present paper addresses how to effectively lay out interface tasks across different screens. The aim is to decrease human errors by reducing the distance that an operator moves among different screens in each operation. To solve this problem, the authors propose an optimization process for the interface task layout of a digital human–computer interface of an NPP. To automatically lay out each interface task onto one of the screens in each operation, the paper presents a shortest-moving-path optimization algorithm with a dynamic flag based on human reliability. To test the algorithm's performance, the evaluation method uses a neural network based on human reliability. The lower the human error probabilities are, the better the interface task layouts among different screens are. Thus, by analyzing the performance of each interface task layout, the optimization result is obtained. Finally, the optimized layouts of the spurious safety injection event interface tasks of the NPP are obtained by experiment; the proposed method shows good accuracy and stability.
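
    The layout objective can be illustrated with a much-simplified sketch: score a task-to-screen layout by the number of screen switches an operator makes along each operation's task sequence, and place tasks greedily. The task names, capacities and greedy rule below are illustrative assumptions, not the paper's shortest-moving-path algorithm or its neural-network evaluation.

```python
def screen_switches(operations, layout):
    """Count how often the operator must move to a different screen when
    executing each operation's task sequence under a task->screen layout."""
    switches = 0
    for sequence in operations:
        for prev, curr in zip(sequence, sequence[1:]):
            if layout[prev] != layout[curr]:
                switches += 1
    return switches

def greedy_layout(operations, tasks, n_screens, capacity):
    """Greedy illustration: place each task on the screen that currently adds
    the fewest switches, subject to a per-screen capacity (assumes enough
    total capacity for all tasks)."""
    layout, load = {}, [0] * n_screens
    for task in tasks:
        best_screen, best_cost = None, None
        for s in range(n_screens):
            if load[s] >= capacity:
                continue
            trial = {**layout, task: s}
            # Only score the parts of each sequence placed so far.
            partial = [[t for t in seq if t in trial] for seq in operations]
            cost = screen_switches(partial, trial)
            if best_cost is None or cost < best_cost:
                best_screen, best_cost = s, cost
        layout[task] = best_screen
        load[best_screen] += 1
    return layout

# Hypothetical operation sequences of interface tasks.
operations = [["check_pressure", "start_pump", "check_flow"],
              ["check_flow", "open_valve", "check_pressure"]]
tasks = ["check_pressure", "start_pump", "check_flow", "open_valve"]
layout = greedy_layout(operations, tasks, n_screens=2, capacity=2)
print(layout, screen_switches(operations, layout))
```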

  2. A computer simulation approach to quantify the true area and true area compressibility modulus of biological membranes

    International Nuclear Information System (INIS)

    Chacón, Enrique; Tarazona, Pedro; Bresme, Fernando

    2015-01-01

    We present a new computational approach to quantify the area per lipid and the area compressibility modulus of biological membranes. Our method relies on the analysis of the membrane fluctuations using our recently introduced coupled undulatory (CU) mode [Tarazona et al., J. Chem. Phys. 139, 094902 (2013)], which provides excellent estimates of the bending modulus of model membranes. Unlike the projected area, widely used in computer simulations of membranes, the CU area is thermodynamically consistent. This new area definition makes it possible to accurately estimate the area of the undulating bilayer, and the area per lipid, by excluding any contributions related to phospholipid protrusions. We find that the area per phospholipid and the area compressibility modulus feature a negligible dependence on system size, making it possible to compute them using truly small bilayers involving a few hundred lipids. The area compressibility modulus obtained from the analysis of the CU area fluctuations is fully consistent with the Hooke's law route. Unlike existing methods, our approach relies on a single simulation, and no a priori knowledge of the bending modulus is required. We illustrate our method by analyzing 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine bilayers using the coarse-grained MARTINI force-field. The area per lipid and area compressibility modulus obtained with our method and the MARTINI force-field are consistent with previous studies of these bilayers.

  3. Planning and task management in Parkinson's disease: differential emphasis in dual-task performance.

    Science.gov (United States)

    Bialystok, Ellen; Craik, Fergus I M; Stefurak, Taresa

    2008-03-01

    Seventeen patients diagnosed with Parkinson's disease completed a complex computer-based task that involved planning and management while also performing an attention-demanding secondary task. The tasks were performed concurrently, but it was necessary to switch from one to the other. Performance was compared to a group of healthy age-matched control participants and a group of young participants. Parkinson's patients performed better than the age-matched controls on almost all measures and as well as the young controls in many cases. However, the Parkinson's patients achieved this by paying relatively less attention to the secondary task and focusing attention more on the primary task. Thus, Parkinson's patients can apparently improve their performance on some aspects of a multidimensional task by simplifying task demands. This benefit may occur as a consequence of their inflexible exaggerated attention to some aspects of a complex task to the relative neglect of other aspects.

  4. A Medical Research and Evaluation Facility (MREF) and Studies Supporting the Medical Chemical Defense Program: Task 95-39: Methods Development and Validation of Two Mouse Bioassays for Use in Quantifying Botulinum Toxins (A, B, C, D and E) and Toxin Antibody Titers

    National Research Council Canada - National Science Library

    Olson, Carl

    1997-01-01

    This task was conducted for the U.S. Army Medical Materiel Development Activity (USAMMDA) to validate two mouse bioassays for quantifying botulinum toxin potency and neutralizing antibodies to botulinum toxins...

  5. Psychometric properties of startle and corrugator response in NPU, affective picture viewing, and resting state tasks.

    Science.gov (United States)

    Kaye, Jesse T; Bradford, Daniel E; Curtin, John J

    2016-08-01

    The current study provides a comprehensive evaluation of critical psychometric properties of commonly used psychophysiology laboratory tasks/measures within the NIMH RDoC. Participants (N = 128) completed the no-shock, predictable shock, unpredictable shock (NPU) task, affective picture viewing task, and resting state task at two study visits separated by 1 week. We examined potentiation/modulation scores in NPU (predictable or unpredictable shock vs. no-shock) and affective picture viewing tasks (pleasant or unpleasant vs. neutral pictures) for startle and corrugator responses with two commonly used quantification methods. We quantified startle potentiation/modulation scores with raw and standardized responses. We quantified corrugator potentiation/modulation in the time and frequency domains. We quantified general startle reactivity in the resting state task as the mean raw startle response during the task. For these three tasks, two measures, and two quantification methods, we evaluated effect size robustness and stability, internal consistency (i.e., split-half reliability), and 1-week temporal stability. The psychometric properties of startle potentiation in the NPU task were good, but concerns were noted for corrugator potentiation in this task. Some concerns also were noted for the psychometric properties of both startle and corrugator modulation in the affective picture viewing task, in particular, for pleasant picture modulation. Psychometric properties of general startle reactivity in the resting state task were good. Some salient differences in the psychometric properties of the NPU and affective picture viewing tasks were observed within and across quantification methods. © 2016 The Authors. Psychophysiology published by Wiley Periodicals, Inc. on behalf of Society for Psychophysiological Research.

  6. IMPORTANCE OF COMPUTER TECHNOLOGY IN REALIZATION OF CULTURAL AND EDUCATIONAL TASKS OF PRESCHOOL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Zvezdan Arsić

    2016-06-01

    Full Text Available The rapid scientific and technological development imposes numerous changes in all spheres of life and work. In such circumstances, a computer has become a part of all aspects of life: economy, education, free time, family. Since children in contemporary society increasingly acquire knowledge before the school age, the question is how to prepare them for the world in which we live, bearing in mind how significantly different it is from the world in which the previous generations grew up. The research was aimed at examining the attitudes of preschool teachers about the importance of computers in the realization of educational activities in preschool institutions. The study included 54 teachers from Kosovo and Metohija: Kosovska Mitrovica, Donja Gušterica and Ropotovo. The research results indicate that digital technology is a very important and a useful didactic tool in the realization of educational activities in preschool institutions and that preschool teachers have the required competence to implement the technology. However, they are not satisfied with the quality of their ICT education and training during their studies; they also feel that their institutions do not provide adequate working conditions for the use of computers in the realization of educational tasks.

  7. Risk assessments using the Strain Index and the TLV for HAL, Part I: Task and multi-task job exposure classifications.

    Science.gov (United States)

    Kapellusch, Jay M; Bao, Stephen S; Silverstein, Barbara A; Merryweather, Andrew S; Thiese, Mathew S; Hegmann, Kurt T; Garg, Arun

    2017-12-01

    The Strain Index (SI) and the American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Value for Hand Activity Level (TLV for HAL) use different constituent variables to quantify task physical exposures. Similarly, time-weighted-average (TWA), Peak, and Typical exposure techniques to quantify physical exposure from multi-task jobs make different assumptions about each task's contribution to the whole job exposure. Thus, task and job physical exposure classifications differ depending upon which model and technique are used for quantification. This study examines exposure classification agreement, disagreement, correlation, and magnitude of classification differences between these models and techniques. Data from 710 multi-task job workers performing 3,647 tasks were analyzed using the SI and TLV for HAL models, as well as with the TWA, Typical and Peak job exposure techniques. Physical exposures were classified as low, medium, and high using each model's recommended, or a priori limits. Exposure classification agreement and disagreement between models (SI, TLV for HAL) and between job exposure techniques (TWA, Typical, Peak) were described and analyzed. Regardless of technique, the SI classified more tasks as high exposure than the TLV for HAL, and the TLV for HAL classified more tasks as low exposure. The models agreed on 48.5% of task classifications (kappa = 0.28) with 15.5% of disagreement between low and high exposure categories. Between-technique (i.e., TWA, Typical, Peak) agreement ranged from 61-93% (kappa: 0.16-0.92) depending on whether the SI or TLV for HAL was used. There was disagreement between the SI and TLV for HAL and between the TWA, Typical and Peak techniques. Disagreement creates uncertainty for job design, job analysis, risk assessments, and developing interventions. Task exposure classifications from the SI and TLV for HAL might complement each other. However, TWA, Typical, and Peak job exposure techniques all have
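
    The between-model agreement reported above is the kind of statistic Cohen's kappa captures: observed agreement corrected for the agreement expected by chance. A minimal sketch with hypothetical exposure classifications (not the study's data):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b, categories=("low", "medium", "high")):
    """Agreement between two exposure classification methods beyond chance."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b[c] for c in categories) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical task classifications from the two models (illustrative only).
si_ratings  = ["high", "high", "medium", "low", "high", "medium", "low", "high"]
tlv_ratings = ["medium", "high", "low", "low", "high", "medium", "low", "medium"]
print(f"kappa = {cohens_kappa(si_ratings, tlv_ratings):.2f}")
```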

  8. Cross-linguistic patterns in the acquisition of quantifiers

    Science.gov (United States)

    Cummins, Chris; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K.; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-01-01

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier’s specific meaning. We investigate competence with the expressions for “all,” “none,” “some,” “some…not,” and “most” in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation. PMID:27482119

  9. The Influence of Parkinson’s Disease Motor Symptom Asymmetry on Hand Performance: An Examination of the Grooved Pegboard Task

    Directory of Open Access Journals (Sweden)

    Sara M. Scharoun

    2015-01-01

    Full Text Available This study examined the influence of motor symptom asymmetry in Parkinson's disease (PD) on Grooved Pegboard (GP) performance in right-handed participants. The Unified Parkinson's Disease Rating Scale was used to assess motor symptoms and separate participants with PD into two groups (right-arm affected, left-arm affected) for comparison with a group of healthy older adults. Participants completed the place and replace GP tasks two times with both hands. Laterality quotients were computed to quantify performance differences between the two hands. Comparisons among the three groups indicated that when the nonpreferred hand is affected by PD motor symptoms, superior preferred hand performance (as seen in healthy older adults) is further exaggerated in tasks that require precision (i.e., the place task). Regardless of the task, when the preferred hand is affected, there is an evident shift to superior left-hand performance, which may inevitably manifest as a switch in hand preference. Results add to the discussion of the relationship between handedness and motor symptom asymmetry in PD.
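
    The abstract does not give the exact laterality quotient formula; one common convention normalises the between-hand difference by the combined performance. A hedged sketch using completion times, with illustrative values:

```python
def laterality_quotient(right_time, left_time):
    """Laterality quotient from Grooved Pegboard completion times (seconds).
    One common convention: positive values indicate faster (better) right-hand
    performance, negative values faster left-hand performance. The exact
    formula used in the study may differ; this is illustrative."""
    return 100.0 * (left_time - right_time) / (left_time + right_time)

# Hypothetical place-task completion times (seconds) for two participants.
print(laterality_quotient(right_time=62.0, left_time=75.0))   # right-hand advantage
print(laterality_quotient(right_time=80.0, left_time=70.0))   # left-hand advantage
```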

  10. A comparative study of 2 computer-assisted methods of quantifying brightfield microscopy images.

    Science.gov (United States)

    Tse, George H; Marson, Lorna P

    2013-10-01

    Immunohistochemistry continues to be a powerful tool for the detection of antigens. There are several commercially available software packages that allow image analysis; however, these can be complex, require a relatively high level of computer skills, and can be expensive. We compared 2 commonly available software packages, Adobe Photoshop CS6 and ImageJ, in their ability to quantify percentage positive area after picrosirius red (PSR) staining and 3,3'-diaminobenzidine (DAB) staining. On analysis of DAB-stained B cells in the mouse spleen, with a biotinylated primary rat anti-mouse-B220 antibody, there was no significant difference between converting images from brightfield microscopy to binary images to measure black and white pixels using ImageJ and measuring a range of brown pixels with Photoshop (Student t test, P=0.243, correlation r=0.985). When analyzing mouse kidney allografts stained with PSR, Photoshop achieved a greater interquartile range while maintaining a lower 10th percentile value compared with analysis with ImageJ. A lower 10th percentile reflects that Photoshop analysis is better at analyzing tissues with low levels of positive pixels; particularly relevant for control tissues or negative controls, whereas after ImageJ analysis the same images would result in spuriously high levels of positivity. Furthermore, comparing the 2 methods by Bland-Altman plot revealed that these 2 methodologies did not agree when measuring images with a higher percentage of positive staining, and correlation was poor (r=0.804). We conclude that for computer-assisted analysis of images of DAB-stained tissue there is no difference between using Photoshop or ImageJ. However, for analysis of color images where differentiation into a binary pattern is not easy, such as with PSR, Photoshop is superior at identifying higher levels of positivity while maintaining differentiation of low levels of positive staining.
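
    The two analysis strategies compared above, binarising a grayscale image versus selecting a colour range, can be sketched with plain array operations. The threshold, colour bounds and synthetic images below are illustrative assumptions, not the settings used in the study:

```python
import numpy as np

def percent_positive_binary(gray, threshold=128):
    """ImageJ-style approach: binarise a grayscale image and report the
    percentage of 'positive' (below-threshold, i.e. darker-stained) pixels."""
    binary = gray < threshold
    return 100.0 * binary.sum() / binary.size

def percent_positive_color_range(rgb, low, high):
    """Photoshop-style approach: count pixels whose RGB values fall inside a
    selected colour range (e.g. a band of brown hues for DAB staining)."""
    low, high = np.asarray(low), np.asarray(high)
    mask = np.all((rgb >= low) & (rgb <= high), axis=-1)
    return 100.0 * mask.sum() / mask.size

# Synthetic 4x4 example images (values are illustrative, not real micrographs).
gray = np.array([[200, 90, 210, 80], [220, 70, 200, 95],
                 [210, 85, 230, 60], [205, 75, 215, 90]], dtype=np.uint8)
rgb = np.zeros((4, 4, 3), dtype=np.uint8)
rgb[..., :] = (230, 230, 230)            # background
rgb[1:3, 1:3] = (150, 100, 60)           # a 'brown' stained patch
print(percent_positive_binary(gray))
print(percent_positive_color_range(rgb, low=(120, 70, 30), high=(180, 130, 90)))
```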

  11. The Effect of Motor Difficulty on the Acquisition of a Computer Task: A Comparison between Young and Older Adults

    Science.gov (United States)

    Fezzani, K.; Albinet, C.; Thon, B.; Marquie, J. -C.

    2010-01-01

    The present study investigated the extent to which the impact of motor difficulty on the acquisition of a computer task varies as a function of age. Fourteen young and 14 older participants performed 352 sequences of 10 serial pointing movements with a wireless pen on a digitiser tablet. A conditional probabilistic structure governed the…

  12. PUMA Internet Task Logging Using the IDAC-1

    Directory of Open Access Journals (Sweden)

    K. N. Tarchanidis

    2014-08-01

    Full Text Available This project uses an IDAC-1 board to sample the joint angle positions of the PUMA 761 robot and log the results on a computer. The robot is at the task location and the logging computer is located at a different one. The task the robot is performing is based on a Pseudo Stereo Vision System (PSVS). The Internet is the transport medium, and the protocol used in this project is UDP/IP. The actual angle is taken straight from the PUMA controller. High-resolution potentiometers are connected to each robot joint and are buffered and sampled as a potential difference on an A/D converter integrated on the IDAC-1. The logging computer, acting as a client over the Internet, asks for the angle set; the IDAC-1 responds as a server with the 10-bit resolution sample of each joint position. The whole task is logged in a file on the logging computer. This application gives the Internet user the ability to monitor and log the robot's tasks anywhere on the World Wide Web (www).
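
    A client-side logger for such a setup might look like the sketch below. The IDAC-1 request/response format, address and field layout are not given in the abstract, so everything protocol-specific here is a hypothetical illustration of UDP polling and logging:

```python
import socket
import time

# Hypothetical address and message format (the real IDAC-1 protocol is not
# specified in the abstract): a documentation-range IP and a made-up port.
IDAC_ADDRESS = ("192.0.2.10", 5000)
LOG_FILE = "puma_joint_log.csv"

def request_joint_angles(sock):
    """Send a request and parse a comma-separated set of 10-bit joint samples."""
    sock.sendto(b"GET_ANGLES", IDAC_ADDRESS)
    data, _ = sock.recvfrom(1024)
    return [int(v) for v in data.decode().split(",")]

def log_angles(samples=100, interval=0.1):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(1.0)
        with open(LOG_FILE, "w") as log:
            log.write("timestamp,j1,j2,j3,j4,j5,j6\n")
            for _ in range(samples):
                try:
                    angles = request_joint_angles(sock)
                    log.write(f"{time.time():.3f}," + ",".join(map(str, angles)) + "\n")
                except socket.timeout:
                    continue   # skip missed responses from the server
                time.sleep(interval)

if __name__ == "__main__":
    log_angles()
```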

  13. Simultaneous Budget and Buffer Size Computation for Throughput-Constrained Task Graphs

    NARCIS (Netherlands)

    Wiggers, M.H.; Bekooij, Marco Jan Gerrit; Geilen, Marc C.W.; Basten, Twan

    Modern embedded multimedia systems process multiple concurrent streams of data processing jobs. Streams often have throughput requirements. These jobs are implemented on a multiprocessor system as a task graph. Tasks communicate data over buffers, where tasks wait on sufficient space in output

  14. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task.

  15. Mobile Thread Task Manager

    Science.gov (United States)

    Clement, Bradley J.; Estlin, Tara A.; Bornstein, Benjamin J.

    2013-01-01

    The Mobile Thread Task Manager (MTTM) is being applied to parallelizing existing flight software to understand the benefits and to develop new techniques and architectural concepts for adapting software to multicore architectures. It allocates and load-balances tasks for a group of threads that migrate across processors to improve cache performance. In order to balance load across threads, the MTTM augments a basic map-reduce strategy to draw jobs from a global queue. In a multicore processor, memory may be "homed" to the cache of a specific processor and must be accessed from that processor. The MTTM architecture wraps access to data with thread management to move threads to the home processor for that data, so that the computation follows the data in an attempt to avoid L2 cache misses. Cache homing is also handled by a memory manager that translates identifiers to the processor IDs where the data will be homed (according to rules defined by the user). The user can also specify the number of threads and processors separately, which is important for tuning performance for different patterns of computation and memory access. MTTM efficiently processes tasks in parallel on a multiprocessor computer. It also provides an interface to make it easier to adapt existing software to a multiprocessor environment.
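
    The map-reduce-style draw from a global queue can be illustrated generically with a thread pool. The sketch below shows only the global-queue pattern; it does not model the MTTM's thread migration or cache homing, which depend on the target multicore hardware:

```python
import queue
import threading

# A generic worker pool drawing jobs from a single global queue.
task_queue = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        job = task_queue.get()
        if job is None:           # sentinel: no more work
            task_queue.task_done()
            break
        value = job * job         # stand-in for a real computation
        with results_lock:
            results.append(value)
        task_queue.task_done()

def run(jobs, n_threads=4):
    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for job in jobs:
        task_queue.put(job)
    for _ in threads:
        task_queue.put(None)      # one sentinel per worker
    task_queue.join()
    for t in threads:
        t.join()
    return results

print(sum(run(range(100))))
```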

  16. Quantifying Pollutant Emissions from Office Equipment Phase IReport

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.; McKone, T.E.; Perino, C.

    2006-12-01

    Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment

  17. Kokkos' Task DAG Capabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Harold C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ibanez, Daniel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report documents the ASC/ATDM Kokkos deliverable "Production Portable Dynamic Task DAG Capability." This capability enables applications to create and execute a dynamic task DAG: a collection of heterogeneous computational tasks with a directed acyclic graph (DAG) of "execute after" dependencies, where tasks and their dependencies are dynamically created and destroyed as tasks execute. The Kokkos task scheduler executes the dynamic task DAG on the target execution resource, e.g. a multicore CPU, a manycore CPU such as Intel's Knights Landing (KNL), or an NVIDIA GPU. Several major technical challenges had to be addressed during development of Kokkos' task DAG capability: (1) portability to a GPU with its simplified hardware and micro-runtime, (2) thread-scalable memory allocation and deallocation from a bounded pool of memory, (3) a thread-scalable scheduler for the dynamic task DAG, and (4) usability by applications.
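
    The "execute after" dependency structure can be illustrated conceptually with a topological scheduler. This is a sequential Python illustration of a task DAG, not the Kokkos C++ API, which dispatches ready tasks in parallel on the device; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# "Execute after" dependencies: each task lists the tasks it must wait for.
dependencies = {
    "load_mesh": set(),
    "assemble_matrix": {"load_mesh"},
    "assemble_rhs": {"load_mesh"},
    "solve": {"assemble_matrix", "assemble_rhs"},
    "write_output": {"solve"},
}

sorter = TopologicalSorter(dependencies)
sorter.prepare()
while sorter.is_active():
    ready = sorter.get_ready()          # tasks whose dependencies are satisfied
    for task in ready:                  # a parallel runtime would dispatch these
        print("executing", task)        # concurrently to worker threads
        sorter.done(task)
```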

  18. AUTOMATION PROGRAM FOR RECOGNITION OF ALGORITHM SOLUTION OF MATHEMATIC TASK

    Directory of Open Access Journals (Sweden)

    Denis N. Butorin

    2014-01-01

    Full Text Available The article describes a technology for managing testing tasks in a computer program, developed for recognising the algorithm used to solve a mathematical task. The use of a hierarchical structure for a special set of testing questions is justified. The implementation of the described tasks in the computer program openSEE is also presented.

  19. AUTOMATION PROGRAM FOR RECOGNITION OF ALGORITHM SOLUTION OF MATHEMATIC TASK

    OpenAIRE

    Denis N. Butorin

    2014-01-01

    The article describes a technology for managing testing tasks in a computer program, developed for recognising the algorithm used to solve a mathematical task. The use of a hierarchical structure for a special set of testing questions is justified. The implementation of the described tasks in the computer program openSEE is also presented.

  20. A Heuristic Task Scheduling Algorithm for Heterogeneous Virtual Clusters

    OpenAIRE

    Weiwei Lin; Wentai Wu; James Z. Wang

    2016-01-01

    Cloud computing provides on-demand computing and storage services with high performance and high scalability. However, the rising energy consumption of cloud data centers has become a prominent problem. In this paper, we first introduce an energy-aware framework for task scheduling in virtual clusters. The framework consists of a task resource requirements prediction module, an energy estimate module, and a scheduler with a task buffer. Secondly, based on this framework, we propose a virtual ...

  1. Performance analysis of cloud computing services for many-tasks scientific computing

    NARCIS (Netherlands)

    Iosup, A.; Ostermann, S.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.

    2011-01-01

    Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike. Through the use of virtualization and resource time sharing, clouds serve with a single set of physical resources a

  2. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2014-01-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of the dependence on the final human error probability (HEP). This paper presents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP) method. The proposed method is able to quantify the subjective judgment from the experts and improve the transparency in the HEP evaluation process. Two examples are illustrated to show the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
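
    The THERP conditional-probability formulas referred to at the end of the abstract are standard in the HRA literature; a minimal sketch of that final computation step (assuming the usual five THERP dependence levels) might look as follows.

```python
def conditional_hep(hep, dependence):
    """Conditional human error probability per the standard THERP
    dependence equations. `dependence` is one of
    'zero', 'low', 'moderate', 'high', 'complete'."""
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[dependence](hep)

# e.g. a nominal HEP of 0.003 rises to roughly 0.053 under low dependence
print(conditional_hep(0.003, "low"))
```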

  3. Task-Oriented Training with Computer Games for People with Rheumatoid Arthritis or Hand Osteoarthritis: A Feasibility Randomized Controlled Trial.

    Science.gov (United States)

    Srikesavan, Cynthia Swarnalatha; Shay, Barbara; Szturm, Tony

    2016-09-13

    To examine the feasibility of a clinical trial on a novel, home-based task-oriented training with conventional hand exercises in people with rheumatoid arthritis or hand osteoarthritis. To explore the experiences of participants who completed their respective home exercise programmes. Thirty volunteer participants aged between 30 and 60 years and diagnosed with rheumatoid arthritis or hand osteoarthritis were proposed for a single-center, assessor-blinded, randomized controlled trial ( ClinicalTrials.gov : NCT01635582). Participants received task-oriented training with interactive computer games and objects of daily life or finger mobility and strengthening exercises. Both programmes were home based and were done four sessions per week with 20 minutes each session for 6 weeks. Major feasibility outcomes were number of volunteers screened, randomized, and retained; completion of blinded assessments, exercise training, and home exercise sessions; equipment and data management; and clinical outcomes of hand function. Reaching the recruitment target in 18 months and achieving exercise compliance >80% were set as success criteria. Concurrent with the trial, focus group interviews explored experiences of those participants who completed their respective programmes. After trial initiation, revisions in inclusion criteria were required to promote recruitment. A total of 17 participants were randomized and 15 were retained. Completion of assessments, exercise training, and home exercise sessions; equipment and data collection and management demonstrated excellent feasibility. Both groups improved in hand function outcomes and exercise compliance was above 85%. Participants perceived both programmes as appropriate and acceptable. Participants who completed task-oriented training also agreed that playing different computer games was enjoyable, engaging, and motivating. Findings demonstrate initial evidence on recruitment, feasibility of trial procedures, and acceptability of

  4. Motivation and engagement in computer-based learning tasks: investigating key contributing factors

    Directory of Open Access Journals (Sweden)

    Michela Ott, Mauro Tavella

    2010-04-01

    Full Text Available This paper, drawing on a research project concerning the educational use of digital mind games with primary school students, aims at giving a contribution to the understanding of which are the main factors influencing student motivation during computer-based learning activities. It puts forward some ideas and experience-based reflections, starting by considering digital games that are widely recognized as the most promising ICT tools to enhance student motivation. The project results suggest that student genuine engagement in learning activities is mainly related to the actual possession of the skills and of the cognitive capacities needed to perform the task. In this perspective, cognitive overload should be regarded as one of the main reasons contributing to hinder student motivation and, consequently, should be avoided. Other elements such as game attractiveness and experimental setting constraints were found to have a lower effect on student motivation.

  5. The ATB Framework : quantifying and Classifying Epistemic Strategies in Tangible Problem-Solving Tasks

    NARCIS (Netherlands)

    Esteves, A.E.; Bakker, S.; Antle, A.N. (Alissa); May, A.; Warren, J.; Oakley, I.

    2015-01-01

    In task performance, pragmatic actions refer to behaviors that make direct progress, while epistemic actions involve altering the world so that cognitive processes are faster, more reliable or less taxing. Epistemic actions are frequently presented as a beneficial consequence of interacting with

  6. Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research

    Science.gov (United States)

    Arnegard, Ruth J.; Comstock, J. R., Jr.

    1991-01-01

    The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and the freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.

  7. Can smartphones be used to bring computer-based tasks from the lab to the field? A mobile experience-sampling method study about the pace of life.

    Science.gov (United States)

    Stieger, Stefan; Lewetz, David; Reips, Ulf-Dietrich

    2017-12-06

    Researchers are increasingly using smartphones to collect scientific data. To date, most smartphone studies have collected questionnaire data or data from the built-in sensors. So far, few studies have analyzed whether smartphones can also be used to conduct computer-based tasks (CBTs). Using a mobile experience-sampling method study and a computer-based tapping task as examples (N = 246; twice a day for three weeks, 6,000+ measurements), we analyzed how well smartphones can be used to conduct a CBT. We assessed methodological aspects such as potential technologically induced problems, dropout, task noncompliance, and the accuracy of millisecond measurements. Overall, we found few problems: Dropout rate was low, and the time measurements were very accurate. Nevertheless, particularly at the beginning of the study, some participants did not comply with the task instructions, probably because they did not read the instructions before beginning the task. To summarize, the results suggest that smartphones can be used to transfer CBTs from the lab to the field, and that real-world variations across device manufacturers, OS types, and CPU load conditions did not substantially distort the results.

  8. Brain-computer interfaces increase whole-brain signal to noise.

    Science.gov (United States)

    Papageorgiou, T Dorina; Lisinski, Jonathan M; McHenry, Monica A; White, Jason P; LaConte, Stephen M

    2013-08-13

    Brain-computer interfaces (BCIs) can convert mental states into signals to drive real-world devices, but it is not known if a given covert task is the same when performed with and without BCI-based control. Using a BCI likely involves additional cognitive processes, such as multitasking, attention, and conflict monitoring. In addition, it is challenging to measure the quality of covert task performance. We used whole-brain classifier-based real-time functional MRI to address these issues, because the method provides both classifier-based maps to examine the neural requirements of BCI and classification accuracy to quantify the quality of task performance. Subjects performed a covert counting task at fast and slow rates to control a visual interface. Compared with the same task when viewing but not controlling the interface, we observed that being in control of a BCI improved task classification of fast and slow counting states. Additional BCI control increased subjects' whole-brain signal-to-noise ratio compared with the absence of control. The neural pattern for control consisted of a positive network comprised of dorsal parietal and frontal regions and the anterior insula of the right hemisphere as well as an expansive negative network of regions. These findings suggest that real-time functional MRI can serve as a platform for exploring information processing and frontoparietal and insula network-based regulation of whole-brain task signal-to-noise ratio.

  9. Convolutional neural networks and face recognition task

    Science.gov (United States)

    Sochenkova, A.; Sochenkov, I.; Makovetskii, A.; Vokhmintsev, A.; Melnikov, A.

    2017-09-01

    Computer vision tasks have remained very important over the last couple of years. One of the most complicated problems in computer vision is face recognition, which could be used in security systems to provide safety and to identify a person among others. There is a variety of different approaches to solve this task, but there is still no universal solution that gives adequate results in all cases. The current paper presents the following approach. First, we extract the area containing the face; then we use a Canny edge detector. At the next stage we use convolutional neural networks (CNNs) to finally solve the face recognition and person identification task.

  10. The Urban Forest Effects (UFORE) model: quantifying urban forest structure and functions

    Science.gov (United States)

    David J. Nowak; Daniel E. Crane

    2000-01-01

    The Urban Forest Effects (UFORE) computer model was developed to help managers and researchers quantify urban forest structure and functions. The model quantifies species composition and diversity, diameter distribution, tree density and health, leaf area, leaf biomass, and other structural characteristics; hourly volatile organic compound emissions (emissions that...

  11. Development of a task difficulty measure for emergency operating procedures using entropy concepts

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Kim, Jaewhan; Ha, Jaejoo

    2000-01-01

    In this paper, a method to quantify the degree of step complexity (SC) of an EOP step is developed based on entropy measures that have been used to evaluate software complexity in software engineering. The developed measure consists of three sub-measures that can quantify step complexity from various viewpoints. To verify the developed SC measure, estimated SC values are compared with subjective task load scores obtained from the NASA-TLX (task load index) technique and the step performance time data obtained from a full scope simulator. From these comparisons, it can be concluded that the SC measure seems to be appropriate for step difficulty evaluations because estimated SC values generally agree with the results of the subjective task load scores and the step performance time data. (author)
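
    The abstract does not give the exact sub-measures; as a rough illustration of the entropy idea only (not the published SC formulas), the information content of a procedure step can be estimated from the distribution of distinct elements it contains.

```python
import math
from collections import Counter

def shannon_entropy(elements):
    """First-order (Shannon) entropy of a sequence of elements, in bits.
    Shown only to illustrate how an entropy measure can serve as a
    complexity proxy; the published SC measure combines several
    entropies computed from a graph representation of the step."""
    counts = Counter(elements)
    n = len(elements)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# a step referencing many distinct plant parameters scores higher
simple_step = ["check", "SG", "level", "check", "SG", "level"]
complex_step = ["verify", "RCS", "pressure", "compare", "PZR", "level", "trend"]
print(shannon_entropy(simple_step), shannon_entropy(complex_step))
```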

  12. Power plant process computer

    International Nuclear Information System (INIS)

    Koch, R.

    1982-01-01

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line in respect to real-time requirements but not closed-loop in respect to closed-loop control. The general scope of tasks is: - alarm annunciation on CRT's - data logging - data recording for post trip reviews and plant behaviour analysis - nuclear data computation - graphic displays. Process computers are used additionally for dedicated tasks such as the aeroball measuring system, the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)

  13. Text-Based Language Teaching and the Analysis of Tasks Presented in English Course Books for Students of Information Technology and Computing

    Directory of Open Access Journals (Sweden)

    Valerija Marina

    2011-04-01

    Full Text Available The paper describes the essential features of a connected text helping to raise learners’ awareness of its structure and organization and improve their skills of reading comprehension. Classroom applications of various approaches to handling texts and text-based activities are also discussed and their main advantages and disadvantages are outlined.Tasks based on text transformation and reconstruction found in the course books of English for students of computing and information technology are analysed and their types are determined. The efficiency of the tasks is determined by considering the experience of the authors gained in using text-based assignments provided in these course books with the students of the above specialities. Some problems encountered in classroom application of the considered text-based tasks are also outlined.

  14. Influence of dual-tasking with different levels of attention diversion on characteristics of the movement-related cortical potential.

    Science.gov (United States)

    Aliakbaryhosseinabadi, Susan; Kamavuako, Ernest Nlandu; Jiang, Ning; Farina, Dario; Mrachacz-Kersting, Natalie

    2017-11-01

    Dual tasking is defined as performing two tasks concurrently and has been shown to have a significant effect on attention directed to the performance of the main task. In this study, an attention diversion task with two different levels was administered while participants had to complete a cue-based motor task consisting of foot dorsiflexion. An auditory oddball task with two levels of complexity was implemented to divert the user's attention. Electroencephalographic (EEG) recordings were made from nine single channels. Event-related potentials (ERPs) confirmed that the oddball task of counting a sequence of two tones decreased the auditory P300 amplitude more than the oddball task of counting one target tone among three different tones. Pre-movement features quantified from the movement-related cortical potential (MRCP) were changed significantly between single and dual-task conditions in motor and fronto-central channels. There was a significant delay in movement detection for the case of single tone counting in two motor channels only (237.1-247.4ms). For the task of sequence counting, motor cortex and frontal channels showed a significant delay in MRCP detection (232.1-250.5ms). This study investigated the effect of attention diversion in dual-task conditions by analysing both ERPs and MRCPs in single channels. The higher attention diversion led to a significant reduction in specific MRCP features of the motor task. These results suggest that attention division in dual-tasking situations plays an important role in movement execution and detection. This has important implications in designing real-time brain-computer interface systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    Directory of Open Access Journals (Sweden)

    P. O. Umenne

    2012-12-01

    Full Text Available Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and when the task terminates, the Agents send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents’ execution were developed at the University of Surrey, UK in the 90s. The objective of the research was to develop a software-based computer architecture on which Agents execution could be explored. The combination of Intelligent Agents and HYDRA computer architecture gave rise to a new computer concept: the NET-Computer, in which the computing resources reside on the Internet. The Internet computers form the hardware and software resources, and the user is provided with a simple interface to access the Internet and run user tasks. The Agents autonomously roam the Internet (NET-Computer) executing the tasks. A growing segment of the Internet is E-Commerce for online shopping for products and services. The Internet computing resources provide a marketplace for product suppliers and consumers alike. Consumers are looking for suppliers selling products and services, while suppliers are looking for buyers. Searching the vast amount of information available on the Internet causes a great deal of problems for both consumers and suppliers. Intelligent Agents executing on the NET-Computer can surf through the Internet and select specific information of interest to the user. The simulation results show that Intelligent Agents executing on the HYDRA computer architecture could be applied in E-Commerce.

  16. Rules and more rules: the effects of multiple tasks, extensive training, and aging on task-switching performance.

    Science.gov (United States)

    Buchler, Norbou G; Hoyer, William J; Cerella, John

    2008-06-01

    Task-switching performance was assessed in young and older adults as a function of the number of task sets to be actively maintained in memory (varied from 1 to 4) over the course of extended training (5 days). Each of the four tasks required the execution of a simple computational algorithm, which was instantaneously cued by the color of the two-digit stimulus. Tasks were presented in pure (task set size 1) and mixed blocks (task set sizes 2, 3, 4), and the task sequence was unpredictable. By considering task switching beyond two tasks, we found evidence for a cognitive control system that is not overwhelmed by task set size load manipulations. Extended training eliminated age effects in task-switching performance, even when the participants had to manage the execution of up to four tasks. The results are discussed in terms of current theories of cognitive control, including task set inertia and production system postulates.

  17. Strategic Adaptation to Task Characteristics, Incentives, and Individual Differences in Dual-Tasking.

    Directory of Open Access Journals (Sweden)

    Christian P Janssen

    Full Text Available We investigate how good people are at multitasking by comparing behavior to a prediction of the optimal strategy for dividing attention between two concurrent tasks. In our experiment, 24 participants had to interleave entering digits on a keyboard with controlling a randomly moving cursor with a joystick. The difficulty of the tracking task was systematically varied as a within-subjects factor. Participants were also exposed to different explicit reward functions that varied the relative importance of the tracking task relative to the typing task (between-subjects. Results demonstrate that these changes in task characteristics and monetary incentives, together with individual differences in typing ability, influenced how participants choose to interleave tasks. This change in strategy then affected their performance on each task. A computational cognitive model was used to predict performance for a wide set of alternative strategies for how participants might have possibly interleaved tasks. This allowed for predictions of optimal performance to be derived, given the constraints placed on performance by the task and cognition. A comparison of human behavior with the predicted optimal strategy shows that participants behaved near optimally. Our findings have implications for the design and evaluation of technology for multitasking situations, as consideration should be given to the characteristics of the task, but also to how different users might use technology depending on their individual characteristics and their priorities.

  18. The scopolamine-reversal paradigm in rats and monkeys: the importance of computer-assisted operant-conditioning memory tasks for screening drug candidates.

    Science.gov (United States)

    Buccafusco, Jerry J; Terry, Alvin V; Webster, Scott J; Martin, Daniel; Hohnadel, Elizabeth J; Bouchard, Kristy A; Warner, Samantha E

    2008-08-01

    The scopolamine-reversal model is enjoying a resurgence of interest in clinical studies as a reversible pharmacological model for Alzheimer's disease (AD). The cognitive impairment associated with scopolamine is similar to that in AD. The scopolamine model is not simply a cholinergic model, as it can be reversed by drugs that are noncholinergic cognition-enhancing agents. The objective of the study was to determine relevance of computer-assisted operant-conditioning tasks in the scopolamine-reversal model in rats and monkeys. Rats were evaluated for their acquisition of a spatial reference memory task in the Morris water maze. A separate cohort was proficient in performance of an automated delayed stimulus discrimination task (DSDT). Rhesus monkeys were proficient in the performance of an automated delayed matching-to-sample task (DMTS). The AD drug donepezil was evaluated for its ability to reverse the decrements in accuracy induced by scopolamine administration in all three tasks. In the DSDT and DMTS tasks, the effects of donepezil were delay (retention interval)-dependent, affecting primarily short delay trials. Donepezil produced significant but partial reversals of the scopolamine-induced impairment in task accuracies after 2 mg/kg in the water maze, after 1 mg/kg in the DSDT, and after 50 microg/kg in the DMTS task. The two operant-conditioning tasks (DSDT and DMTS) provided data most in keeping with those reported in clinical studies with these drugs. The model applied to nonhuman primates provides an excellent transitional model for new cognition-enhancing drugs before clinical trials.

  19. The effects of dual tasking on gait synchronization during over-ground side-by-side walking.

    Science.gov (United States)

    Zivotofsky, Ari Z; Bernad-Elazari, Hagar; Grossman, Pnina; Hausdorff, Jeffrey M

    2018-03-23

    Recent studies have shown that gait synchronization during natural walking is not merely anecdotal, but it is a repeatable phenomenon that is quantifiable and is apparently related to available sensory feedback modalities. However, the mechanisms underlying this phase-locking of gait have only recently begun to be investigated. For example, it is not known what role, if any, attention plays. We employed a dual tasking paradigm in order to investigate the role attention plays in gait synchronization. Sixteen pairs of subjects walked under six conditions that manipulated the available sensory feedback and the degree of difficulty of the dual task, i.e., the attention. Movement was quantified using a trunk-mounted tri-axial accelerometer. A gait synchronization index (GSI) was calculated in order to quantify the degree of synchronization of the gait pattern. A simple dual task resulted in an increased level of synchronization, whereas a more complex dual task led to a reduction in synchronization. Handholding increased synchronization, compared to the same attention condition without handholding. These results indicate that in order for two walkers to synchronize, some level of attention is apparently required, such that a relatively complex dual task utilizes enough attentional resources to reduce the occurrence of synchronization. Copyright © 2018 Elsevier B.V. All rights reserved.
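
    The abstract does not define the GSI formula; a common way to quantify phase-locking between two gait signals is a phase-locking value computed from Hilbert-transform phases of band-passed trunk accelerations. The sketch below is an assumption of ours, not necessarily the index used in the paper.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def gait_sync_index(acc_a, acc_b, fs, band=(0.5, 3.0)):
    """Phase-locking value between two trunk acceleration signals,
    band-passed around typical stride frequencies. Returns a value in
    [0, 1]; 1 means the two gait patterns are perfectly phase-locked."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    xa = filtfilt(b, a, acc_a)
    xb = filtfilt(b, a, acc_b)
    phase_a = np.angle(hilbert(xa))
    phase_b = np.angle(hilbert(xb))
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))
```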

  20. Trunk sway analysis to quantify the warm-up phenomenon in myotonia congenita patients.

    NARCIS (Netherlands)

    Horlings, G.C.; Drost, G.; Bloem, B.R.; Trip, J.; Pieterse, A.J.; Engelen, B.G.M. van; Allum, J.H.J.

    2009-01-01

    OBJECTIVE: Patients with autosomal recessive myotonia congenita display myotonia and transient paresis that diminish with repetitive muscle contractions (warm-up phenomenon). A new approach is presented to quantify this warm-up phenomenon under clinically relevant gait and balance tasks. METHODS:

  1. Reinforcement learning in computer vision

    Science.gov (United States)

    Bernstein, A. V.; Burnaev, E. V.

    2018-04-01

    Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, various complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is one of the modern machine learning technologies in which learning is carried out through interaction with the environment. In recent years, Reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes the Reinforcement learning technology and its use for solving computer vision problems.

  2. Quantifying kinematics of purposeful movements to real, imagined, or absent functional objects: implications for modelling trajectories for robot-assisted ADL tasks.

    Science.gov (United States)

    Wisneski, Kimberly J; Johnson, Michelle J

    2007-03-23

    Robotic therapy is at the forefront of stroke rehabilitation. The Activities of Daily Living Exercise Robot (ADLER) was developed to improve carryover of gains after training by combining the benefits of Activities of Daily Living (ADL) training (motivation and functional task practice with real objects), with the benefits of robot mediated therapy (repeatability and reliability). In combining these two therapy techniques, we seek to develop a new model for trajectory generation that will support functional movements to real objects during robot training. We studied natural movements to real objects and report on how initial reaching movements are affected by real objects and how these movements deviate from the straight line paths predicted by the minimum jerk model, typically used to generate trajectories in robot training environments. We highlight key issues that need to be considered in modelling natural trajectories. Movement data was collected as eight normal subjects completed ADLs such as drinking and eating. Three conditions were considered: object absent, imagined, and present. This data was compared to predicted trajectories generated from implementing the minimum jerk model. The deviations in both the plane of the table (XY) and the sagittal plane of the torso (XZ) were examined for both reaches to a cup and to a spoon. Velocity profiles and curvature were also quantified for all trajectories. We hypothesized that movements performed with functional task constraints and objects would deviate from the minimum jerk trajectory model more than those performed under imaginary or object absent conditions. Trajectory deviations from the predicted minimum jerk model for these reaches were shown to depend on three variables: object presence, object orientation, and plane of movement. When subjects completed the cup reach their movements were more curved than for the spoon reach. The object present condition for the cup reach showed more curvature than in the object
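
    The minimum jerk model used as the baseline here has a standard closed form for a point-to-point reach; the sketch below generates such a straight-line reference trajectory (the prediction against which the measured reaches were compared).

```python
import numpy as np

def minimum_jerk(p0, pf, duration, n=100):
    """Minimum jerk trajectory between p0 and pf:
    x(t) = p0 + (pf - p0) * (10*tau**3 - 15*tau**4 + 6*tau**5), tau = t/T.
    Yields a straight path with a bell-shaped speed profile."""
    p0, pf = np.asarray(p0, float), np.asarray(pf, float)
    t = np.linspace(0.0, duration, n)
    tau = (t / duration)[:, None]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return t, p0 + (pf - p0) * s

# e.g. a predicted 1 s reach to a cup 30 cm ahead and 10 cm to the left
t, traj = minimum_jerk([0.0, 0.0, 0.0], [0.30, -0.10, 0.0], duration=1.0)
```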

  3. Quantifying kinematics of purposeful movements to real, imagined, or absent functional objects: Implications for modelling trajectories for robot-assisted ADL tasks**

    Directory of Open Access Journals (Sweden)

    Wisneski Kimberly J

    2007-03-01

    Full Text Available Abstract Background Robotic therapy is at the forefront of stroke rehabilitation. The Activities of Daily Living Exercise Robot (ADLER) was developed to improve carryover of gains after training by combining the benefits of Activities of Daily Living (ADL) training (motivation and functional task practice with real objects) with the benefits of robot mediated therapy (repeatability and reliability). In combining these two therapy techniques, we seek to develop a new model for trajectory generation that will support functional movements to real objects during robot training. We studied natural movements to real objects and report on how initial reaching movements are affected by real objects and how these movements deviate from the straight line paths predicted by the minimum jerk model, typically used to generate trajectories in robot training environments. We highlight key issues that need to be considered in modelling natural trajectories. Methods Movement data was collected as eight normal subjects completed ADLs such as drinking and eating. Three conditions were considered: object absent, imagined, and present. This data was compared to predicted trajectories generated from implementing the minimum jerk model. The deviations in both the plane of the table (XY) and the sagittal plane of the torso (XZ) were examined for both reaches to a cup and to a spoon. Velocity profiles and curvature were also quantified for all trajectories. Results We hypothesized that movements performed with functional task constraints and objects would deviate from the minimum jerk trajectory model more than those performed under imaginary or object absent conditions. Trajectory deviations from the predicted minimum jerk model for these reaches were shown to depend on three variables: object presence, object orientation, and plane of movement. When subjects completed the cup reach their movements were more curved than for the spoon reach. The object present condition for the cup

  4. Effects of Distracting Task with Different Mental Workload on Steady-State Visual Evoked Potential Based Brain Computer Interfaces—an Offline Study

    Directory of Open Access Journals (Sweden)

    Yawei Zhao

    2018-02-01

    Full Text Available Brain-computer interfaces (BCIs), independent of the brain's normal output pathways, are attracting an increasing amount of attention as devices that extract neural information. As a typical type of BCI system, the steady-state visual evoked potential (SSVEP)-based BCIs possess a high signal-to-noise ratio and information transfer rate. However, the current high speed SSVEP-BCIs were implemented with subjects concentrating on stimuli, and intentionally avoided additional tasks as distractors. This paper aimed to investigate how a distracting simultaneous task, a verbal n-back task with different mental workload, would affect the performance of SSVEP-BCI. The results from fifteen subjects revealed that the recognition accuracy of SSVEP-BCI was significantly impaired by the distracting task, especially under a high mental workload. The average classification accuracy across all subjects dropped by 8.67% at most from 1- to 4-back, and there was a significant negative correlation (maximum r = −0.48, p < 0.001) between accuracy and subjective mental workload evaluation of the distracting task. This study suggests a potential hindrance to daily use of SSVEP-BCI, and improvements should be investigated in future studies.

  5. Adaptive Incremental Genetic Algorithm for Task Scheduling in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kairong Duan

    2018-05-01

    Full Text Available Cloud computing is a new commercial model that enables customers to acquire large amounts of virtual resources on demand. Resources including hardware and software can be delivered as services and measured by specific usage of storage, processing, bandwidth, etc. In Cloud computing, task scheduling is a process of mapping cloud tasks to Virtual Machines (VMs. When binding the tasks to VMs, the scheduling strategy has an important influence on the efficiency of datacenter and related energy consumption. Although many traditional scheduling algorithms have been applied in various platforms, they may not work efficiently due to the large number of user requests, the variety of computation resources and complexity of Cloud environment. In this paper, we tackle the task scheduling problem which aims to minimize makespan by Genetic Algorithm (GA. We propose an incremental GA which has adaptive probabilities of crossover and mutation. The mutation and crossover rates change according to generations and also vary between individuals. Large numbers of tasks are randomly generated to simulate various scales of task scheduling problem in Cloud environment. Based on the instance types of Amazon EC2, we implemented virtual machines with different computing capacity on CloudSim. We compared the performance of the adaptive incremental GA with that of Standard GA, Min-Min, Max-Min , Simulated Annealing and Artificial Bee Colony Algorithm in finding the optimal scheme. Experimental results show that the proposed algorithm can achieve feasible solutions which have acceptable makespan with less computation time.
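
    As a rough sketch of the idea (not the authors' exact operators or rate schedules), a GA for makespan minimization can encode each chromosome as a task-to-VM assignment and adapt its crossover and mutation rates over generations; the example below assumes task lengths and VM speeds are known.

```python
import random

def makespan(assignment, task_len, vm_speed):
    """Completion time of the busiest VM under a task -> VM assignment."""
    load = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return max(load)

def adaptive_ga(task_len, vm_speed, pop=40, gens=200):
    n, m = len(task_len), len(vm_speed)
    population = [[random.randrange(m) for _ in range(n)] for _ in range(pop)]
    best = min(population, key=lambda c: makespan(c, task_len, vm_speed))
    for g in range(gens):
        # adaptive rates: more exploration early, more exploitation late (illustrative)
        p_cross = 0.9 - 0.5 * g / gens
        p_mut = 0.3 * (1 - g / gens) + 0.02
        scored = sorted(population, key=lambda c: makespan(c, task_len, vm_speed))
        parents = scored[: pop // 2]
        children = []
        while len(children) < pop:
            a, b = random.sample(parents, 2)
            child = a[:]
            if random.random() < p_cross:
                cut = random.randrange(1, n)
                child = a[:cut] + b[cut:]
            child = [random.randrange(m) if random.random() < p_mut else gene
                     for gene in child]
            children.append(child)
        population = children
        cand = min(population, key=lambda c: makespan(c, task_len, vm_speed))
        if makespan(cand, task_len, vm_speed) < makespan(best, task_len, vm_speed):
            best = cand
    return best, makespan(best, task_len, vm_speed)
```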

  6. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    Full Text Available The article discusses the widely used classic method of analysis, forecasting and decision-making in various economic problems, called SWOT analysis. As is known, it is a qualitative comparison of the multicriteria degree of Strength, Weakness, Opportunity and Threat for different kinds of risks, for forecasting market development, and for assessing the status and prospects of development of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of different project management tasks - investment, innovation, marketing, development, design, bringing products to market, and so on. However, in practical competitive market and economic conditions there are various uncertainties, ambiguities and kinds of vagueness, which make the use of SWOT analysis in the classical sense insufficiently justified and ineffective. In this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the respective task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc for processing and operating with fuzzy input data are also given. Finally, considerations for interpreting the results are presented.

  7. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    OpenAIRE

    P. O. Umenne; M. O. Odhiambo

    2012-01-01

    Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and when the task terminates, the Agents send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents’ ex...

  8. Primary or secondary tasks? Dual-task interference between cyclist hazard perception and cadence control using cross-modal sensory aids with rider assistance bike computers.

    Science.gov (United States)

    Yang, Chao-Yang; Wu, Cheng-Tse

    2017-03-01

    This research investigated the risks involved in bicycle riding while using various sensory modalities to deliver training information. To understand the risks associated with using bike computers, this study evaluated hazard perception performance through lab-based simulations of authentic riding conditions. Analysing hazard sensitivity (d') of signal detection theory, the rider's response time, and eye glances provided insights into the risks of using bike computers. In this study, 30 participants were tested with eight hazard perception tasks while they maintained a cadence of 60 ± 5 RPM and used bike computers with different sensory displays, namely visual, auditory, and tactile feedback signals. The results indicated that synchronously using different sense organs to receive cadence feedback significantly affects hazard perception performance; direct visual information leads to the worst rider distraction, with a mean sensitivity to hazards (d') of -1.03. For systems with multiple interacting sensory aids, auditory aids were found to result in the greatest reduction in sensitivity to hazards (d' mean = -0.57), whereas tactile sensory aids reduced the degree of rider distraction (d' mean = -0.23). Our work complements existing work in this domain by advancing the understanding of how to design devices that deliver information subtly, thereby preventing disruption of a rider's perception of road hazards. Copyright © 2016 Elsevier Ltd. All rights reserved.
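
    Hazard sensitivity d′ from signal detection theory, used throughout this abstract, is computed from hit and false-alarm rates; a minimal sketch (using a common log-linear correction for rates of exactly 0 or 1, which is our assumption rather than the paper's stated method) is:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false alarm rate); the +0.5/+1 correction
    avoids infinite z-scores when a rate is exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# e.g. 18 hazards detected out of 24, 3 false alarms in 24 non-hazard windows
print(d_prime(18, 6, 3, 21))
```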

  9. Simulation of a Real-Time Brain Computer Interface for Detecting a Self-Paced Hitting Task.

    Science.gov (United States)

    Hammad, Sofyan H; Kamavuako, Ernest N; Farina, Dario; Jensen, Winnie

    2016-12-01

    An invasive brain-computer interface (BCI) is a promising neurorehabilitation device for severely disabled patients. Although some systems have been shown to work well in restricted laboratory settings, their utility must be tested in less controlled, real-time environments. Our objective was to investigate whether a specific motor task could be reliably detected from multiunit intracortical signals from freely moving animals in a simulated, real-time setting. Intracortical signals were first obtained from electrodes placed in the primary motor cortex of four rats that were trained to hit a retractable paddle (defined as a "Hit"). In the simulated real-time setting, the signal-to-noise-ratio was first increased by wavelet denoising. Action potentials were detected, and features were extracted (spike count, mean absolute values, entropy, and combination of these features) within pre-defined time windows (200 ms, 300 ms, and 400 ms) to classify the occurrence of a "Hit." We found higher detection accuracy of a "Hit" (73.1%, 73.4%, and 67.9% for the three window sizes, respectively) when the decision was made based on a combination of features rather than on a single feature. However, the duration of the window length was not statistically significant (p = 0.5). Our results showed the feasibility of detecting a motor task in real time in a less restricted environment compared to environments commonly applied within invasive BCI research, and they showed the feasibility of using information extracted from multiunit recordings, thereby avoiding the time-consuming and complex task of extracting and sorting single units. © 2016 International Neuromodulation Society.

  10. Discovery of high-level tasks in the operating room

    NARCIS (Netherlands)

    Bouarfa, L.; Jonker, P.P.; Dankelman, J.

    2010-01-01

    Recognizing and understanding surgical high-level tasks from sensor readings is important for surgical workflow analysis. Surgical high-level task recognition is also a challenging task in ubiquitous computing because of the inherent uncertainty of sensor data and the complexity of the operating

  11. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  12. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    Science.gov (United States)

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer assisted video and digital image analysis. In previous studies severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired and to the extent that survival in the field would be jeopardized, as fish would be swept downstream, or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures to determine time to effect during early lifestage development, and to understand the significance of these responses relative to survival of these vulnerable early lifestage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.

  13. Checkpointing for a hybrid computing node

    Science.gov (United States)

    Cher, Chen-Yong

    2016-03-08

    According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.

  14. A Computational Framework for Quantifying and Optimizing the Performance of Observational Networks in 4D-Var Data Assimilation

    Science.gov (United States)

    Cioaca, Alexandru

    A deep scientific understanding of complex physical systems, such as the atmosphere, can be achieved neither by direct measurements nor by numerical simulations alone. Data assimilation is a rigorous procedure to fuse information from a priori knowledge of the system state, the physical laws governing the evolution of the system, and real measurements, all with associated error statistics. Data assimilation produces best (a posteriori) estimates of model states and parameter values, and results in considerably improved computer simulations. The acquisition and use of observations in data assimilation raises several important scientific questions related to optimal sensor network design, quantification of data impact, pruning redundant data, and identifying the most beneficial additional observations. These questions originate in operational data assimilation practice, and have started to attract considerable interest in the recent past. This dissertation advances the state of knowledge in four dimensional variational (4D-Var) data assimilation by developing, implementing, and validating a novel computational framework for estimating observation impact and for optimizing sensor networks. The framework builds on the powerful methodologies of second-order adjoint modeling and the 4D-Var sensitivity equations. Efficient computational approaches for quantifying the observation impact include matrix free linear algebra algorithms and low-rank approximations of the sensitivities to observations. The sensor network configuration problem is formulated as a meta-optimization problem. Best values for parameters such as sensor location are obtained by optimizing a performance criterion, subject to the constraint posed by the 4D-Var optimization. Tractable computational solutions to this "optimization-constrained" optimization problem are provided. The results of this work can be directly applied to the deployment of intelligent sensors and adaptive observations, as well as

  15. Task-oriented structural design of manipulators based on operability evaluation

    International Nuclear Information System (INIS)

    Kotosaka, Shin-ya; Asama, Hajime; Takata, Shozo; Hiraoka, Hiroyuki; Kohda, Takehisa; Matsumoto, Akihiro; Endo, Isao.

    1995-01-01

    In this paper, a new method for designing the structure of manipulators based on evaluation of their adaptability to tasks is proposed. In the method, task directions are classified into three kinds of direction: operational direction, constrained direction and free direction. For each direction, the condition of constraints imposed by the task environment is represented. Tasks are represented by a set of directions and constraint conditions. A new criterion, operability, is defined to quantify the adaptability of a manipulator to tasks, taking account of mobility in operational directions and immobility in constrained directions. The mobility and immobility are calculated based on the Jacobian matrix of the manipulator. The operability evaluation method is implemented, and applied to structural design of manipulators, in which link parameters are optimized by the genetic algorithm. This system can derive a suitable manipulator structure for various tasks. The effectiveness of the system is shown concerning examples of welding tasks. (author)
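
    The abstract does not give the operability formula; a closely related and standard Jacobian-based quantity is Yoshikawa's manipulability, and directional mobility can be read off the manipulability ellipsoid. The sketch below is illustrative of that general idea, not the paper's operability measure.

```python
import numpy as np

def manipulability(J):
    """Yoshikawa manipulability w = sqrt(det(J @ J.T)); larger values mean
    the end effector can move more freely in task space."""
    return np.sqrt(np.linalg.det(J @ J.T))

def mobility_along(J, direction):
    """Maximum end-effector speed achievable along a unit task-space
    direction with a unit-norm joint velocity, i.e. the support of the
    manipulability ellipsoid in that direction (illustrative proxy)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    return np.linalg.norm(J.T @ d)
```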

  16. Assessing Human Judgment of Computationally Generated Swarming Behavior

    Directory of Open Access Journals (Sweden)

    John Harvey

    2018-02-01

    Full Text Available Computer-based swarm systems, aiming to replicate the flocking behavior of birds, were first introduced by Reynolds in 1987. In his initial work, Reynolds noted that while it was difficult to quantify the dynamics of the behavior from the model, observers of his model immediately recognized them as a representation of a natural flock. Considerable analysis has been conducted since then on quantifying the dynamics of flocking/swarming behavior. However, no systematic analysis has been conducted on human identification of swarming. In this paper, we assess subjects’ assessment of the behavior of a simplified version of Reynolds’ model. Factors that affect the identification of swarming are discussed and future applications of the resulting models are proposed. Differences in decision times for swarming-related questions asked during the study indicate that different brain mechanisms may be involved in different elements of the behavior assessment task. The relatively simple but finely tunable model used in this study provides a useful methodology for assessing individual human judgment of swarming behavior.
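
    Reynolds' model reduces to three steering rules: separation, alignment, and cohesion. A simplified version, of the kind judged by subjects in this study, can be sketched as follows; all gains and the neighbourhood radius are illustrative values, not those used in the paper.

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, radius=2.0,
               w_sep=1.5, w_ali=1.0, w_coh=1.0, v_max=1.0):
    """One update of a simplified Reynolds flocking model.
    pos, vel: (N, 2) arrays of agent positions and velocities."""
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        offsets = pos - pos[i]
        dist = np.linalg.norm(offsets, axis=1)
        neighbours = (dist < radius) & (dist > 0)
        if neighbours.any():
            separation = -(offsets[neighbours] / dist[neighbours, None] ** 2).sum(axis=0)
            alignment = vel[neighbours].mean(axis=0) - vel[i]
            cohesion = pos[neighbours].mean(axis=0) - pos[i]
            new_vel[i] += dt * (w_sep * separation + w_ali * alignment + w_coh * cohesion)
            speed = np.linalg.norm(new_vel[i])
            if speed > v_max:
                new_vel[i] *= v_max / speed
    return pos + dt * new_vel, new_vel
```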

  17. GPScheDVS: A New Paradigm of the Autonomous CPU Speed Control for Commodity-OS-based General-Purpose Mobile Computers with a DVS-friendly Task Scheduling

    OpenAIRE

    Kim, Sookyoung

    2008-01-01

    This dissertation studies the problem of increasing battery life-time and reducing CPU heat dissipation without degrading system performance in commodity-OS-based general-purpose (GP) mobile computers using the dynamic voltage scaling (DVS) function of modern CPUs. The dissertation especially focuses on the impact of task scheduling on the effectiveness of DVS in achieving this goal. The task scheduling mechanism used in most contemporary general-purpose operating systems (GPOS) prioritizes t...

  18. Asynchronous Task-Based Polar Decomposition on Manycore Architectures

    KAUST Repository

    Sukkari, Dalal

    2016-10-25

    This paper introduces the first asynchronous, task-based implementation of the polar decomposition on manycore architectures. Based on a new formulation of the iterative QR dynamically-weighted Halley algorithm (QDWH) for the calculation of the polar decomposition, the proposed implementation replaces the original and hostile LU factorization for the condition number estimator by the more adequate QR factorization to enable software portability across various architectures. Relying on fine-grained computations, the novel task-based implementation is also capable of taking advantage of the identity structure of the matrix involved during the QDWH iterations, which decreases the overall algorithmic complexity. Furthermore, the artifactual synchronization points have been severely weakened compared to previous implementations, unveiling look-ahead opportunities for better hardware occupancy. The overall QDWH-based polar decomposition can then be represented as a directed acyclic graph (DAG), where nodes represent computational tasks and edges define the inter-task data dependencies. The StarPU dynamic runtime system is employed to traverse the DAG, to track the various data dependencies and to asynchronously schedule the computational tasks on the underlying hardware resources, resulting in an out-of-order task scheduling. Benchmarking experiments show significant improvements against existing state-of-the-art high performance implementations (i.e., Intel MKL and Elemental) for the polar decomposition on latest shared-memory vendors' systems (i.e., Intel Haswell/Broadwell/Knights Landing, NVIDIA K80/P100 GPUs and IBM Power8), while maintaining high numerical accuracy.
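
    For reference, the polar decomposition itself (A = Up H with Up orthogonal and H symmetric positive semi-definite) can be computed naively via the SVD; QDWH reaches the same Up through QR-based, dynamically weighted iterations that are far better suited to fine-grained task parallelism. A sketch of the reference computation, not of QDWH:

```python
import numpy as np

def polar_decomposition(A):
    """Reference polar decomposition via the SVD: A = Up @ H,
    with Up orthogonal (unitary) and H symmetric positive semi-definite.
    QDWH computes the same Up iteratively using QR factorizations."""
    W, s, Vt = np.linalg.svd(A, full_matrices=False)
    Up = W @ Vt
    H = Vt.T @ np.diag(s) @ Vt
    return Up, H

A = np.random.rand(5, 5)
Up, H = polar_decomposition(A)
assert np.allclose(Up @ H, A)
```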

  19. Pointing Device Performance in Steering Tasks.

    Science.gov (United States)

    Senanayake, Ransalu; Goonetilleke, Ravindra S

    2016-06-01

    Use of touch-screen-based interactions is growing rapidly. Hence, knowing the maneuvering efficacy of touch screens relative to other pointing devices is of great importance in the context of graphical user interfaces. Movement time, accuracy, and user preferences of four pointing device settings were evaluated on a computer with 14 participants aged 20.1 ± 3.13 years. It was found that, depending on the difficulty of the task, the optimal settings differ for ballistic and visual control tasks. With a touch screen, resting the arm increased movement time for steering tasks. When both performance and comfort are considered, whether to use a mouse or a touch screen for person-computer interaction depends on the steering difficulty. Hence, a input device should be chosen based on the application, and should be optimized to match the graphical user interface. © The Author(s) 2016.

  20. Hybrid EEG-fNIRS Asynchronous Brain-Computer Interface for Multiple Motor Tasks.

    Directory of Open Access Journals (Sweden)

    Alessio Paolo Buccino

    Full Text Available Non-invasive Brain-Computer Interfaces (BCIs) have demonstrated great promise for neuroprosthetics and assistive devices. Here we aim to investigate methods to combine Electroencephalography (EEG) and functional Near-Infrared Spectroscopy (fNIRS) in an asynchronous Sensory Motor rhythm (SMR)-based BCI. We attempted to classify 4 different executed movements, namely, Right-Arm-Left-Arm-Right-Hand-Left-Hand tasks. Previous studies demonstrated the benefit of EEG-fNIRS combination. However, since normally fNIRS hemodynamic response shows a long delay, we investigated new features, involving slope indicators, in order to immediately detect changes in the signals. Moreover, Common Spatial Patterns (CSPs) have been applied to both EEG and fNIRS signals. 15 healthy subjects took part in the experiments and since 25 trials per class were available, CSPs have been regularized with information from the entire population of participants and optimized using genetic algorithms. The different features have been compared in terms of performance and the dynamic accuracy over trials shows that the introduced methods diminish the fNIRS delay in the detection of changes.
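
    Common Spatial Patterns, applied here to both EEG and fNIRS, finds spatial filters that maximize variance for one class while minimizing it for the other. A minimal two-class sketch (without the population-based regularization or the genetic-algorithm optimization described in the abstract) is:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=4):
    """Two-class CSP. trials_*: lists of (channels, samples) arrays.
    Returns spatial filters as rows: the first half maximize class-A
    variance, the last half maximize class-B variance."""
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))   # normalized trial covariance
        return np.mean(covs, axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # generalized eigenvalue problem: ca w = lambda (ca + cb) w
    eigvals, eigvecs = eigh(ca, ca + cb)
    w = eigvecs[:, np.argsort(eigvals)].T  # rows sorted by ascending eigenvalue
    half = n_filters // 2
    return np.vstack([w[-half:], w[:half]])
```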

  1. Integrating human and machine intelligence in galaxy morphology classification tasks

    Science.gov (United States)

    Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl

    2018-06-01

    Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
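
    SWAP's core step is a Bayesian update: each volunteer carries estimated per-class accuracies, and a galaxy's posterior probability of belonging to a class is updated after every vote. The following is a hedged sketch of that update only (the production system also handles user-skill estimation, thresholds, and subject retirement, none of which is shown here).

```python
def swap_update(prior, label, p_correct_featured, p_correct_smooth):
    """Posterior probability that a galaxy is 'featured' after one
    volunteer vote, given the volunteer's per-class accuracy estimates."""
    if label == "featured":
        like_featured = p_correct_featured        # true positive
        like_smooth = 1.0 - p_correct_smooth      # false positive
    else:
        like_featured = 1.0 - p_correct_featured  # miss
        like_smooth = p_correct_smooth            # correct rejection
    num = prior * like_featured
    return num / (num + (1.0 - prior) * like_smooth)

# start from a 50% prior and apply two 'featured' votes from a reliable user
p = 0.5
for _ in range(2):
    p = swap_update(p, "featured", p_correct_featured=0.9, p_correct_smooth=0.8)
print(p)   # posterior rises toward 1
```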

  2. From "rest" to language task: Task activation selects and prunes from broader resting-state network.

    Science.gov (United States)

    Doucet, Gaelle E; He, Xiaosong; Sperling, Michael R; Sharan, Ashwini; Tracy, Joseph I

    2017-05-01

    Resting-state networks (RSNs) show spatial patterns generally consistent with networks revealed during cognitive tasks. However, the exact degree of overlap between these networks has not been clearly quantified. Such an investigation shows promise for decoding altered functional connectivity (FC) related to abnormal language functioning in clinical populations such as temporal lobe epilepsy (TLE). In this context, we investigated the network configurations during a language task and during resting state using FC. Twenty-four healthy controls, 24 right and 24 left TLE patients completed a verb generation (VG) task and a resting-state fMRI scan. We compared the language network revealed by the VG task with three FC-based networks (seeding the left inferior frontal cortex (IFC)/Broca): two from the task (ON, OFF blocks) and one from the resting state. We found that, for both left TLE patients and controls, the RSN recruited regions bilaterally, whereas both VG-on and VG-off conditions produced more left-lateralized FC networks, matching more closely with the activated language network. TLE brings with it variability in both task-dependent and task-independent networks, reflective of atypical language organization. Overall, our findings suggest that our RSN captured bilateral activity, reflecting a set of prepotent language regions. We propose that this relationship can be best understood by the notion of pruning or winnowing down of the larger language-ready RSN to carry out specific task demands. Our data suggest that multiple types of network analyses may be needed to decode the association between language deficits and the underlying functional mechanisms altered by disease. Hum Brain Mapp 38:2540-2552, 2017. © 2017 Wiley Periodicals, Inc.

  3. Shoulder Strength Requirements for Upper Limb Functional Tasks: Do Age and Rotator Cuff Tear Status Matter?

    Science.gov (United States)

    Santago, Anthony C; Vidt, Meghan E; Li, Xiaotong; Tuohy, Christopher J; Poehling, Gary G; Freehill, Michael T; Saul, Katherine R

    2017-12-01

    Understanding upper limb strength requirements for daily tasks is imperative for early detection of strength loss that may progress to disability due to age or rotator cuff tear. We quantified shoulder strength requirements for 5 upper limb tasks performed by 3 groups: uninjured young adults and older adults, and older adults with a degenerative supraspinatus tear prior to repair. Musculoskeletal models were developed for each group representing age, sex, and tear-related strength losses. Percentage of available strength used was quantified for the subset of tasks requiring the largest amount of shoulder strength. Significant differences in strength requirements existed across tasks: upward reach 105° required the largest average strength; axilla wash required the largest peak strength. However, there were limited differences across participant groups. Older adults with and without a tear used a larger percentage of their available shoulder elevation strength, underscoring the need to quantify the strength requirements of functional tasks to effectively detect early strength loss, which may lead to disability.

  4. CX: A Scalable, Robust Network for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Peter Cappello

    2002-01-01

    Full Text Available CX, a network-based computational exchange, is presented. The system's design integrates variations of ideas from other researchers, such as work stealing, non-blocking tasks, eager scheduling, and space-based coordination. The object-oriented API is simple, compact, and cleanly separates application logic from the logic that supports interprocess communication and fault tolerance. Computations, of course, run to completion in the presence of computational hosts that join and leave the ongoing computation. Such hosts, or producers, use task caching and prefetching to overlap computation with interprocessor communication. To break a potential task server bottleneck, a network of task servers is presented. Even though task servers are envisioned as reliable, the self-organizing, scalable network of n servers, described as a sibling-connected height-balanced fat tree, tolerates a sequence of n-1 server failures. Tasks are distributed throughout the server network via a simple "diffusion" process. CX is intended as a test bed for research on automated silent auctions, reputation services, authentication services, and bonding services. CX also provides a test bed for algorithm research into network-based parallel computation.

  5. Application of computational fluid dynamics and pedestrian-behavior simulations to the design of task-ambient air-conditioning systems of a subway station

    Energy Technology Data Exchange (ETDEWEB)

    Fukuyo, Kazuhiro [Graduate School of Innovation and Technology Management, Faculty of Engineering, Yamaguchi University, Tokiwadai 2-16-1, Ube, Yamaguchi 755-8611 (Japan)

    2006-04-15

    The effects of task-ambient (TA) air-conditioning systems on the air-conditioning loads in a subway station and the thermal comfort of passengers were studied using computational fluid dynamics (CFD) and pedestrian-behavior simulations. The pedestrian-behavior model was applied to a standard subway station. Task areas were set up to match with crowdedness as predicted by the pedestrian-behavior simulations. Subsequently, a variety of TA air-conditioning systems were designed to selectively control the microclimate of the task areas. Their effects on the thermal environment in the station in winter were predicted by CFD. The results were compared with those of a conventional air-conditioning system and evaluated in relation to the thermal comfort of subway users and the air-conditioning loads. The comparison showed that TA air-conditioning systems improved thermal comfort and decreased air-conditioning loads. (author)

  6. An opportunity cost model of subjective effort and task performance

    Science.gov (United States)

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775

  7. The effect of the external regulator's absence on children's speech use, manifested self-regulation, and task performance during learning tasks

    NARCIS (Netherlands)

    Agina, Adel M.; Agina, Adel Masaud; Kommers, Petrus A.M.; Steehouder, M.F.

    2011-01-01

    The present study was conducted to explore the effect of the absence of the external regulators on children’s use of speech (private/social), task performance, and self-regulation during learning tasks. A novel methodology was employed through a computer-based learning environment that proposed

  8. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  9. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: The problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  10. Practice makes perfect: familiarity of task determines success in solvable tasks for free-ranging dogs (Canis lupus familiaris).

    Science.gov (United States)

    Bhattacharjee, Debottam; Dasgupta, Sandipan; Biswas, Arpita; Deheria, Jayshree; Gupta, Shreya; Nikhil Dev, N; Udell, Monique; Bhadra, Anindita

    2017-07-01

    Domestic dogs' (Canis lupus familiaris) socio-cognitive faculties have made them highly sensitive to human social cues. While dogs often excel at understanding human communicative gestures, they perform comparatively poorly in problem-solving and physical reasoning tasks. This difference in their behaviour could be due to the lifestyle and intense socialization, where problem solving and physical cognition are less important than social cognition. Free-ranging dogs live in human-dominated environments, not under human supervision and are less socialized. Being scavengers, they often encounter challenges where problem solving is required in order to get access to food. We tested Indian street dogs in familiar and unfamiliar independent solvable tasks and quantified their persistence and dependence on a novel human experimenter, in addition to their success in solving a task. Our results indicate that free-ranging dogs succeeded and persisted more in the familiar task as compared to the unfamiliar one. They showed negligible amount of human dependence in the familiar task, but showed prolonged gazing and considerable begging behaviour to the human experimenter in the context of the unfamiliar task. Cognitive abilities of free-ranging dogs thus play a pivotal role in determining task-associated behaviours based on familiarity. In addition to that, these dogs inherently tend to socialize with and depend on humans, even if they are strangers. Our results also illustrate free-ranging dogs' low competence at physical cognitive tasks.

  11. Lessons Learned From the Development and Parameterization of a Computer Simulation Model to Evaluate Task Modification for Health Care Providers.

    Science.gov (United States)

    Kasaie, Parastu; David Kelton, W; Ancona, Rachel M; Ward, Michael J; Froehle, Craig M; Lyons, Michael S

    2018-02-01

    Computer simulation is a highly advantageous method for understanding and improving health care operations with a wide variety of possible applications. Most computer simulation studies in emergency medicine have sought to improve allocation of resources to meet demand or to assess the impact of hospital and other system policies on emergency department (ED) throughput. These models have enabled essential discoveries that can be used to improve the general structure and functioning of EDs. Theoretically, computer simulation could also be used to examine the impact of adding or modifying specific provider tasks. Doing so involves a number of unique considerations, particularly in the complex environment of acute care settings. In this paper, we describe conceptual advances and lessons learned during the design, parameterization, and validation of a computer simulation model constructed to evaluate changes in ED provider activity. We illustrate these concepts using examples from a study focused on the operational effects of HIV screening implementation in the ED. Presentation of our experience should emphasize the potential for application of computer simulation to study changes in health care provider activity and facilitate the progress of future investigators in this field. © 2017 by the Society for Academic Emergency Medicine.

  12. Asynchronous Task-Based Polar Decomposition on Single Node Manycore Architectures

    KAUST Repository

    Sukkari, Dalal E.; Ltaief, Hatem; Faverge, Mathieu; Keyes, David E.

    2017-01-01

    This paper introduces the first asynchronous, task-based formulation of the polar decomposition and its corresponding implementation on manycore architectures. Based on a formulation of the iterative QR dynamically-weighted Halley algorithm (QDWH) for the calculation of the polar decomposition, the proposed implementation replaces the original LU factorization for the condition number estimator by the more adequate QR factorization to enable software portability across various architectures. Relying on fine-grained computations, the novel task-based implementation is capable of taking advantage of the identity structure of the matrix involved during the QDWH iterations, which decreases the overall algorithmic complexity. Furthermore, the artifactual synchronization points have been weakened compared to previous implementations, unveiling look-ahead opportunities for better hardware occupancy. The overall QDWH-based polar decomposition can then be represented as a directed acyclic graph (DAG), where nodes represent computational tasks and edges define the inter-task data dependencies. The StarPU dynamic runtime system is employed to traverse the DAG, to track the various data dependencies and to asynchronously schedule the computational tasks on the underlying hardware resources, resulting in an out-of-order task scheduling. Benchmarking experiments show significant improvements against existing state-of-the-art high performance implementations for the polar decomposition on latest shared-memory vendors' systems, while maintaining numerical accuracy.
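
    A full task-based QDWH implementation is far beyond a short example, but the object being computed can be illustrated with the classical Newton iteration for the polar factor, X_{k+1} = (X_k + X_k^{-T})/2, which converges to the orthogonal factor U of A = UH for nonsingular A. The sketch below is this simpler iteration for illustration only, not the QDWH algorithm or the StarPU-based implementation described in the record.

      # Simplified illustration: Newton iteration for the polar decomposition
      # A = U H (U orthogonal, H symmetric positive semidefinite).
      # This is NOT the QDWH algorithm of the record, just the classical iteration.
      import numpy as np

      def polar_newton(A, tol=1e-12, max_iter=100):
          X = A.astype(float)
          for _ in range(max_iter):
              X_next = 0.5 * (X + np.linalg.inv(X).T)   # X_{k+1} = (X_k + X_k^{-T}) / 2
              if np.linalg.norm(X_next - X, "fro") <= tol * np.linalg.norm(X_next, "fro"):
                  X = X_next
                  break
              X = X_next
          U = X
          H = U.T @ A
          H = 0.5 * (H + H.T)   # symmetrize to clean up rounding error
          return U, H

      rng = np.random.default_rng(0)
      A = rng.standard_normal((5, 5))
      U, H = polar_newton(A)
      print(np.allclose(U @ H, A), np.allclose(U.T @ U, np.eye(5)))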

  13. Asynchronous Task-Based Polar Decomposition on Single Node Manycore Architectures

    KAUST Repository

    Sukkari, Dalal E.

    2017-09-29

    This paper introduces the first asynchronous, task-based formulation of the polar decomposition and its corresponding implementation on manycore architectures. Based on a formulation of the iterative QR dynamically-weighted Halley algorithm (QDWH) for the calculation of the polar decomposition, the proposed implementation replaces the original LU factorization for the condition number estimator by the more adequate QR factorization to enable software portability across various architectures. Relying on fine-grained computations, the novel task-based implementation is capable of taking advantage of the identity structure of the matrix involved during the QDWH iterations, which decreases the overall algorithmic complexity. Furthermore, the artifactual synchronization points have been weakened compared to previous implementations, unveiling look-ahead opportunities for better hardware occupancy. The overall QDWH-based polar decomposition can then be represented as a directed acyclic graph (DAG), where nodes represent computational tasks and edges define the inter-task data dependencies. The StarPU dynamic runtime system is employed to traverse the DAG, to track the various data dependencies and to asynchronously schedule the computational tasks on the underlying hardware resources, resulting in an out-of-order task scheduling. Benchmarking experiments show significant improvements against existing state-of-the-art high performance implementations for the polar decomposition on latest shared-memory vendors' systems, while maintaining numerical accuracy.

  14. Dynamic balance during walking adaptability tasks in individuals post-stroke.

    Science.gov (United States)

    Vistamehr, Arian; Balasubramanian, Chitralakshmi K; Clark, David J; Neptune, Richard R; Fox, Emily J

    2018-04-24

    Maintaining dynamic balance during community ambulation is a major challenge post-stroke. Community ambulation requires performance of steady-state level walking as well as tasks that require walking adaptability. Prior studies on balance control post-stroke have mainly focused on steady-state walking, but walking adaptability tasks have received little attention. The purpose of this study was to quantify and compare dynamic balance requirements during common walking adaptability tasks post-stroke and in healthy adults and identify differences in underlying mechanisms used for maintaining dynamic balance. Kinematic data were collected from fifteen individuals with post-stroke hemiparesis during steady-state forward and backward walking, obstacle negotiation, and step-up tasks. In addition, data from ten healthy adults provided the basis for comparison. Dynamic balance was quantified using the peak-to-peak range of whole-body angular-momentum in each anatomical plane during the paretic, nonparetic and healthy control single-leg-stance phase of the gait cycle. To understand differences in some of the key underlying mechanisms for maintaining dynamic balance, foot placement and plantarflexor muscle activation were examined. Individuals post-stroke had significant dynamic balance deficits in the frontal plane across most tasks, particularly during the paretic single-leg-stance. Frontal plane balance deficits were associated with wider paretic foot placement, elevated body center-of-mass, and lower soleus activity. Further, the obstacle negotiation task imposed a higher balance requirement, particularly during the trailing leg single-stance. Thus, improving paretic foot placement and ankle plantarflexor activity, particularly during obstacle negotiation, may be important rehabilitation targets to enhance dynamic balance during post-stroke community ambulation. Copyright © 2018. Published by Elsevier Ltd.

  15. Are women better than men at multi-tasking?

    OpenAIRE

    Stoet, Gijsbert; O’Connor, Daryl B.; Conner, Mark; Laws, Keith R.

    2013-01-01

    Background: There seems to be a common belief that women are better in multi-tasking than men, but there is practically no scientific research on this topic. Here, we tested whether women have better multi-tasking skills than men. Methods: In Experiment 1, we compared performance of 120 women and 120 men in a computer-based task-switching paradigm. In Experiment 2, we compared a different group of 47 women and 47 men on "paper-and-pencil" multi-tasking tests.

  16. Subjective and objective quantification of physician's workload and performance during radiation therapy planning tasks.

    Science.gov (United States)

    Mazur, Lukasz M; Mosaly, Prithima R; Hoyle, Lesley M; Jones, Ellen L; Marks, Lawrence B

    2013-01-01

    To quantify, and compare, workload for several common physician-based treatment planning tasks using objective and subjective measures of workload. To assess the relationship between workload and performance to define workload levels where performance could be expected to decline. Nine physicians performed the same 3 tasks on each of 2 cases ("easy" vs "hard"). Workload was assessed objectively throughout the tasks (via monitoring of pupil size and blink rate), and subjectively at the end of each case (via National Aeronautics and Space Administration Task Load Index; NASA-TLX). NASA-TLX assesses 6 dimensions (mental, physical, and temporal demands, frustration, effort, and performance); scores > or ≈ 50 are associated with reduced performance in other industries. Performance was measured using participants' stated willingness to approve the treatment plan. Differences in subjective and objective workload between cases, tasks, and experience were assessed using analysis of variance (ANOVA). The correlation between subjective and objective workload measures was assessed via the Pearson correlation test. The relationships between workload and performance measures were assessed using the t test. Eighteen case-wise and 54 task-wise assessments were obtained. Subjective NASA-TLX scores were significantly lower for the easy vs hard case. Most correlations between the subjective and objective measures were not significant, except between average blink rate and NASA-TLX scores (r = -0.34, P = .02), for task-wise assessments. Performance appeared to decline at NASA-TLX scores of ≥55. The NASA-TLX may provide a reasonable method to quantify subjective workload for broad activities, and objective physiologic eye-based measures may be useful to monitor workload for more granular tasks within activities. The subjective and objective measures, as herein quantified, do not necessarily track each other, and more work is needed to assess their utilities. From a

  17. A complex network approach to cloud computing

    International Nuclear Information System (INIS)

    Travieso, Gonzalo; Ruggiero, Carlos Antônio; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2016-01-01

    Cloud computing has become an important means to speed up computing. One problem that heavily influences the performance of such systems is the choice of nodes as servers responsible for executing the clients’ tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate the processing performance of cloud systems underlaid by Erdős–Rényi (ER) and Barabási–Albert (BA) topologies containing two servers. Cloud networks involving two communities not necessarily of the same size are also considered in our analysis. The performance of each configuration is quantified in terms of the cost of communication between the client and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter, the ER topology provides better performance than the BA for smaller average degrees and the opposite behaviour for larger average degrees. With respect to cost, smaller values are found in the BA topology irrespective of the average degree. In addition, we also verified that it is easier to find good servers in ER than in BA networks. Surprisingly, balance and cost are not strongly affected by the presence of communities. However, for a well-defined community network, we found that it is important to assign each server to a different community so as to achieve better performance.
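
    The two performance measures described (mean client-to-nearest-server communication cost and the balance of task assignment between the two servers) are easy to reproduce on small synthetic graphs. The sketch below uses networkx with arbitrary sizes and degrees; it mirrors the general setup, not the exact parameters of the study.

      # Illustrative sketch of the record's setup: two servers embedded in an
      # Erdos-Renyi or Barabasi-Albert graph; clients are served by the nearer one.
      # Graph sizes and probabilities are arbitrary choices for the example.
      import networkx as nx

      def cost_and_balance(graph, servers):
          lengths = {s: nx.single_source_shortest_path_length(graph, s) for s in servers}
          assigned = {s: 0 for s in servers}
          total = 0
          clients = [v for v in graph if v not in servers
                     and any(v in lengths[s] for s in servers)]   # skip unreachable nodes
          for v in clients:
              dist = {s: lengths[s].get(v, float("inf")) for s in servers}
              nearest = min(dist, key=dist.get)
              assigned[nearest] += 1
              total += dist[nearest]
          cost = total / len(clients)                              # mean hops to nearest server
          balance = min(assigned.values()) / max(assigned.values())
          return cost, balance

      n_nodes = 200
      er = nx.erdos_renyi_graph(n_nodes, p=0.04, seed=1)
      ba = nx.barabasi_albert_graph(n_nodes, m=4, seed=1)
      for name, g in [("ER", er), ("BA", ba)]:
          print(name, cost_and_balance(g, servers=(0, 1)))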

  18. A Heuristic Task Scheduling Algorithm for Heterogeneous Virtual Clusters

    Directory of Open Access Journals (Sweden)

    Weiwei Lin

    2016-01-01

    Full Text Available Cloud computing provides on-demand computing and storage services with high performance and high scalability. However, the rising energy consumption of cloud data centers has become a prominent problem. In this paper, we first introduce an energy-aware framework for task scheduling in virtual clusters. The framework consists of a task resource requirements prediction module, an energy estimate module, and a scheduler with a task buffer. Secondly, based on this framework, we propose a virtual machine power efficiency-aware greedy scheduling algorithm (VPEGS). As a heuristic algorithm, VPEGS estimates task energy by considering factors including task resource demands, VM power efficiency, and server workload before scheduling tasks in a greedy manner. We simulated a heterogeneous VM cluster and conducted experiments to evaluate the effectiveness of VPEGS. Simulation results show that VPEGS effectively reduced total energy consumption by more than 20% without producing large scheduling overheads. With a similar heuristic ideology, it outperformed Min-Min and RASA with respect to energy saving by about 29% and 28%, respectively.
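
    A VPEGS-style rule can be paraphrased as: estimate the energy of placing each task on each feasible VM from the task's demand, the VM's power efficiency, and its current workload, then place the task greedily on the lowest estimate. The sketch below uses an invented energy model and invented numbers; it illustrates the greedy structure only, not the paper's actual estimator.

      # Hedged sketch of a greedy, power-efficiency-aware placement rule in the
      # spirit of VPEGS.  The energy model below is a stand-in, not the paper's.
      from dataclasses import dataclass, field

      @dataclass
      class VM:
          name: str
          capacity: float            # CPU capacity (arbitrary units)
          watts_per_unit: float      # energy cost per unit of work (power efficiency)
          load: float = 0.0
          tasks: list = field(default_factory=list)

      def estimated_energy(task_demand, task_runtime, vm):
          # Toy model: energy ~ demand * runtime * power-per-unit, inflated when the
          # VM is already heavily loaded (standing in for the server-workload term).
          load_penalty = 1.0 + vm.load / max(vm.capacity, 1e-9)
          return task_demand * task_runtime * vm.watts_per_unit * load_penalty

      def greedy_schedule(tasks, vms):
          for name, demand, runtime in tasks:           # tasks: (name, demand, runtime)
              feasible = [vm for vm in vms if vm.capacity - vm.load >= demand]
              best = min(feasible, key=lambda vm: estimated_energy(demand, runtime, vm))
              best.load += demand
              best.tasks.append(name)
          return vms

      vms = [VM("vm-efficient", 8, 1.0), VM("vm-fast-but-hungry", 16, 2.5)]
      tasks = [("t1", 2, 10), ("t2", 4, 5), ("t3", 3, 8), ("t4", 6, 2)]
      for vm in greedy_schedule(tasks, vms):
          print(vm.name, vm.tasks, f"load={vm.load}")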

  19. Opening the Frey/Osborne Black Box: Which Tasks of a Job are Susceptible to Computerization?

    OpenAIRE

    Brandes, Philipp; Wattenhofer, Roger

    2016-01-01

    In their seminal paper, Frey and Osborne quantified the automation of jobs, by assigning each job in the O*NET database a probability to be automated. In this paper, we refine their results in the following way: Every O*NET job consists of a set of tasks, and these tasks can be related. We use a linear program to assign probabilities to tasks, such that related tasks have a similar probability and the tasks can explain the computerization probability of a job. Analyzing jobs on the level of t...
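
    The idea can be reconstructed as a small linear program: task probabilities must average to each job's automation probability while related tasks are kept close, with absolute differences handled through slack variables. The jobs, tasks, and relatedness pairs below are invented, and this toy formulation is only a guess at the spirit of the paper's linear program, not its actual model.

      # Toy reconstruction: assign each task a probability so that the tasks of a
      # job average to the job's automation probability while related tasks stay
      # similar.  Jobs, tasks and pairs here are invented.
      import numpy as np
      from scipy.optimize import linprog

      n_tasks = 4
      job_of_task = [0, 0, 1, 1]            # tasks 0,1 belong to job 0; tasks 2,3 to job 1
      job_prob = [0.7, 0.2]                 # Frey/Osborne-style job probabilities
      related = [(1, 2), (0, 3)]            # related task pairs shared across jobs

      n_pairs = len(related)
      n_vars = n_tasks + n_pairs            # p_0..p_3, then one slack d per pair

      c = np.zeros(n_vars)
      c[n_tasks:] = 1.0                     # minimize sum of |p_a - p_b| via slacks d

      A_ub, b_ub = [], []
      for k, (a, b) in enumerate(related):  # enforce d_k >= |p_a - p_b|
          row = np.zeros(n_vars); row[a], row[b], row[n_tasks + k] = 1, -1, -1
          A_ub.append(row); b_ub.append(0.0)
          row = np.zeros(n_vars); row[a], row[b], row[n_tasks + k] = -1, 1, -1
          A_ub.append(row); b_ub.append(0.0)

      A_eq, b_eq = [], []
      for j, pj in enumerate(job_prob):     # tasks of each job average to the job prob
          members = [t for t in range(n_tasks) if job_of_task[t] == j]
          row = np.zeros(n_vars); row[members] = 1.0 / len(members)
          A_eq.append(row); b_eq.append(pj)

      bounds = [(0, 1)] * n_tasks + [(0, None)] * n_pairs
      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print("task probabilities:", np.round(res.x[:n_tasks], 3))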

  20. A formalization of computational trust

    NARCIS (Netherlands)

    Güven - Ozcelebi, C.; Holenderski, M.J.; Ozcelebi, T.; Lukkien, J.J.

    2018-01-01

    Computational trust aims to quantify trust and is studied by many disciplines including computer science, social sciences and business science. We propose a formal computational trust model, including its parameters and operations on these parameters, as well as a step by step guide to compute trust

  1. Task-focused modeling in automated agriculture

    Science.gov (United States)

    Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack

    1993-01-01

    Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.

  2. Applying genetic algorithms for programming manufacturing cell tasks

    Directory of Open Access Journals (Sweden)

    Efredy Delgado

    2005-05-01

    Full Text Available This work was aimed at developing computational intelligence for scheduling a manufacturing cell's tasks, based mainly on genetic algorithms. The manufacturing cell was modelled as a production line; the makespan was calculated using heuristics adapted from several genetic algorithm libraries implemented in C++ Builder. Several problems dealing with small, medium and large lists of jobs and machines were solved. The results were compared with other heuristics. The approach developed here seems promising for future research concerning the scheduling of manufacturing cell tasks involving mixed batches.
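
    As a generic illustration of the approach (not the authors' C++ Builder implementation), the sketch below evolves job permutations for a small production line with random processing times, using an order-preserving crossover and a swap mutation to minimize makespan.

      # Hedged sketch: a small permutation GA minimizing flow-line makespan, as a
      # generic stand-in for the approach in the record (not the authors' code).
      import random

      random.seed(1)
      N_JOBS, N_MACHINES = 8, 4
      proc = [[random.randint(1, 9) for _ in range(N_MACHINES)] for _ in range(N_JOBS)]

      def makespan(order):
          """Completion time of the last job on the last machine of a flow line."""
          finish = [0] * N_MACHINES
          for job in order:
              for m in range(N_MACHINES):
                  prev = finish[m - 1] if m > 0 else 0
                  finish[m] = max(finish[m], prev) + proc[job][m]
          return finish[-1]

      def crossover(a, b):
          """Copy a segment from parent a, fill the rest in the order of parent b."""
          i, j = sorted(random.sample(range(len(a)), 2))
          child = [None] * len(a)
          child[i:j] = a[i:j]
          rest = [g for g in b if g not in child]
          for k in range(len(a)):
              if child[k] is None:
                  child[k] = rest.pop(0)
          return child

      def mutate(order, rate=0.2):
          if random.random() < rate:
              i, j = random.sample(range(len(order)), 2)
              order[i], order[j] = order[j], order[i]
          return order

      pop = [random.sample(range(N_JOBS), N_JOBS) for _ in range(30)]
      for gen in range(100):
          pop.sort(key=makespan)
          elite = pop[:10]
          children = [mutate(crossover(*random.sample(elite, 2))) for _ in range(20)]
          pop = elite + children
      best = min(pop, key=makespan)
      print("best order:", best, "makespan:", makespan(best))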

  3. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control systems for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long, cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP) [de]

  4. Computation of Buffer Capacities for Throughput Constrained and Data Dependent Inter-Task Communication

    NARCIS (Netherlands)

    Wiggers, M.H.; Bekooij, Marco Jan Gerrit; Bekooij, Marco J.G.; Smit, Gerardus Johannes Maria

    2008-01-01

    Streaming applications are often implemented as task graphs. Currently, techniques exist to derive buffer capacities that guarantee satisfaction of a throughput constraint for task graphs in which the inter-task communication is data-independent, i.e. the amount of data produced and consumed is

  5. Task-based image quality evaluation of iterative reconstruction methods for low dose CT using computer simulations

    Science.gov (United States)

    Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.

    2015-04-01

    Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improve image quality or reduce radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 to 25% D0. A lesion of fixed size and contrast was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20) in a signal known exactly (SKE), background known exactly but variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal performance criterion to derive the potential dose reduction factor of IR. In general, there was a good agreement in the relative AUC values of different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
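
    The channelized Hotelling observer itself is compact enough to sketch: project signal-present and signal-absent samples onto a few channels, build the Hotelling template from the pooled channel covariance, and summarize detection performance with the AUC. The 1-D "images", Gaussian channels, and white noise below are stand-ins, not the study's RS/RO channels or CT reconstructions.

      # Hedged sketch of a channelized Hotelling observer (CHO) on synthetic data.
      import numpy as np

      rng = np.random.default_rng(0)
      n_pix, n_train = 64, 200

      # Three crude Gaussian channel profiles of different widths.
      x = np.arange(n_pix)
      channels = np.stack([np.exp(-0.5 * ((x - n_pix / 2) / w) ** 2) for w in (2, 6, 12)])

      signal = 0.8 * np.exp(-0.5 * ((x - n_pix / 2) / 3) ** 2)     # known, centered signal

      def sample(with_signal, n):
          imgs = rng.normal(0, 1, size=(n, n_pix))                 # white-noise background
          if with_signal:
              imgs += signal
          return imgs @ channels.T                                  # channel outputs (n, 3)

      v_sig, v_bkg = sample(True, n_train), sample(False, n_train)
      cov = 0.5 * (np.cov(v_sig.T) + np.cov(v_bkg.T))               # pooled channel covariance
      template = np.linalg.solve(cov, v_sig.mean(0) - v_bkg.mean(0))  # Hotelling template

      t_sig, t_bkg = sample(True, 500) @ template, sample(False, 500) @ template
      auc = np.mean(t_sig[:, None] > t_bkg[None, :])                # rank-based (Mann-Whitney) AUC
      print(f"CHO AUC = {auc:.3f}")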

  6. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there is no physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task

  7. Modified CC-LR algorithm with three diverse feature sets for motor imagery tasks classification in EEG based brain-computer interface.

    Science.gov (United States)

    Siuly; Li, Yan; Paul Wen, Peng

    2014-03-01

    Motor imagery (MI) tasks classification provides an important basis for designing brain-computer interface (BCI) systems. If the MI tasks are reliably distinguished through identifying typical patterns in electroencephalography (EEG) data, a motor-disabled person could communicate with a device by composing sequences of these mental states. In our earlier study, we developed a cross-correlation based logistic regression (CC-LR) algorithm for the classification of MI tasks for BCI applications, but its performance was not satisfactory. This study develops a modified version of the CC-LR algorithm exploring a suitable feature set that can improve the performance. The modified CC-LR algorithm uses the C3 electrode channel (in the international 10-20 system) as a reference channel for the cross-correlation (CC) technique and applies three diverse feature sets separately, as the input to the logistic regression (LR) classifier. The present algorithm investigates which feature set is the best to characterize the distribution of MI tasks based EEG data. This study also provides an insight into how to select a reference channel for the CC technique with EEG signals considering the anatomical structure of the human brain. The proposed algorithm is compared with eight of the most recently reported well-known methods including the BCI III Winner algorithm. The findings of this study indicate that the modified CC-LR algorithm has potential to improve the identification performance of MI tasks in BCI systems. The results demonstrate that the proposed technique provides a classification improvement over the existing methods tested. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
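
    The general CC-LR recipe (cross-correlate each trial with a reference-channel template, reduce the cross-correlation sequence to a handful of statistics, and classify with logistic regression) can be sketched on synthetic two-class trials as below; the signals, reference, and exact feature set are assumptions for illustration, not the paper's data or tuning.

      # Hedged sketch of the cross-correlation + logistic-regression idea on
      # synthetic two-class "EEG" trials (not the paper's data or exact features).
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(42)
      fs, n_trials, n_samp = 128, 120, 256
      t = np.arange(n_samp) / fs

      def make_trial(label):
          freq = 10.0 if label == 0 else 22.0          # mu-band vs beta-band rhythm
          return np.sin(2 * np.pi * freq * t) + rng.normal(0, 1.0, n_samp)

      reference = np.sin(2 * np.pi * 10.0 * t)         # stand-in for the C3 reference signal

      def cc_features(trial):
          cc = np.correlate(trial, reference, mode="full")
          return [cc.mean(), cc.std(), cc.max(), cc.min(), np.median(cc), np.mean(cc ** 2)]

      labels = rng.integers(0, 2, n_trials)
      X = np.array([cc_features(make_trial(y)) for y in labels])

      clf = LogisticRegression(max_iter=1000)
      scores = cross_val_score(clf, X, labels, cv=5)
      print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))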

  8. Grammatical production deficits in PPA: Relating narrative and structured task performance

    Directory of Open Access Journals (Sweden)

    Elena Barbieri

    2015-05-01

    Full Text Available Introduction Grammatical production impairments in primary progressive aphasia (PPA) have been investigated using structured language tasks and analysis of narrative language samples (for review see Thompson & Mack, 2014; Wilson et al., 2012). However, little research has examined the relationship between them in PPA. Whereas structured tasks often assess production accuracy at different levels of syntactic complexity (e.g., Thompson et al., 2013), narrative measures typically assess overall lexical and grammatical usage (e.g., % grammatical sentences; noun-to-verb ratio), with lesser emphasis on complexity. The present study investigated the relationship between narrative measures of grammatical production and performance on structured language tests in the domains of syntax, verb morphology, and verb-argument structure (VAS). Materials and methods Data from 101 individuals with PPA were included. Participants completed a test battery including the Northwestern Assessment of Verbs and Sentences (NAVS; Thompson, 2011), the Northwestern Assessment of Verb Inflection (NAVI; Lee & Thompson, experimental version) and the Northwestern Anagram Test (NAT; Thompson, Weintraub, & Mesulam, 2012). Grammatical production deficits were quantified as follows: for syntax, accuracy of non-canonical sentence production on the NAVS Sentence Production Priming Test (SPPT) and the NAT; for morphology, the accuracy on finite verbs on the NAVI; for VAS, the accuracy of sentences produced with 2- and 3-argument verbs on the NAVS Argument Structure Production Test (ASPT). Cinderella narrative samples were analyzed using the Northwestern Narrative Language Analysis system (e.g., Thompson et al., 2012). For syntax, complexity was measured by the ratio of syntactically complex to simple sentences produced, whereas accuracy was indexed by computing the proportion of words with a locally grammatical lexical category. Morphological complexity was measured by mean number of verb

  9. Characterizing and Mitigating Work Time Inflation in Task Parallel Programs

    Directory of Open Access Journals (Sweden)

    Stephen L. Olivier

    2013-01-01

    Full Text Available Task parallelism raises the level of abstraction in shared memory parallel programming to simplify the development of complex applications. However, task parallel applications can exhibit poor performance due to thread idleness, scheduling overheads, and work time inflation – additional time spent by threads in a multithreaded computation beyond the time required to perform the same work in a sequential computation. We identify the contributions of each factor to lost efficiency in various task parallel OpenMP applications and diagnose the causes of work time inflation in those applications. Increased data access latency can cause significant work time inflation in NUMA systems. Our locality framework for task parallel OpenMP programs mitigates this cause of work time inflation. Our extensions to the Qthreads library demonstrate that locality-aware scheduling can improve performance up to 3X compared to the Intel OpenMP task scheduler.

  10. Experiment with expert system guidance of an engineering analysis task

    International Nuclear Information System (INIS)

    Ransom, V.H.; Fink, R.K.; Callow, R.A.

    1986-01-01

    An experiment is being conducted in which expert systems are used to guide the performance of an engineering analysis task. The task chosen for experimentation is the application of a large thermal hydraulic transient simulation computer code. The expectation from this work is that the expert system will result in an improved analytical result with a reduction in the amount of human labor and expertise required. The code-associated functions of model formulation, data input, code execution, and analysis of the computed output have all been identified as candidate tasks that could benefit from the use of expert systems. Expert system modules have been built for the model building and data input task. Initial results include the observation that human expectations of an intelligent environment rapidly escalate, and that structured or stylized tasks that are tolerated in the unaided system become frustrating within the intelligent environment.

  11. Investigating the Appropriateness of the TACOM Measure: Application to the Complexity of Proceduralized Tasks for High Speed Train Drivers

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea; Ko, Jong Hyun

    2010-01-01

    According to wide-spread experience in many industries, a procedure is one of the most effective countermeasures to reduce the possibility of human related problems. Unfortunately, a systematic framework to evaluate the complexity of procedural tasks seems to be very scant. For this reason, the TACOM measure, which can quantify the complexity of procedural tasks, has been developed. In this study, the appropriateness of the TACOM measure is investigated by comparing TACOM scores regarding the procedural tasks of high speed train drivers with the associated workload scores measured by the NASA-TLX technique. As a result, it is observed that there is a meaningful correlation between the TACOM scores and the associated NASA-TLX scores. Therefore, it is expected that the TACOM measure can properly quantify the complexity of procedural tasks

  12. Investigating the Appropriateness of the TACOM Measure: Application to the Complexity of Proceduralized Tasks for High Speed Train Drivers

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ko, Jong Hyun [Nuclear Engineering and Technology Institute, Daejeon (Korea, Republic of)

    2010-02-15

    According to wide-spread experience in many industries, a procedure is one of the most effective countermeasures to reduce the possibility of human related problems. Unfortunately, a systematic framework to evaluate the complexity of procedural tasks seems to be very scant. For this reason, the TACOM measure, which can quantify the complexity of procedural tasks, has been developed. In this study, the appropriateness of the TACOM measure is investigated by comparing TACOM scores regarding the procedural tasks of high speed train drivers with the associated workload scores measured by the NASA-TLX technique. As a result, it is observed that there is a meaningful correlation between the TACOM scores and the associated NASA-TLX scores. Therefore, it is expected that the TACOM measure can properly quantify the complexity of procedural tasks

  13. Preliminary results of BRAVO project: brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks.

    Science.gov (United States)

    Bergamasco, Massimo; Frisoli, Antonio; Fontana, Marco; Loconsole, Claudio; Leonardis, Daniele; Troncossi, Marco; Foumashi, Mohammad Mozaffari; Parenti-Castelli, Vincenzo

    2011-01-01

    This paper presents the preliminary results of the project BRAVO (Brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks). The objective of this project is to define a new approach to the development of assistive and rehabilitative robots for motor impaired users to perform complex visuomotor tasks that require a sequence of reaches, grasps and manipulations of objects. BRAVO aims at developing new robotic interfaces and HW/SW architectures for rehabilitation and regain/restoration of motor function in patients with upper limb sensorimotor impairment through extensive rehabilitation therapy and active assistance in the execution of Activities of Daily Living. The final system developed within this project will include a robotic arm exoskeleton and a hand orthosis that will be integrated together for providing force assistance. The main novelty that BRAVO introduces is the control of the robotic assistive device through the active prediction of intention/action. The system will actually integrate the information about the movement carried out by the user with a prediction of the performed action through an interpretation of current gaze of the user (measured through eye-tracking), brain activation (measured through BCI) and force sensor measurements. © 2011 IEEE

  14. Task Balanced Workflow Scheduling Technique considering Task Processing Rate in Spot Market

    Directory of Open Access Journals (Sweden)

    Daeyong Jung

    2014-01-01

    Full Text Available Cloud computing is a computing paradigm that constitutes an advanced computing environment evolved from distributed computing, and it provides acquired computing resources in a pay-as-you-go manner. For example, Amazon EC2 offers Infrastructure-as-a-Service (IaaS) instances in three different ways, with different prices, reliability, and performance. Our study is based on an environment using spot instances. Spot instances can significantly decrease costs compared to reserved and on-demand instances. However, spot instances give a more unreliable environment than other instances. In this paper, we propose a workflow scheduling scheme that reduces the out-of-bid situation. Consequently, the total task completion time is decreased. The simulation results reveal that, compared to various instance types, our scheme achieves performance improvements in terms of an average combined metric of 12.76% over a workflow scheme that does not consider the processing rate. However, the cost in our scheme is higher than an instance with low performance and is lower than an instance with high performance.

  15. Neutron radiography and X-ray computed tomography for quantifying weathering and water uptake processes inside porous limestone used as building material

    International Nuclear Information System (INIS)

    Dewanckele, J.; De Kock, T.; Fronteau, G.; Derluyn, H.; Vontobel, P.; Dierick, M.; Van Hoorebeke, L.; Jacobs, P.; Cnudde, V.

    2014-01-01

    Euville and Savonnières limestones were weathered by an acid test, which resulted in the formation of a gypsum crust. In order to characterize the crystallization pattern and the evolution of the pore structure below the crust, a combination of high resolution X-ray computed tomography and SEM–EDS was used. A time lapse sequence of the changing pore structure in both stones was obtained and afterwards quantified using image analysis. The difference in weathering of both stones by the same process could be explained by the underlying microstructure and texture. Because water and moisture play a crucial role in the weathering processes, water uptake in weathered and non-weathered samples was characterized based on neutron radiography. In this way the water uptake was both visualized and quantified as a function of the height of the sample and as a function of time. In general, the formation of a gypsum crust on limestone slows down the initial water uptake in the materials. - Highlights: • Time lapse sequence in 3D of changing pore structures inside limestone • A combination of X-ray CT, SEM and neutron radiography was used. • Quantification of water content as a function of time, height and weathering • Characterization of weathering processes due to gypsum crystallization

  16. Task-oriented training with computer gaming in people with rheumatoid arthritis or osteoarthritis of the hand: study protocol of a randomized controlled pilot trial.

    Science.gov (United States)

    Srikesavan, Cynthia Swarnalatha; Shay, Barbara; Robinson, David B; Szturm, Tony

    2013-03-09

    Significant restriction in the ability to participate in home, work and community life results from pain, fatigue, joint damage, stiffness and reduced joint range of motion and muscle strength in people with rheumatoid arthritis or osteoarthritis of the hand. With modest evidence on the therapeutic effectiveness of conventional hand exercises, a task-oriented training program via real life object manipulations has been developed for people with arthritis. An innovative, computer-based gaming platform that allows a broad range of common objects to be seamlessly transformed into therapeutic input devices through instrumentation with a motion-sense mouse has also been designed. Personalized objects are selected to target specific training goals such as graded finger mobility, strength, endurance or fine/gross dexterous functions. The movements and object manipulation tasks that replicate common situations in everyday living will then be used to control and play any computer game, making practice challenging and engaging. The ongoing study is a 6-week, single-center, parallel-group, equally allocated and assessor-blinded pilot randomized controlled trial. Thirty people with rheumatoid arthritis or osteoarthritis affecting the hand will be randomized to receive either conventional hand exercises or the task-oriented training. The purpose is to determine a preliminary estimation of therapeutic effectiveness and feasibility of the task-oriented training program. Performance based and self-reported hand function, and exercise compliance are the study outcomes. Changes in outcomes (pre to post intervention) within each group will be assessed by paired Student t test or Wilcoxon signed-rank test and between groups (control versus experimental) post intervention using unpaired Student t test or Mann-Whitney U test. The study findings will inform decisions on the feasibility, safety and completion rate and will also provide preliminary data on the treatment effects of the task
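
    The stated analysis plan maps directly onto standard scipy routines; the sketch below runs it on invented pre/post hand-function scores purely to show the calls involved, not on trial data.

      # Illustration of the stated analysis plan on invented pre/post hand-function
      # scores (higher = better); not trial data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      pre_exp  = rng.normal(50, 8, 15);  post_exp  = pre_exp  + rng.normal(6, 4, 15)
      pre_ctrl = rng.normal(50, 8, 15);  post_ctrl = pre_ctrl + rng.normal(2, 4, 15)

      # Within-group change: paired t-test (or Wilcoxon signed-rank for non-normal data).
      print(stats.ttest_rel(post_exp, pre_exp))
      print(stats.wilcoxon(post_exp, pre_exp))

      # Between-group comparison post intervention: unpaired t-test or Mann-Whitney U.
      print(stats.ttest_ind(post_exp, post_ctrl))
      print(stats.mannwhitneyu(post_exp, post_ctrl))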

  17. Computer Breakdown as a Stress Factor during Task Completion under Time Pressure: Identifying Gender Differences Based on Skin Conductance

    Directory of Open Access Journals (Sweden)

    René Riedl

    2013-01-01

    Full Text Available In today’s society, as computers, the Internet, and mobile phones pervade almost every corner of life, the impact of Information and Communication Technologies (ICT) on humans is dramatic. The use of ICT, however, may also have a negative side. Human interaction with technology may lead to notable stress perceptions, a phenomenon referred to as technostress. An investigation of the literature reveals that computer users’ gender has largely been ignored in technostress research, treating users as “gender-neutral.” To close this significant research gap, we conducted a laboratory experiment in which we investigated users’ physiological reaction to the malfunctioning of technology. Based on theories which explain that men, in contrast to women, are more sensitive to “achievement stress,” we predicted that male users would exhibit higher levels of stress than women in cases of system breakdown during the execution of a human-computer interaction task under time pressure, if compared to a breakdown situation without time pressure. Using skin conductance as a stress indicator, the hypothesis was confirmed. Thus, this study shows that user gender is crucial to better understanding the influence of stress factors such as computer malfunctions on physiological stress reactions.

  18. Affordances and synchronization in collective creative construction tasks

    DEFF Research Database (Denmark)

    Tylén, Kristian; Fusaroli, Riccardo

    What does it mean to cooperate? How do we share meanings and actions in order to reach a common goal? In this paper we explore the relation between cooperative coordination and heart rate. We argue that in cooperative contexts participants synchronize their heart rhythms according to two factors: the affordances of the task at hand and the gradual consolidation of collaborative practices. Six groups of participants were instructed to construct LEGO models of six abstract notions (“responsibility”, “knowledge”, “justice” etc.), both individually and in groups. We combine video analysis and heart rate measurements and employ recurrence analysis techniques to quantify the mutual adaptability of heart rates among the participants in the different tasks. We show that during individual tasks individual heart rates synchronize both within and between groups (but not with controls) plausibly due

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. Classifying and quantifying human error in routine tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Pederson, O.M.; Rasmussen, J.; Carnino, A.; Gagnolet, P.; Griffon, M.; Mancini, G.

    1982-01-01

    This paper results from the work of the OECD/NEA-CSNI Group of Experts on Human Error Data and Assessment. It proposes a classification system (or taxonomy) for use in reporting events involving human malfunction, especially those occurring during the execution of routine tasks. A set of data collection sheets based on this taxonomy has been designed. They include the information needed in order to ensure adequate quality and coherence of the raw data. The sources from which the various data should be obtainable are identified, as are the persons who should analyze them. Improving data collection systems is an iterative process. Therefore Group members are currently making trial applications of the taxonomy to previously analysed real incidents. Results from the initial round of trials are presented and discussed

  1. Quantifying and mapping spatial variability in simulated forest plots

    Science.gov (United States)

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands were developed as regularly-spaced plantations of loblolly pine (Pinus taeda L.). We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  2. Quantifying information leakage of randomized protocols

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Legay, Axel; Malacaria, Pasquale

    2015-01-01

    The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the use of Markovian processes to model deterministic and probabilistic systems. By using a methodology generalizing the lattice of information approach we model refined attackers...... capable of observing the internal behavior of the system, and quantify the information leakage of such systems. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed and non-timed attacks...

  3. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.

  4. A nonlinear dynamics of trunk kinematics during manual lifting tasks.

    Science.gov (United States)

    Khalaf, Tamer; Karwowski, Waldemar; Sapkota, Nabin

    2015-01-01

    Human responses at work may exhibit nonlinear properties where small changes in the initial task conditions can lead to large changes in system behavior. Therefore, it is important to study such nonlinearity to gain a better understanding of human performance under a variety of physical, perceptual, and cognitive task conditions. The main objective of this study was to investigate whether human trunk kinematics data during a manual lifting task exhibit nonlinear behavior in terms of deterministic chaos. Data related to the kinematics of the trunk with respect to the pelvis were collected using the Industrial Lumbar Motion Monitor (ILMM) and analyzed using nonlinear dynamical systems methodology. The nonlinear dynamics quantifiers of Lyapunov exponents and Kaplan-Yorke dimensions were calculated and analyzed under different task conditions. The study showed that human trunk kinematics during manual lifting exhibits chaotic behavior in terms of trunk sagittal angular displacement, velocity and acceleration. The findings support the importance of accounting for nonlinear dynamical properties of biomechanical responses to lifting tasks.
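
    The record above reports Lyapunov exponents computed from trunk kinematics but gives no computational detail. Purely as a rough illustration, the sketch below estimates the largest Lyapunov exponent of a scalar time series with a simplified Rosenstein-style procedure (time-delay embedding followed by nearest-neighbour divergence); the embedding dimension, delay, sampling rate and horizon are placeholder values, not parameters taken from the study.

      import numpy as np

      def largest_lyapunov(x, dim=5, tau=10, fs=60.0, horizon=30):
          # Simplified Rosenstein-style estimate of the largest Lyapunov exponent
          # of a scalar series x (e.g. trunk sagittal angular velocity).
          x = np.asarray(x, dtype=float)
          n = len(x) - (dim - 1) * tau
          emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
          # Pairwise distances; exclude temporally close points as neighbours
          dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
          for i in range(n):
              dist[i, max(0, i - tau):min(n, i + tau + 1)] = np.inf
          nn = dist.argmin(axis=1)
          # Mean log-divergence of neighbour pairs over the prediction horizon
          divergence = []
          for k in range(1, horizon + 1):
              valid = np.where((np.arange(n) + k < n) & (nn + k < n))[0]
              d_k = np.linalg.norm(emb[valid + k] - emb[nn[valid] + k], axis=1)
              divergence.append(np.log(d_k[d_k > 0]).mean())
          # Slope of the divergence curve (per second) approximates lambda_max
          t = np.arange(1, horizon + 1) / fs
          return np.polyfit(t, divergence, 1)[0]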

  5. Quantifying narrative ability in autism spectrum disorder: a computational linguistic analysis of narrative coherence.

    Science.gov (United States)

    Losh, Molly; Gordon, Peter C

    2014-12-01

    Autism is a neurodevelopmental disorder characterized by serious difficulties with the social use of language, along with impaired social functioning and ritualistic/repetitive behaviors (American Psychiatric Association in Diagnostic and statistical manual of mental disorders: DSM-5, 5th edn. American Psychiatric Association, Arlington, 2013). While substantial heterogeneity exists in symptom expression, impairments in language discourse skills, including narrative (or storytelling), are universally observed in autism (Tager-Flusberg et al. in Handbook on autism and pervasive developmental disorders, 3rd edn. Wiley, New York, pp 335-364, 2005). This study applied a computational linguistic tool, Latent Semantic Analysis (LSA), to objectively characterize narrative performance in high-functioning individuals with autism and typically-developing controls, across two different narrative contexts that differ in the interpersonal and cognitive demands placed on the narrator. Results indicated that high-functioning individuals with autism produced narratives comparable in semantic content to those produced by controls when narrating from a picture book, but produced narratives diminished in semantic quality in a more demanding narrative recall task. This pattern is similar to that detected from analyses of hand-coded picture book narratives in prior research, and extends findings to an additional narrative context that proves particularly challenging for individuals with autism. Results are discussed in terms of the utility of LSA as a quantitative, objective, and efficient measure of narrative ability.
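
    The study scored narrative semantic content with Latent Semantic Analysis, but the exact pipeline is not given in this record. As a generic illustration of the technique only, the sketch below builds an LSA space with scikit-learn (TF-IDF followed by truncated SVD) and scores a narrative by its mean cosine similarity to a set of reference narratives; the function name, corpus and component count are hypothetical.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      def lsa_similarity(reference_texts, narrative, n_components=50):
          # Fit TF-IDF + truncated SVD (LSA) on the references plus the narrative
          docs = list(reference_texts) + [narrative]
          tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
          svd = TruncatedSVD(n_components=min(n_components, tfidf.shape[1] - 1))
          vectors = svd.fit_transform(tfidf)
          # Semantic-content score: mean cosine similarity to the reference set
          return cosine_similarity(vectors[-1:], vectors[:-1]).mean()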

  6. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    Science.gov (United States)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  7. Dashboard task monitor for managing ATLAS user analysis on the grid

    International Nuclear Information System (INIS)

    Sargsyan, L; Andreeva, J; Karavakis, E; Saiz, P; Tuckett, D; Jha, M; Kokoszkiewicz, L; Schovancova, J

    2014-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  8. A comparative critical analysis of modern task-parallel runtimes.

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Kyle Bruce; Stark, Dylan; Murphy, Richard C.

    2012-12-01

    The rise in node-level parallelism has increased interest in task-based parallel runtimes for a wide array of application areas. Applications have a wide variety of task spawning patterns which frequently change during the course of application execution, based on the algorithm or solver kernel in use. Task scheduling and load balance regimes, however, are often highly optimized for specific patterns. This paper uses four basic task spawning patterns to quantify the impact of specific scheduling policy decisions on execution time. We compare the behavior of six publicly available tasking runtimes: Intel Cilk, Intel Threading Building Blocks (TBB), Intel OpenMP, GCC OpenMP, Qthreads, and High Performance ParalleX (HPX). With the exception of Qthreads, the runtimes prove to have schedulers that are highly sensitive to application structure. No runtime is able to provide the best performance in all cases, and those that do provide the best performance in some cases, unfortunately, provide extremely poor performance when application structure does not match the scheduler's assumptions.
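
    The abstract refers to task spawning patterns without showing what such patterns look like. The toy sketch below is not taken from the paper and uses Python's standard concurrent.futures pool rather than any of the runtimes compared; it merely contrasts two simple patterns (flat spawning versus wave-by-wave spawning) whose scheduling behaviour differs. Task counts, wave width and the compute kernel are arbitrary.

      import concurrent.futures as cf
      import time

      def kernel(n=20000):
          # Stand-in compute kernel for one task
          return sum(i * i for i in range(n))

      def flat_spawn(pool, ntasks):
          # Flat pattern: one parent spawns every task up front
          return [pool.submit(kernel) for _ in range(ntasks)]

      def wave_spawn(pool, ntasks, width=8):
          # Wave pattern: tasks are released in successive bounded waves,
          # as in iterative solvers whose spawning changes from phase to phase
          done = []
          for start in range(0, ntasks, width):
              wave = [pool.submit(kernel) for _ in range(min(width, ntasks - start))]
              cf.wait(wave)
              done.extend(wave)
          return done

      if __name__ == "__main__":
          with cf.ProcessPoolExecutor(max_workers=4) as pool:
              for spawn in (flat_spawn, wave_spawn):
                  t0 = time.perf_counter()
                  cf.wait(spawn(pool, 64))
                  print(f"{spawn.__name__}: {time.perf_counter() - t0:.3f} s")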

  9. 49 CFR 236.1043 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... classroom, simulator, computer-based, hands-on, or other formally structured training designed to impart the... 49 Transportation 4 2010-10-01 2010-10-01 false Task analysis and basic requirements. 236.1043...

  10. Psychology of computer use: XXXII. Computer screen-savers as distractors.

    Science.gov (United States)

    Volk, F A; Halcomb, C G

    1994-12-01

    The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.

  11. Quantifying antiviral activity optimizes drug combinations against hepatitis C virus infection

    Energy Technology Data Exchange (ETDEWEB)

    Koizumi, Yoshiki [School of Medicine, College of Medical, Pharmaceutical and Health Sciences, Kanazawa University, Ishikawa, Japan]; Nakajima, Syo [Department of Virology II, National Institute of Infectious Diseases, Tokyo, Japan; Department of Applied Biological Sciences, Faculty of Science and Technology, Tokyo University of Sciences, Chiba, Japan]; Ohashi, Hirofumi [Department of Virology II, National Institute of Infectious Diseases, Tokyo, Japan; Department of Applied Biological Sciences, Faculty of Science and Technology, Tokyo University of Sciences, Chiba, Japan]; Tanaka, Yasuhito [Department of Virology and Liver Unit, Nagoya City University Graduate School of Medicinal Sciences, Nagoya, Japan]; Wakita, Takaji [Department of Virology II, National Institute of Infectious Diseases, Tokyo, Japan]; Perelson, Alan S. [Los Alamos National Laboratory]; Iwami, Shingo [Department of Biology, Faculty of Sciences, Kyushu University, Fukuoka, Japan; PRESTO, JST, Saitama, Japan; CREST, JST, Saitama, Japan]; Watashi, Koichi [Department of Virology II, National Institute of Infectious Diseases, Tokyo, Japan; Department of Applied Biological Sciences, Faculty of Science and Technology, Tokyo University of Sciences, Chiba, Japan]

    2016-03-21

    A cell culture study combining a mathematical model and computer simulation quantifies anti-hepatitis C virus drug efficacy at any concentration and in any combination in preclinical settings, and can obtain rich basic evidence for selecting optimal treatments prior to costly clinical trials.

  12. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation

  13. Classification effects of real and imaginary movement selective attention tasks on a P300-based brain-computer interface

    Science.gov (United States)

    Salvaris, Mathew; Sepulveda, Francisco

    2010-10-01

    Brain-computer interfaces (BCIs) rely on various electroencephalography methodologies that allow the user to convey their desired control to the machine. Common approaches include the use of event-related potentials (ERPs) such as the P300 and modulation of the beta and mu rhythms. All of these methods have their benefits and drawbacks. In this paper, three different selective attention tasks were tested in conjunction with a P300-based protocol (i.e. the standard counting of target stimuli as well as the conduction of real and imaginary movements in sync with the target stimuli). The three tasks were performed by a total of 10 participants, with the majority (7 out of 10) of the participants having never before participated in imaginary movement BCI experiments. Channels and methods used were optimized for the P300 ERP and no sensory-motor rhythms were explicitly used. The classifier used was a simple Fisher's linear discriminant. Results were encouraging, showing that on average the imaginary movement achieved a P300 versus No-P300 classification accuracy of 84.53%. In comparison, mental counting, the standard selective attention task used in previous studies, achieved 78.9% and real movement 90.3%. Furthermore, multiple trial classification results were recorded and compared, with real movement reaching 99.5% accuracy after four trials (12.8 s), imaginary movement reaching 99.5% accuracy after five trials (16 s) and counting reaching 98.2% accuracy after ten trials (32 s).
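
    The classifier named in this record is a simple Fisher linear discriminant applied to P300 versus no-P300 epochs. A minimal sketch of that kind of single-trial classification, assuming pre-cut EEG epochs and using scikit-learn's LDA implementation, is shown below; the feature extraction (decimated, concatenated channel waveforms) and the even/odd train-test split are illustrative choices, not the paper's exact pipeline.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      def p300_classification_accuracy(epochs, labels):
          # epochs: (n_trials, n_channels, n_samples) EEG segments time-locked
          # to the stimuli; labels: 1 = target (P300 expected), 0 = non-target
          feats = epochs[:, :, ::4].reshape(len(epochs), -1)  # crude features
          lda = LinearDiscriminantAnalysis()                  # Fisher's LDA
          lda.fit(feats[::2], labels[::2])                    # train on even trials
          return lda.score(feats[1::2], labels[1::2])         # test on odd trials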

  14. A computational study of whole-brain connectivity in resting state and task fMRI

    Science.gov (United States)

    Goparaju, Balaji; Rana, Kunjan D.; Calabro, Finnegan J.; Vaina, Lucia Maria

    2014-01-01

    Background We compared the functional brain connectivity produced during resting-state in which subjects were not actively engaged in a task with that produced while they actively performed a visual motion task (task-state). Material/Methods In this paper we employed graph-theoretical measures and network statistics in novel ways to compare, in the same group of human subjects, functional brain connectivity during resting-state fMRI with brain connectivity during performance of a high level visual task. We performed a whole-brain connectivity analysis to compare network statistics in resting and task states among anatomically defined Brodmann areas to investigate how brain networks spanning the cortex changed when subjects were engaged in task performance. Results In the resting state, we found strong connectivity among the posterior cingulate cortex (PCC), precuneus, medial prefrontal cortex (MPFC), lateral parietal cortex, and hippocampal formation, consistent with previous reports of the default mode network (DMN). The connections among these areas were strengthened while subjects actively performed an event-related visual motion task, indicating a continued and strong engagement of the DMN during task processing. Regional measures such as degree (number of connections) and betweenness centrality (number of shortest paths), showed that task performance induces stronger inter-regional connections, leading to a denser processing network, but that this does not imply a more efficient system as shown by the integration measures such as path length and global efficiency, and from global measures such as small-worldness. Conclusions In spite of the maintenance of connectivity and the “hub-like” behavior of areas, our results suggest that the network paths may be rerouted when performing the task condition. PMID:24947491
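
    For readers unfamiliar with the graph-theoretical measures mentioned (degree, betweenness centrality, path length, global efficiency, clustering), the sketch below computes them with NetworkX from a thresholded region-by-region correlation matrix. The threshold and binarisation step are arbitrary illustrations, not the parcellation or statistics used in the study, and the path-length call assumes the thresholded graph is connected.

      import numpy as np
      import networkx as nx

      def connectivity_measures(corr, threshold=0.3):
          # corr: (n_regions, n_regions) functional correlation matrix
          adj = (np.abs(corr) > threshold) & ~np.eye(len(corr), dtype=bool)
          g = nx.from_numpy_array(adj.astype(int))
          return {
              "degree": dict(g.degree()),
              "betweenness": nx.betweenness_centrality(g),
              # assumes the graph is connected; otherwise restrict to a component
              "char_path_length": nx.average_shortest_path_length(g),
              "global_efficiency": nx.global_efficiency(g),
              "mean_clustering": nx.average_clustering(g),
          }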

  15. Resistance to change and resurgence in humans engaging in a computer task.

    Science.gov (United States)

    Kuroda, Toshikazu; Cançado, Carlos R X; Podlesnik, Christopher A

    2016-04-01

    The relation between persistence, as measured by resistance to change, and resurgence has been examined with nonhuman animals but not systematically with humans. The present study examined persistence and resurgence with undergraduate students engaging in a computer task for points exchangeable for money. In Phase 1, a target response was maintained on a multiple variable-interval (VI) 15-s (Rich) VI 60-s (Lean) schedule of reinforcement. In Phase 2, the target response was extinguished while an alternative response was reinforced at equal rates in both schedule components. In Phase 3, the target and the alternative responses were extinguished. In an additional test of persistence (Phase 4), target responding was reestablished as in Phase 1 and then disrupted by access to videos in both schedule components. In Phases 2 and 4, target responding was more persistent in the Rich than in the Lean component. Also, resurgence generally was greater in the Rich than in the Lean component in Phase 3. The present findings with humans extend the generality of those obtained with nonhuman animals showing that higher reinforcement rates produce both greater persistence and resurgence, and suggest that common processes underlie response persistence and relapse. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Information criteria for quantifying loss of reversibility in parallelized KMC

    Energy Technology Data Exchange (ETDEWEB)

    Gourgoulias, Konstantinos, E-mail: gourgoul@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Rey-Bellet, Luc, E-mail: luc@math.umass.edu

    2017-01-01

    Parallel Kinetic Monte Carlo (KMC) is a potent tool to simulate stochastic particle systems efficiently. However, despite literature on quantifying domain decomposition errors of the particle system for this class of algorithms in the short and in the long time regime, no study yet explores and quantifies the loss of time-reversibility in Parallel KMC. Inspired by concepts from non-equilibrium statistical mechanics, we propose the entropy production per unit time, or entropy production rate, given in terms of an observable and a corresponding estimator, as a metric that quantifies the loss of reversibility. Typically, this is a quantity that cannot be computed explicitly for Parallel KMC, which is why we develop a posteriori estimators that have good scaling properties with respect to the size of the system. Through these estimators, we can connect the different parameters of the scheme, such as the communication time step of the parallelization, the choice of the domain decomposition, and the computational schedule, with its performance in controlling the loss of reversibility. From this point of view, the entropy production rate can be seen both as an information criterion to compare the reversibility of different parallel schemes and as a tool to diagnose reversibility issues with a particular scheme. As a demonstration, we use Sandia Lab's SPPARKS software to compare different parallelization schemes and different domain (lattice) decompositions.
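
    The record defines the entropy production rate only verbally. One common empirical estimator, which is not necessarily the a posteriori estimator developed in the paper, computes it from the asymmetry of observed transition counts along a trajectory; a detailed-balance (reversible) process gives a value near zero. The sketch below illustrates that estimator with generic state labels and time normalisation.

      import numpy as np
      from collections import Counter

      def entropy_production_rate(states, total_time):
          # states: sequence of visited discrete states; total_time: trajectory length
          counts = Counter(zip(states[:-1], states[1:]))
          sigma = 0.0
          for (i, j), c_ij in counts.items():
              # Count each unordered pair once; skip one-way pairs (formally infinite)
              if (j, i) in counts and (i, j) < (j, i):
                  c_ji = counts[(j, i)]
                  sigma += (c_ij - c_ji) * np.log(c_ij / c_ji)
          return sigma / total_time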

  17. Selective Inhibition and Naming Performance in Semantic Blocking, Picture-Word Interference, and Color-Word Stroop Tasks

    Science.gov (United States)

    Shao, Zeshu; Roelofs, Ardi; Martin, Randi C.; Meyer, Antje S.

    2015-01-01

    In 2 studies, we examined whether explicit distractors are necessary and sufficient to evoke selective inhibition in 3 naming tasks: the semantic blocking, picture-word interference, and color-word Stroop task. Delta plots were used to quantify the size of the interference effects as a function of reaction time (RT). Selective inhibition was…
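
    The record mentions delta plots without defining them. In the usual construction, reaction times from each condition are split into quantile bins and the condition difference (the interference effect) is plotted against the mean RT of each bin. The sketch below follows that standard recipe; the bin count and variable names are illustrative and not taken from the study.

      import numpy as np

      def delta_plot(rt_congruent, rt_incongruent, n_bins=5):
          # Returns (mean RT per quantile bin, interference effect per bin)
          qs = np.linspace(0, 100, n_bins + 1)

          def bin_means(rt):
              rt = np.sort(np.asarray(rt, dtype=float))
              edges = np.percentile(rt, qs)
              return np.array([rt[(rt >= edges[k]) & (rt <= edges[k + 1])].mean()
                               for k in range(n_bins)])

          con, inc = bin_means(rt_congruent), bin_means(rt_incongruent)
          return (con + inc) / 2, inc - con   # x = bin mean RT, y = effect size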

  18. A study on quantification of the information flow and effectiveness of information aids for diagnosis tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    2004-02-01

    ... terms of time-to-task completion. As a study on human performance modeling, this paper evaluates the effectiveness of information aiding types based on the operator strategies for diagnosis tasks at NPPs, especially for fault identification. To evaluate the effectiveness, four aiding types which are available for supporting MCR operators' diagnosis were selected for the experiment: no aid, alarm, hypothesis with certainty factor, and hypothesis with certainty factor + expected symptom patterns. The main features of the aiding types were elicited from typical direct operator support systems for MCR operators. An experiment was conducted with 24 graduate students and subject performances were analyzed according to the strategies the subjects used in problem-solving. The experimental results show that the effect of the information aiding types on subject performance can change according to subject strategy. Finally, as an analysis of operator support systems for MCR operators, this paper suggests a method for the quantitative evaluation of NPP decision support systems (DSSs). In this approach, the dynamic aspects of DSSs are first defined. Then, the hierarchical structure of the evaluation criteria for the dynamic aspects of DSSs is provided. For quantitative evaluation, the relative weights of the criteria are computed using the analytic hierarchy process (AHP) to obtain and aggregate the priority of the components. The criteria at the lowest level are quantified by simple numerical expressions and questionnaires which are developed to describe the characteristics of the criteria. Finally, in order to demonstrate the feasibility of this proposition, one case study is performed for the fault diagnosis module of OASYS™ (On-Line Operator Aid SYStem for Nuclear Power Plant), which is an operator support system developed at KAIST. In conclusion, this paper describes the practical implications for the design of MCR operator support systems.
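
    The report above computes criteria weights with the analytic hierarchy process (AHP) but does not show the computation. As a generic illustration, not the report's actual criteria or judgements, the sketch below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio.

      import numpy as np

      def ahp_weights(pairwise):
          # pairwise: (n, n) reciprocal matrix of Saaty-scale (1-9) judgements
          a = np.asarray(pairwise, dtype=float)
          n = a.shape[0]
          eigvals, eigvecs = np.linalg.eig(a)
          k = int(np.argmax(eigvals.real))
          weights = np.abs(eigvecs[:, k].real)
          weights /= weights.sum()                    # principal eigenvector
          ci = (eigvals[k].real - n) / (n - 1)        # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}.get(n, 1.41)
          return weights, ci / ri                     # ratio should stay below ~0.1

      # Hypothetical example: three evaluation criteria compared pairwise
      w, cr = ahp_weights([[1, 3, 5],
                           [1/3, 1, 2],
                           [1/5, 1/2, 1]])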

  19. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in the future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  20. Ge(Li) data reduction using small computers

    Science.gov (United States)

    Mcdermott, W. E.

    1972-01-01

    The advantages and limitations of using a small computer to analyze Ge(Li) radiation spectra are studied. The computer has to: (1) find the spectrum peaks, (2) determine the count rate in the photopeaks, and (3) relate the count rate to known gamma transitions to find the amount of each radionuclide present. Results show that tasks one and two may be done by the computer but task three must be done by an experimenter or a larger computer.
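
    Tasks (1) and (2) of the abstract, locating photopeaks and estimating their count rates, map naturally onto standard peak-finding routines. The sketch below is a present-day illustration with SciPy rather than anything resembling the 1972 implementation; the prominence threshold and the linear background subtraction are simplistic placeholder choices, and nuclide identification (task 3) is left to the analyst, as in the paper.

      import numpy as np
      from scipy.signal import find_peaks

      def photopeak_rates(counts, live_time, prominence=50):
          # counts: gamma spectrum (counts per channel); live_time in seconds
          counts = np.asarray(counts, dtype=float)
          peaks, props = find_peaks(counts, prominence=prominence, width=2)
          rates = {}
          for p, lo, hi in zip(peaks, props["left_bases"], props["right_bases"]):
              gross = counts[lo:hi + 1].sum()
              # Linear background estimated from the channels at the peak bases
              background = 0.5 * (counts[lo] + counts[hi]) * (hi - lo + 1)
              rates[int(p)] = max(gross - background, 0.0) / live_time
          return rates   # photopeak channel -> net count rate (counts/s)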

  1. Partitioning the Metabolic Cost of Human Running: A Task-by-Task Approach

    Science.gov (United States)

    Arellano, Christopher J.; Kram, Rodger

    2014-01-01

    Compared with other species, humans can be very tractable and thus an ideal “model system” for investigating the metabolic cost of locomotion. Here, we review the biomechanical basis for the metabolic cost of running. Running has been historically modeled as a simple spring-mass system whereby the leg acts as a linear spring, storing, and returning elastic potential energy during stance. However, if running can be modeled as a simple spring-mass system with the underlying assumption of perfect elastic energy storage and return, why does running incur a metabolic cost at all? In 1980, Taylor et al. proposed the “cost of generating force” hypothesis, which was based on the idea that elastic structures allow the muscles to transform metabolic energy into force, and not necessarily mechanical work. In 1990, Kram and Taylor then provided a more explicit and quantitative explanation by demonstrating that the rate of metabolic energy consumption is proportional to body weight and inversely proportional to the time of foot-ground contact for a variety of animals ranging in size and running speed. With a focus on humans, Kram and his colleagues then adopted a task-by-task approach and initially found that the metabolic cost of running could be “individually” partitioned into body weight support (74%), propulsion (37%), and leg-swing (20%). Summing all these biomechanical tasks leads to a paradoxical overestimation of 131%. To further elucidate the possible interactions between these tasks, later studies quantified the reductions in metabolic cost in response to synergistic combinations of body weight support, aiding horizontal forces, and leg-swing-assist forces. This synergistic approach revealed that the interactive nature of body weight support and forward propulsion comprises ∼80% of the net metabolic cost of running. The task of leg-swing at most comprises ∼7% of the net metabolic cost of running and is independent of body weight support and forward

  2. Quantifying Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Quantifying Matter explains how scientists learned to measure matter and quantify some of its most fascinating and useful properties. It presents many of the most important intellectual achievements and technical developments that led to the scientific interpretation of substance. Complete with full-color photographs, this exciting new volume describes the basic characteristics and properties of matter. Chapters include: Exploring the Nature of Matter; The Origin of Matter; The Search for Substance; Quantifying Matter During the Scientific Revolution; Understanding Matter's Electromagnet...

  3. IGT-Open: An open-source, computerized version of the Iowa Gambling Task.

    Science.gov (United States)

    Dancy, Christopher L; Ritter, Frank E

    2017-06-01

    The Iowa Gambling Task (IGT) is commonly used to understand the processes involved in decision-making. Though the task was originally run without a computer, using a computerized version of the task has become typical. These computerized versions of the IGT are useful, because they can make the task more standardized across studies and allow for the task to be used in environments where a physical version of the task may be difficult or impossible to use (e.g., while collecting brain imaging data). Though these computerized versions of the IGT have been useful for experimentation, having multiple software implementations of the task could present reliability issues. We present an open-source software version of the Iowa Gambling Task (called IGT-Open) that allows for millisecond visual presentation accuracy and is freely available to be used and modified. This software has been used to collect data from human subjects and also has been used to run model-based simulations with computational process models developed to run in the ACT-R architecture.

  4. A hybrid approach to quantify software reliability in nuclear safety systems

    International Nuclear Information System (INIS)

    Arun Babu, P.; Senthil Kumar, C.; Murali, N.

    2012-01-01

    Highlights: ► A novel method to quantify software reliability using software verification and mutation testing in nuclear safety systems. ► Contributing factors that influence the software reliability estimate. ► An approach to help regulators verify the reliability of safety-critical software systems during the software licensing process. -- Abstract: Technological advancements have led to the use of computer-based systems in safety-critical applications. As computer-based systems are being introduced in nuclear power plants, effective and efficient methods are needed to ensure dependability and compliance with the high reliability requirements of systems important to safety. Even after several years of research, quantification of software reliability remains a controversial and unresolved issue. Also, existing approaches have assumptions and limitations which are not acceptable for safety applications. This paper proposes a theoretical approach combining software verification and mutation testing to quantify software reliability in nuclear safety systems. The theoretical results obtained suggest that software reliability depends on three factors: the test adequacy, the amount of software verification carried out, and the reusability of verified code in the software. The proposed approach may help regulators in licensing computer-based safety systems in nuclear reactors.

  5. Description of the tasks of control room operators in German nuclear power plants and support possibilities by advanced computer systems

    International Nuclear Information System (INIS)

    Buettner, W.E.

    1984-01-01

    In the course of the development of nuclear power plants, the instrumentation and control systems and the information in the control room have increased substantially. Against this background, the paper describes which operator tasks might be supported by advanced computer aid systems, with the main emphasis on safety-related information and diagnosis facilities. Nevertheless, some of these systems under development may be helpful for normal operation modes too. As far as possible, recommendations for the realization and testing of such systems are made. (orig.) [de

  6. Effects of musicianship and experimental task on perceptual segmentation

    DEFF Research Database (Denmark)

    Hartmann, Martin; Lartillot, Olivier; Toiviainen, Petri

    2015-01-01

    The perceptual structure of music is a fundamental issue in music psychology that can be systematically addressed via computational models. This study estimated the contribution of spectral, rhythmic and tonal descriptors for prediction of perceptual segmentation across stimuli. In a real-time task, 18 musicians and 18 non-musicians indicated perceived instants of significant change for six ongoing musical stimuli. In a second task, 18 musicians parsed the same stimuli using audio editing software to provide non-real-time segmentation annotations. We built computational models based on a non-linear fuzzy integration of basic and interaction descriptors of local musical novelty. We found that musicianship of listeners and segmentation task had an effect on model prediction rate, dimensionality and components. Changes in tonality and rhythm, as well as simultaneous change of these aspects were...

  7. Does the medium matter? The interaction of task type and technology on group performance and member reactions.

    Science.gov (United States)

    Straus, S G; McGrath, J E

    1994-02-01

    The authors investigated the hypothesis that as group tasks pose greater requirements for member interdependence, communication media that transmit more social context cues will foster group performance and satisfaction. Seventy-two 3-person groups of undergraduate students worked in either computer-mediated or face-to-face meetings on 3 tasks with increasing levels of interdependence: an idea-generation task, an intellective task, and a judgment task. Results showed few differences between computer-mediated and face-to-face groups in the quality of the work completed but large differences in productivity favoring face-to-face groups. Analysis of productivity and of members' reactions supported the predicted interaction of tasks and media, with greater discrepancies between media conditions for tasks requiring higher levels of coordination. Results are discussed in terms of the implications of using computer-mediated communications systems for group work.

  8. Quantifying Human Performance of a Dynamic Military Target Detection Task: An Application of the Theory of Signal Detection.

    Science.gov (United States)

    1995-06-01

    applied to analyze numerous experimental tasks (Macmillan and Creelman, 1991). One of these tasks, target detection, is the subject of this research. In ... between each associated pair of false alarm rate and hit rate z-scores is d' for the bias level associated with the pairing (Macmillan and Creelman, 1991) ... unequal variance in normal distributions (Macmillan and Creelman, 1991). It is described in detail for the interested reader by Green and ...
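
    The snippet above invokes the signal detection quantity d' without stating the formula. In the equal-variance Gaussian model, d' is the difference between the z-transformed hit rate and false-alarm rate; the sketch below computes it that way, with a standard log-linear correction (an added convention, not something specified in this report) to avoid infinite z-scores at rates of 0 or 1.

      from scipy.stats import norm

      def d_prime(hits, misses, false_alarms, correct_rejections):
          # d' = z(hit rate) - z(false-alarm rate), equal-variance Gaussian model
          hit_rate = (hits + 0.5) / (hits + misses + 1)            # log-linear
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      # Example: 40 hits, 10 misses, 12 false alarms, 38 correct rejections
      print(d_prime(40, 10, 12, 38))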

  9. Controlling a tactile ERP-BCI in a dual-task

    NARCIS (Netherlands)

    Thurlings, M.E.; Erp, J.B.F. van; Brouwer, A.M.; Werkhoven, P.J.

    2013-01-01

    When using brain–computer interfaces (BCIs) to control a game, the BCI may have to compete with gaming tasks for the same perceptual and cognitive resources. We investigated: 1) if and to what extent event-related potentials (ERPs) and ERP–BCI performance are affected in a dual-task situation; and 2)

  10. Building a cluster computer for the computing grid of tomorrow

    International Nuclear Information System (INIS)

    Wezel, J. van; Marten, H.

    2004-01-01

    The Grid Computing Centre Karlsruhe takes part in the development, testing and deployment of hardware and cluster infrastructure, grid computing middleware, and applications for particle physics. The construction of a large cluster computer with thousands of nodes and several PB of data storage capacity is a major task and focus of research. CERN-based accelerator experiments will use GridKa, one of only 8 worldwide Tier-1 computing centers, for their huge computing demands. Computing and storage are already provided for several other running physics experiments on the exponentially expanding cluster. (orig.)

  11. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over the network and the widespread availability of software for design and pre-production in mechanical engineering have led to a situation in which, at present, large industrial enterprises and small engineering companies implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key models of research, but the system-wide problems of efficient distribution (balancing) of the computational load and the accommodation of input, intermediate and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of compute nodes, and the selection of a node to which a user's request is transferred in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing an optimal schedule in a distributed system that dynamically changes its infrastructure is an important task.
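
    The abstract describes the balancing system only as monitoring node load and selecting a node by a predetermined algorithm. One of the simplest such algorithms, shown below purely as an illustration and not as the policy simulated in the paper, is greedy least-loaded dispatch backed by a min-heap; the node names and task costs are made up.

      import heapq

      class LeastLoadedBalancer:
          """Dispatch each incoming task to the node with the smallest current load."""

          def __init__(self, node_ids):
              self.heap = [(0.0, n) for n in node_ids]   # (load, node), load starts at 0
              heapq.heapify(self.heap)

          def dispatch(self, task_cost):
              load, node = heapq.heappop(self.heap)      # current least-loaded node
              heapq.heappush(self.heap, (load + task_cost, node))
              return node

      balancer = LeastLoadedBalancer(["node-1", "node-2", "node-3"])
      assignments = [balancer.dispatch(cost) for cost in (4, 2, 7, 1, 3)]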

  12. Gait stability and variability measures show effects of impaired cognition and dual tasking in frail people

    Directory of Open Access Journals (Sweden)

    de Vries Oscar J

    2011-01-01

    Full Text Available Abstract Background Falls in frail elderly are a common problem with a rising incidence. Gait and postural instability are major risk factors for falling, particularly in geriatric patients. As walking requires attention, cognitive impairments are likely to contribute to an increased fall risk. An objective quantification of gait and balance ability is required to identify persons with a high tendency to fall. Recent studies have shown that stride variability is increased in the elderly and under dual-task conditions and might be more sensitive for detecting fall risk than walking speed. In the present study we complemented stride-related measures with measures that quantify trunk movement patterns as indicators of dynamic balance ability during walking. The aim of the study was to quantify the effect of impaired cognition and dual tasking on gait variability and stability in geriatric patients. Methods Thirteen elderly with dementia (mean age: 82.6 ± 4.3 years) and thirteen without dementia (79.4 ± 5.55), recruited from a geriatric day clinic, walked at self-selected speed with and without performing a verbal dual task. The Mini Mental State Examination and the Seven Minute Screen were administered. Trunk accelerations were measured with an accelerometer. In addition to walking speed, mean, and variability of stride times, gait stability was quantified using stochastic dynamical measures, namely regularity (sample entropy), long range correlations and local stability exponents of trunk accelerations. Results Dual tasking significantly (p ... Conclusions The observed trunk adaptations were a consistent instability factor. These results support the concept that changes in cognitive functions contribute to changes in the variability and stability of the gait pattern. Walking under dual task conditions and quantifying gait using dynamical parameters can improve detecting walking disorders and might help to identify those elderly who are able to adapt walking...
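
    Among the stochastic dynamical measures listed, sample entropy is the regularity measure most easily illustrated in a few lines. The sketch below is a textbook-style implementation (template length m, tolerance r as a fraction of the signal standard deviation) and is not the study's code; it is O(n²) in memory and only suitable for short signals.

      import numpy as np

      def sample_entropy(signal, m=2, r=0.2):
          # Lower values = more regular (predictable) signal, e.g. trunk acceleration
          x = np.asarray(signal, dtype=float)
          tol = r * x.std()

          def matches(length):
              tpl = np.array([x[i:i + length] for i in range(len(x) - length)])
              d = np.max(np.abs(tpl[:, None, :] - tpl[None, :, :]), axis=2)
              return (d[np.triu_indices(len(tpl), k=1)] <= tol).sum()

          b, a = matches(m), matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else float("inf")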

  13. Deep learning for multi-task plant phenotyping

    OpenAIRE

    Pound, Michael P.; Atkinson, Jonathan A.; Wells, Darren M.; Pridmore, Tony P.; French, Andrew P.

    2017-01-01

    Plant phenotyping has continued to pose a challenge to computer vision for many years. There is a particular demand to accurately quantify images of crops, and the natural variability and structure of these plants presents unique difficulties. Recently, machine learning approaches have shown impressive results in many areas of computer vision, but these rely on large datasets that are at present not available for crops. We present a new dataset, called ACID, that provides hundreds of accurate...

  14. The data acquisition tasks for TASSO. User manual

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1981-07-01

    The TASSO Data Acquisition System is designed to accept events from various trigger sources at rates of several Hz and to transfer these events to the central DESY IBM Triplex for later offline analysis. A NORD 10S computer is used for this purpose with several data acquisition tasks running under control of a Supervisor task DASSO that is described elsewhere. (author)

  15. Computational Modeling of Human Multiple-Task Performance and Mental Workload

    National Research Council Canada - National Science Library

    Meyer, David

    2004-01-01

    ... (Executive-Process/Interactive Control) was developed, applied to several types of tasks to accurately represent human performance, and inspired the collection of new data that cast new light on the scientific analysis of key phenomena...

  16. Task parameters affecting ergonomic demands and productivity of HVAC duct installation.

    Science.gov (United States)

    Mitropoulos, Panagiotis; Hussain, Sanaa; Guarascio-Howard, Linda; Memarian, Babak

    2014-01-01

    Mechanical installation workers experience work-related musculoskeletal disorders (WMSDs) at high rates. The objectives of this study were to (1) quantify the ergonomic demands during HVAC installation, (2) identify the tasks and task parameters that generated extreme ergonomic demands, and (3) propose improvements to reduce WMSDs among mechanical workers. The study focused on installation of rectangular ductwork components using ladders, and analyzed five operations by two mechanical contractors. Using continuous time observational assessment, the videotaped operations were analyzed along two dimensions: (1) the production tasks and durations, and (2) the ergonomic demands for four body regions (neck, arms/shoulders, back, and knees). The analysis identified tasks with a low portion of productive time and a high portion of extreme postures, and task parameters that generated extreme postures. Duct alignment was the task with the highest portion of extreme postures. The position of the ladder (angle and distance from the duct) was a task parameter that strongly influenced the extreme postures for the back, neck and shoulders. Other contributing factors included the difficulty of reaching hand tools while working on the ladder, the congestion of components in the ceiling, and the space between the duct and the ceiling. The identified tasks and factors provide directions for improvement.

  17. Task-Oriented Spoken Dialog System for Second-Language Learning

    Science.gov (United States)

    Kwon, Oh-Woog; Kim, Young-Kil; Lee, Yunkeun

    2016-01-01

    This paper introduces a Dialog-Based Computer Assisted second-Language Learning (DB-CALL) system using task-oriented dialogue processing technology. The system promotes dialogue with a second-language learner for a specific task, such as purchasing tour tickets, ordering food, passing through immigration, etc. The dialog system plays a role of a…

  18. Quantifying the Cumulative Impact of Differences in Care on Prostate Cancer Outcomes

    National Research Council Canada - National Science Library

    Fesinmeyer, Megan

    2007-01-01

    ... the continuum of care contribute to disparity. The second layer of this proposal is the development of a computer model that integrates the complex patterns of care and differences by race identified in the first phase in order to quantify...

  19. A neuronal model of a global workspace in effortful cognitive tasks.

    Science.gov (United States)

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  20. CQPSO scheduling algorithm for heterogeneous multi-core DAG task model

    Science.gov (United States)

    Zhai, Wenzheng; Hu, Yue-Li; Ran, Feng

    2017-07-01

    Efficient task scheduling is critical for achieving high performance in a heterogeneous multi-core computing environment. The paper focuses on the heterogeneous multi-core directed acyclic graph (DAG) task model and proposes a novel task scheduling method based on an improved chaotic quantum-behaved particle swarm optimization (CQPSO) algorithm. A task priority scheduling list was built. The processor with the minimum cumulative earliest finish time (EFT) was selected as the target of the first task assignment. The task precedence relationships were satisfied and the total execution time of all tasks was minimized. The experimental results show that the proposed algorithm has the advantages of strong optimization ability, simplicity and feasibility, and fast convergence, and can be applied to task scheduling optimization for other heterogeneous and distributed environments.
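
    The abstract builds a priority list and assigns each task to the processor giving the earliest finish time, which is the core of list scheduling for DAG task models. The sketch below shows only that greedy EFT step, simplified to homogeneous processors with no communication costs, and does not attempt the paper's CQPSO optimization; the toy DAG and costs are invented.

      def eft_list_schedule(order, deps, cost, n_procs=2):
          # order: task ids in priority (topological) order
          # deps: task -> list of predecessors; cost: task -> execution time
          proc_free = [0.0] * n_procs       # time each processor becomes free
          finish, placement = {}, {}
          for t in order:
              ready = max((finish[p] for p in deps.get(t, [])), default=0.0)
              eft = [max(ready, proc_free[q]) + cost[t] for q in range(n_procs)]
              q = min(range(n_procs), key=eft.__getitem__)   # earliest finish time
              placement[t], finish[t] = q, eft[q]
              proc_free[q] = eft[q]
          return placement, max(finish.values())             # schedule and makespan

      # Toy DAG: A -> (B, C) -> D
      plan, makespan = eft_list_schedule(
          ["A", "B", "C", "D"],
          {"B": ["A"], "C": ["A"], "D": ["B", "C"]},
          {"A": 2, "B": 3, "C": 1, "D": 2})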

  1. Use of computer games as an intervention for stroke.

    Science.gov (United States)

    Proffitt, Rachel M; Alankus, Gazihan; Kelleher, Caitlin L; Engsberg, Jack R

    2011-01-01

    Current rehabilitation for persons with hemiparesis after stroke requires high numbers of repetitions to be in accordance with contemporary motor learning principles. The motivational characteristics of computer games can be harnessed to create engaging interventions for persons with hemiparesis after stroke that incorporate this high number of repetitions. The purpose of this case report was to test the feasibility of using computer games as a 6-week home therapy intervention to improve upper extremity function for a person with stroke. One person with left upper extremity hemiparesis after stroke participated in a 6-week home therapy computer game intervention. The games were customized to her preferences and abilities and modified weekly. Her performance was tracked and analyzed. Data from pre-, mid-, and postintervention testing using standard upper extremity measures and the Reaching Performance Scale (RPS) were analyzed. After 3 weeks, the participant demonstrated increased upper extremity range of motion at the shoulder and decreased compensatory trunk movements during reaching tasks. After 6 weeks, she showed functional gains in activities of daily living (ADLs) and instrumental ADLs despite no further improvements on the RPS. Results indicate that computer games have the potential to be a useful intervention for people with stroke. Future work will add additional support to quantify the effectiveness of the games as a home therapy intervention for persons with stroke.

  2. Quantifying uncertainties in the structural response of SSME blades

    Science.gov (United States)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer and no real experiments were conducted. The structural response has been evaluated in terms of three variables which are natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response. Uncertainties in material properties have insignificant effects.

  3. On the development of a computer-based handwriting assessment tool to objectively quantify handwriting proficiency in children.

    Science.gov (United States)

    Falk, Tiago H; Tam, Cynthia; Schellnus, Heidi; Chau, Tom

    2011-12-01

    Standardized writing assessments such as the Minnesota Handwriting Assessment (MHA) can inform interventions for handwriting difficulties, which are prevalent among school-aged children. However, these tests usually involve the laborious task of subjectively rating the legibility of the written product, precluding their practical use in some clinical and educational settings. This study describes a portable computer-based handwriting assessment tool to objectively measure MHA quality scores and to detect handwriting difficulties in children. Several measures are proposed based on spatial, temporal, and grip force measurements obtained from a custom-built handwriting instrument. Thirty-five first and second grade students participated in the study, nine of whom exhibited handwriting difficulties. Students performed the MHA test and were subjectively scored based on speed and handwriting quality using five primitives: legibility, form, alignment, size, and space. Several spatial parameters are shown to correlate significantly (p < …) with handwriting legibility and speed, respectively. Using only size and space parameters, promising discrimination between proficient and non-proficient handwriting can be achieved. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  4. Low-order non-spatial effects dominate second-order spatial effects in the texture quantifier analysis of 18F-FDG-PET images.

    Directory of Open Access Journals (Sweden)

    Frank J Brooks

    Full Text Available There is increasing interest in applying image texture quantifiers to assess the intra-tumor heterogeneity observed in FDG-PET images of various cancers. Use of these quantifiers as prognostic indicators of disease outcome and/or treatment response has yielded inconsistent results. We study the general applicability of some well-established texture quantifiers to the image data unique to FDG-PET. We first created computer-simulated test images with statistical properties consistent with clinical image data for cancers of the uterine cervix. We specifically isolated second-order statistical effects from low-order effects and analyzed the resulting variation in common texture quantifiers in response to contrived image variations. We then analyzed the quantifiers computed for FIGO IIb cervical cancers via receiver operating characteristic (ROC) curves and via contingency table analysis of detrended quantifier values. We found that image texture quantifiers depend strongly on low-order effects such as tumor volume and SUV distribution. When low-order effects are controlled, the image texture quantifiers tested were not able to discern only the second-order effects. Furthermore, the results of clinical tumor heterogeneity studies might be tunable via the choice of patient population analyzed. Some image texture quantifiers are strongly affected by factors distinct from the second-order effects researchers ostensibly seek to assess via those quantifiers.
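
    The abstract refers to well-established texture quantifiers without naming them; one widely used family is grey-level co-occurrence matrix (GLCM) features, shown below with scikit-image purely as an example of the kind of second-order quantifier discussed (the study's actual feature set is not specified in this record). The grey-level quantization step, which strongly influences the resulting values, is an explicit parameter here; the function names follow recent scikit-image releases.

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      def glcm_texture(suv_slice, levels=32):
          # Quantize SUVs to a fixed number of grey levels before building the GLCM
          edges = np.linspace(suv_slice.min(), suv_slice.max(), levels)
          img = (np.digitize(suv_slice, edges) - 1).clip(0, levels - 1).astype(np.uint8)
          glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                              levels=levels, symmetric=True, normed=True)
          return {prop: float(graycoprops(glcm, prop).mean())
                  for prop in ("contrast", "homogeneity", "energy", "correlation")}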

  5. Assessing air medical crew real-time readiness to perform critical tasks.

    Science.gov (United States)

    Braude, Darren; Goldsmith, Timothy; Weiss, Steven J

    2011-01-01

    Air medical transport has had problems with its safety record, attributed in part to human error. Flight crew members (FCMs) must be able to focus on critical safety tasks in the context of a stressful environment. Flight crew members' cognitive readiness (CR) to perform their jobs may be affected by sleep deprivation, personal problems, high workload, and use of alcohol and drugs. The current study investigated the feasibility of using a computer-based cognitive task to assess FCMs' readiness to perform their job. The FCMs completed a short questionnaire to evaluate their physiologic and psychological state at the beginning and end of each shift. The FCMs then performed 3 minutes of a computer-based cognitive task called synthetic work environment (SYNWIN test battery). Task performance was compared with the questionnaire variables using correlation and regression analysis. Differences between the beginning and end of each shift were matched and compared using a paired Students t test. SYNWIN performance was significantly worse at the end of a shift compared with the beginning of the shift (p = 0.028) primarily because of decrement in the memory component. The SYNWIN composite scores were negatively correlated to degree of irritability felt by the participant, both before (r = -0.25) and after (r = -0.34) a shift and were significantly correlated with amount of sleep (0.22), rest (0.30), and life satisfaction (0.30). Performance by FCMs on a simple, rapid, computer-based psychological test correlates well with self-reported sleep, rest, life satisfaction, and irritability. Although further studies are warranted, these findings suggest that assessment of the performance of FCMs on a simple, rapid, computer-based, multitasking battery is feasible as an approach to determine their readiness to perform critical safety tasks through the SYNWIN task battery.

  6. How Important is Conflict Detection to the Conflict Resolution Task?

    Science.gov (United States)

    Mercer, Joey; Gabets, Cynthia; Gomez, Ashley; Edwards, Tamsyn; Bienert, Nancy; Claudatos, Lauren; Homola, Jeffrey R.

    2016-01-01

    To determine the capabilities and limitations of human operators and automation in separation assurance roles, the second of three Human-in-the-Loop (HITL) part-task studies investigates air traffic controllers' ability to detect and resolve conflicts under varying task sets, traffic densities, and run lengths. Operations remained within a single sector, staffed by a single controller, and explored, among other things, the controllers' conflict resolution performance in conditions with or without their involvement in the conflict detection task. Whereas comparisons of conflict resolution performance between these two conditions are available in a prior publication, this paper explores whether or not other subjective measures display a relationship to that data. Analyses of controller workload and situation awareness measures attempt to quantify their contribution to controllers' ability to resolve traffic conflicts.

  7. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2006-11-01

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. Subsequently, as the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially if human operators have to carry out their tasks in a very stressful environment. That is, good procedures are helpful not only in enhancing the performance of human operators but also in reducing the possibility of human error, because they stipulate the detailed tasks to be done by human operators. Ironically, it has been emphasized that the performance of human operators could be impaired due to complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that properly evaluates the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity) that can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs has been developed. In this report, a technical background as well as practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since the results of these validation studies show that the TACOM measure seems to properly quantify the complexity of emergency tasks, it is desirable that the TACOM measure play an important role in improving the performance of human operators.

  8. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea

    2006-11-15

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. As the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially when human operators have to carry out their tasks in a very stressful environment: good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating the detailed tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, a prerequisite is a systematic framework that properly evaluates the complexity of the tasks described in procedures. For this reason, a measure called TACOM (Task Complexity) that can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs has been developed. In this report, the technical background as well as practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since the validation studies show that the TACOM measure appears to properly quantify the complexity of emergency tasks, it is expected that the TACOM measure can play an important role in improving the performance of human operators.

  9. Letter and Colour Matching Tasks: Parametric Measures of Developmental Working Memory Capacity

    Directory of Open Access Journals (Sweden)

    Tamara L. Powell

    2014-01-01

    We investigated the mediating role of interference in developmental assessments of working memory (WM) capacity across childhood, adolescence, and young adulthood. One hundred and forty-two participants completed two versions of visuospatial (colour matching task, CMT) and verbal (letter matching task, LMT) WM tasks, which systematically varied cognitive load in a high and a low interference condition. Results showed similar developmental trajectories across high interference contexts (CMT- and LMT-Complex) and divergent developmental growth patterns across low interference contexts (CMT- and LMT-Simple). Performance on tasks requiring greater cognitive control was in closer agreement with developmental predictions relative to simple recall-guided tasks that rely solely on the storage components of WM. These findings suggest that developmental WM capacity, as measured by the CMT and LMT paradigms, can be better quantified using high interference contexts in both content domains, and demonstrate steady increases in WM through to mid-adolescence.

  10. Convolutional Deep Belief Networks for Single-Cell/Object Tracking in Computational Biology and Computer Vision

    OpenAIRE

    Zhong, Bineng; Pan, Shengnan; Zhang, Hongbo; Wang, Tian; Du, Jixiang; Chen, Duansheng; Cao, Liujuan

    2016-01-01

    In this paper, we propose a deep architecture to dynamically learn the most discriminative features from data for both single-cell and object tracking in computational biology and computer vision. Firstly, the discriminative features are automatically learned via a convolutional deep belief network (CDBN). Secondly, we design a simple yet effective method to transfer features learned from CDBNs on the source tasks for generic purposes to the object tracking tasks using only a limited amount of tra...

  11. Task based synthesis of serial manipulators

    Directory of Open Access Journals (Sweden)

    Sarosh Patel

    2015-05-01

    Computing the optimal geometric structure of manipulators is one of the most intricate problems in contemporary robot kinematics. Robotic manipulators are designed and built to perform certain predetermined tasks, and there is a very close relationship between the structure of the manipulator and its kinematic performance. It is therefore important to incorporate such task requirements during the design and synthesis of the robotic manipulators. Such task requirements and performance constraints can be specified in terms of the required end-effector positions, orientations and velocities along the task trajectory. In this work, we present a comprehensive method to develop the optimal geometric structure (DH parameters) of a non-redundant six-degree-of-freedom serial manipulator from task descriptions. We define, develop and test a methodology to design optimal manipulator configurations based on task descriptions. This methodology investigates all possible manipulator configurations that can satisfy the task performance requirements under the imposed joint constraints. Out of all the possible structures, those that can reach all the task points with the required orientations are selected. Next, these candidate structures are tested to see whether they can attain end-effector velocities in arbitrary directions within the user-defined joint constraints, so that they can deliver the best kinematic performance. Additionally, the least power-consuming configurations are identified.
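
    The paper's own synthesis procedure is not given in the abstract; the sketch below only illustrates the kind of reachability screen it describes, in which a candidate DH structure is checked against task points by sampling joint configurations and evaluating forward kinematics. The candidate link parameters, joint limits, sample count and tolerance are all assumptions, and orientation and velocity requirements are omitted.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.,       sa,       ca,      d],
                     [0.,       0.,       0.,     1.]])

def end_effector(dh_rows, joint_angles):
    """Forward kinematics: chain the per-joint DH transforms, return position."""
    T = np.eye(4)
    for (d, a, alpha), theta in zip(dh_rows, joint_angles):
        T = T @ dh_transform(theta, d, a, alpha)
    return T[:3, 3]

def reaches(dh_rows, targets, tol=0.03, samples=20000, seed=0):
    """Crude reachability screen: random joint sampling within +/- pi limits."""
    rng = np.random.default_rng(seed)
    remaining = [np.asarray(t, float) for t in targets]
    for _ in range(samples):
        q = rng.uniform(-np.pi, np.pi, size=len(dh_rows))
        p = end_effector(dh_rows, q)
        remaining = [t for t in remaining if np.linalg.norm(p - t) > tol]
        if not remaining:
            return True
    return False

# Hypothetical 3R candidate structure (d, a, alpha per joint) and task points.
candidate = [(0.0, 0.3, 0.0), (0.0, 0.25, 0.0), (0.0, 0.15, 0.0)]
print(reaches(candidate, [(0.4, 0.2, 0.0), (0.1, 0.5, 0.0)]))
```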

  12. Cartoon computation: quantum-like computing without quantum mechanics

    International Nuclear Information System (INIS)

    Aerts, Diederik; Czachor, Marek

    2007-01-01

    We present a computational framework based on geometric structures. No quantum mechanics is involved, and yet the algorithms perform tasks analogous to quantum computation. Tensor products and entangled states are not needed; they are replaced by sets of basic shapes. To test the formalism we solve in geometric terms the Deutsch-Jozsa problem, historically the first example that demonstrated the potential power of quantum computation. Each step of the algorithm has a clear geometric interpretation and allows for a cartoon representation. (fast track communication)

  13. PACS 2000: quality control using the task allocation chart

    Science.gov (United States)

    Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.

    2000-05-01

    Medical imaging's technological evolution in the next century will continue to include Picture Archive and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium, with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, Quality Control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.

  14. Does spinal excitability scale to the difficulty of the dual-task?

    Science.gov (United States)

    Day, Devon M; Boivin, Mario T; Adkin, Allan L; Tokuno, Craig D

    2017-08-01

    This study examined whether spinal excitability, as measured by the soleus Hoffmann reflex (H-reflex), is scaled to the difficulty level of the dual-task being performed. Twenty-two participants completed a combination of three balance task and three secondary cognitive (visuo-motor) task difficulty levels for a total of nine dual-task conditions. An additional eight participants were tested while performing the same three balance task difficulty levels on their own (i.e., single-tasking). The balance task required participants to maintain their balance on a fixed or rotating stabilometer, while the visuo-motor task required participants to respond to moving targets presented on a monitor. Throughout each single- and dual-task trial, H-reflexes were elicited from the soleus. Although dual-task performance, as quantified by visuo-motor task accuracy as well as the root mean square of the stabilometer position and velocity, decreased by 10-34% with increasing dual-task difficulty (p < 0.05), soleus H-reflex amplitudes did not differ across the dual-task conditions (p = 0.483-0.758). This contrasts with the single-task condition, where the H-reflex amplitude decreased by ~25% from the easy to the hard balance task difficulty level (p = 0.037). In contrast to the commonly reported finding of a reduced soleus H-reflex amplitude when individuals perform a less posturally stable task by itself, the results indicate that spinal excitability is not modulated as a function of dual-task difficulty. It is possible that when an individual's attentional resource capacity is exceeded during dual-tasking, they become ineffective in regulating spinal excitability for balance control.

  15. A queueing model of pilot decision making in a multi-task flight management situation

    Science.gov (United States)

    Walden, R. S.; Rouse, W. B.

    1977-01-01

    Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.
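
    The authors' queueing model and its parameters are not described in the abstract; as a minimal illustration of the queueing abstraction itself, the sketch below simulates flight-management subtasks arriving at random and waiting for a single decision-maker, and reports the mean waiting time. The arrival and service rates are arbitrary stand-ins.

```python
import random

def simulate_mm1(arrival_rate, service_rate, horizon, seed=1):
    """Minimal single-server queue: tasks (e.g., flight subtasks) arrive at
    random and wait until the pilot/'server' is free; returns mean waiting time."""
    random.seed(seed)
    next_arrival = random.expovariate(arrival_rate)
    server_free_at, waits = 0.0, []
    while next_arrival < horizon:
        t = next_arrival
        start = max(t, server_free_at)          # wait if the pilot is busy
        waits.append(start - t)
        server_free_at = start + random.expovariate(service_rate)
        next_arrival = t + random.expovariate(arrival_rate)
    return sum(waits) / len(waits) if waits else 0.0

print(f"mean wait: {simulate_mm1(arrival_rate=0.8, service_rate=1.0, horizon=10_000):.2f}")
```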

  16. Project Scheduling Heuristics-Based Standard PSO for Task-Resource Assignment in Heterogeneous Grid

    OpenAIRE

    Chen, Ruey-Maw; Wang, Chuin-Mu

    2011-01-01

    The task scheduling problem has been widely studied for assigning resources to tasks in heterogeneous grid environments. Effective task scheduling is an important issue for the performance of grid computing. Meanwhile, the task scheduling problem is an NP-complete problem. Hence, this investigation introduces a so-called “standard” particle swarm optimization (PSO) metaheuristic approach to efficiently solve the task scheduling problems in grid. Meanwhile, two promising heuristics based on multimo...

  17. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention

    OpenAIRE

    Frère Annie F; Silva Alessandro P

    2011-01-01

    Background: Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and the testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on the performance of people in a fun and interactive way, being appropriate for the target audience. Methods: An interactive c...

  18. Robot Task Commander with Extensible Programming Environment

    Science.gov (United States)

    Hart, Stephen W (Inventor); Yamokoski, John D. (Inventor); Wightman, Brian J (Inventor); Dinh, Duy Paul (Inventor); Gooding, Dustin R (Inventor)

    2014-01-01

    A system for developing distributed robot application-level software includes a robot having an associated control module which controls motion of the robot in response to a commanded task, and a robot task commander (RTC) in networked communication with the control module over a network transport layer (NTL). The RTC includes a script engine(s) and a GUI, with a processor and a centralized library of library blocks constructed from an interpretive computer programming code and having input and output connections. The GUI provides access to a Visual Programming Language (VPL) environment and a text editor. In executing a method, the VPL is opened, a task for the robot is built from the code library blocks, and data is assigned to input and output connections identifying input and output data for each block. A task sequence(s) is sent to the control module(s) over the NTL to command execution of the task.

  19. Elementary School Students' Strategic Learning: Does Task-Type Matter?

    Science.gov (United States)

    Malmberg, Jonna; Järvelä, Sanna; Kirschner, Paul A.

    2014-01-01

    This study investigated what types of learning patterns and strategies elementary school students use to carry out ill- and well-structured tasks. Specifically, it was investigated which and when learning patterns actually emerge with respect to students' task solutions. The present study uses computer log file traces to investigate how…

  20. Geo-information processing service composition for concurrent tasks: A QoS-aware game theory approach

    Science.gov (United States)

    Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong

    2012-10-01

    Typical characteristics of remote sensing applications are concurrent tasks, such as those found in disaster rapid response. The existing approach to composing geographical information processing service chains searches for an optimised solution for each chain in isolation, in what can be deemed a "selfish" way. This leads to conflicts amongst concurrent tasks and decreases the performance of all service chains. In this study, a non-cooperative game-based mathematical model to analyse the competitive relationships between tasks is proposed. A best response function is used to assure that each task maintains utility optimisation by considering the composition strategies of other tasks and quantifying conflicts between tasks. Based on this, an iterative algorithm that converges to a Nash equilibrium is presented, the aim being to provide good convergence and maximise the utilisation of all tasks under concurrent task conditions. Theoretical analyses and experiments showed that the newly proposed method, when compared to existing service composition methods, has better practical utility in all tasks.
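
    The paper's QoS model is not given in the abstract; the sketch below only illustrates the iterated best-response idea on a toy congestion game in which each concurrent task picks one of two processing services and its cost grows with how many tasks share that service. The service costs and task count are assumptions.

```python
# Each of three concurrent tasks chooses one of two processing services.
# A task's cost grows with how many tasks share its chosen service
# (a toy congestion game standing in for the paper's QoS model).
SERVICES = [0, 1]
BASE_COST = {0: 1.0, 1: 1.5}

def cost(task_choice, others):
    load = 1 + sum(1 for o in others if o == task_choice)
    return BASE_COST[task_choice] * load

def best_response(others):
    return min(SERVICES, key=lambda s: cost(s, others))

def iterate_to_equilibrium(n_tasks=3, max_rounds=50):
    choices = [0] * n_tasks
    for _ in range(max_rounds):
        changed = False
        for i in range(n_tasks):
            others = choices[:i] + choices[i + 1:]
            br = best_response(others)
            if br != choices[i]:
                choices[i], changed = br, True
        if not changed:               # no task wants to deviate: Nash equilibrium
            return choices
    return choices

print(iterate_to_equilibrium())
```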

  1. A Task-driven Grammar Refactoring Algorithm

    Directory of Open Access Journals (Sweden)

    Ivan Halupka

    2012-01-01

    This paper presents our proposal and the implementation of an algorithm for automated refactoring of context-free grammars. Rather than operating under some domain-specific task, in our approach refactoring is performed on the basis of a refactoring task defined by its user. The algorithm and the corresponding refactoring system are called mARTINICA. mARTINICA is able to refactor grammars of arbitrary size and structural complexity. However, the computation time needed to perform a refactoring task with the desired outcome is highly dependent on the size of the grammar. Until now, we have successfully performed refactoring tasks on small and medium-size grammars of Pascal-like languages and parts of the Algol-60 programming language grammar. This paper also briefly introduces the reader to the processes occurring in grammar refactoring, a method for describing the desired properties that a refactored grammar should fulfill, and a discussion of the overall significance of grammar refactoring.

  2. Project Scheduling Heuristics-Based Standard PSO for Task-Resource Assignment in Heterogeneous Grid

    Directory of Open Access Journals (Sweden)

    Ruey-Maw Chen

    2011-01-01

    The task scheduling problem has been widely studied for assigning resources to tasks in heterogeneous grid environments. Effective task scheduling is an important issue for the performance of grid computing. Meanwhile, the task scheduling problem is an NP-complete problem. Hence, this investigation introduces a so-called “standard” particle swarm optimization (PSO) metaheuristic approach to efficiently solve the task scheduling problems in grid. In addition, two promising heuristics based on multimode project scheduling are proposed to help in solving interesting scheduling problems: the best performance resource heuristic and the latest finish time heuristic. These two heuristics are applied to the PSO scheme to speed up the particles' search and improve the capability of finding a sound schedule. Moreover, both a global communication topology and a local ring communication topology are investigated for an efficient study of the proposed scheme. Simulation results demonstrate that the proposed approach can successfully solve the task-resource assignment problems in grid computing and similar scheduling problems.
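
    The paper's “standard” PSO variant and its two heuristics are not specified in the abstract; the sketch below is a generic continuous PSO that rounds particle positions to task-to-resource assignments and minimises the makespan, just to make the encoding concrete. The task lengths, resource speeds and PSO coefficients are made up.

```python
import numpy as np

def makespan(assignment, task_len, speed):
    """Completion time of the busiest resource for a given assignment."""
    finish = np.zeros(len(speed))
    for task, res in enumerate(assignment):
        finish[res] += task_len[task] / speed[res]
    return finish.max()

def pso_schedule(task_len, speed, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n_tasks, n_res = len(task_len), len(speed)
    x = rng.uniform(0, n_res, (n_particles, n_tasks))      # continuous positions
    v = np.zeros_like(x)
    decode = lambda p: np.clip(p.astype(int), 0, n_res - 1)  # position -> assignment
    pbest = x.copy()
    pbest_f = np.array([makespan(decode(p), task_len, speed) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, 0, n_res - 1e-9)
        f = np.array([makespan(decode(p), task_len, speed) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return decode(g), pbest_f.min()

tasks = np.array([4.0, 2.0, 7.0, 3.0, 5.0, 1.0])   # hypothetical task lengths
speeds = np.array([1.0, 2.0, 1.5])                  # hypothetical resource speeds
print(pso_schedule(tasks, speeds))
```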

  3. Comparison of multi-objective evolutionary approaches for task ...

    Indian Academy of Sciences (India)

    evaluated using standard metrics. Experimental results and performance measures infer that NSGA-II produces quality schedules compared to NSPSO.

  4. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective

    Directory of Open Access Journals (Sweden)

    Arthur M. Jacobs

    2017-12-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA), and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.

  5. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective.

    Science.gov (United States)

    Jacobs, Arthur M

    2017-01-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA), and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.
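
    The eight QNA features are not listed in the abstract; the sketch below shows the general shape of such a pipeline with a few made-up surface features standing in for them and a small decision tree classifier from scikit-learn. The word list, labels and features are illustrative only.

```python
from sklearn.tree import DecisionTreeClassifier

def surface_features(word):
    """Toy stand-ins for QNA word features: length, vowel ratio,
    rare-letter count, and whether the word ends in a vowel."""
    vowels = set("aeiou")
    return [len(word),
            sum(c in vowels for c in word) / len(word),
            sum(c in "qxzj" for c in word),
            int(word[-1] in vowels)]

# Hypothetical training data: words with beauty labels (1 = beautiful).
words  = ["melody", "azure", "sludge", "grumpy", "aurora", "crust"]
labels = [1, 1, 0, 0, 1, 0]

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit([surface_features(w) for w in words], labels)
print(clf.predict([surface_features("lagoon")]))
```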

  6. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
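
    As a concrete reading of the idea of computing Nash equilibria as global minima of a nonnegative function, the sketch below builds a regret-style objective for a 2x2 bimatrix game and minimises it with SciPy's differential evolution. The game, the parameterisation of the mixed strategies and the solver settings are assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Payoff matrices of a 2x2 bimatrix game (Matching Pennies style, illustrative).
A = np.array([[1., -1.], [-1., 1.]])    # row player's payoffs
B = -A                                   # column player's payoffs

def regret(z):
    """Nonnegative function that vanishes exactly at a Nash equilibrium."""
    x, y = z[:2], z[2:]
    x, y = x / x.sum(), y / y.sum()               # normalise to mixed strategies
    rx = np.maximum(A @ y - x @ A @ y, 0.0)       # row player's incentives to deviate
    ry = np.maximum(x @ B - x @ B @ y, 0.0)       # column player's incentives
    return float(rx @ rx + ry @ ry)

res = differential_evolution(regret, bounds=[(1e-6, 1.0)] * 4, seed=0, tol=1e-12)
x, y = res.x[:2], res.x[2:]
print("row strategy:", x / x.sum(), "col strategy:", y / y.sum(), "regret:", res.fun)
```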

  7. Task Classification Based Energy-Aware Consolidation in Clouds

    Directory of Open Access Journals (Sweden)

    HeeSeok Choi

    2016-01-01

    We consider a cloud data center, in which the service provider supplies virtual machines (VMs) on hosts or physical machines (PMs) to its subscribers for computation in an on-demand fashion. For the cloud data center, we propose a task consolidation algorithm based on task classification (i.e., computation-intensive and data-intensive) and resource utilization (e.g., CPU and RAM). Furthermore, we design a VM consolidation algorithm to balance task execution time and energy consumption without violating a predefined service level agreement (SLA). Unlike the existing research on VM consolidation or scheduling that applies no threshold or a single-threshold scheme, we focus on a double-threshold (upper and lower) scheme for VM consolidation. More specifically, when a host operates with resource utilization below the lower threshold, all the VMs on the host will be scheduled to be migrated to other hosts and then the host will be powered down, while when a host operates with resource utilization above the upper threshold, a VM will be migrated to avoid using 100% of the resource. Based on experimental performance evaluations with real-world traces, we prove that our task classification based energy-aware consolidation algorithm (TCEA) achieves a significant energy reduction without incurring predefined SLA violations.
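
    The following sketch illustrates only the double-threshold rule summarised in the abstract (below the lower threshold: migrate everything and power the host down; above the upper threshold: migrate one VM), not the full TCEA algorithm. The threshold values and the choice to migrate the heaviest VM are assumptions.

```python
LOWER, UPPER = 0.2, 0.8   # utilisation thresholds (illustrative values)

def consolidation_action(host_util, vm_utils):
    """Decide what to do with a host given its utilisation and its VMs' loads."""
    if host_util < LOWER:
        # Under-utilised: migrate every VM away, then power the host down.
        return {"migrate": list(range(len(vm_utils))), "power_down": True}
    if host_util > UPPER:
        # Over-utilised: migrate one VM (here, the heaviest) to relieve pressure.
        heaviest = max(range(len(vm_utils)), key=vm_utils.__getitem__)
        return {"migrate": [heaviest], "power_down": False}
    return {"migrate": [], "power_down": False}   # within band: leave alone

print(consolidation_action(0.15, [0.05, 0.10]))
print(consolidation_action(0.92, [0.30, 0.45, 0.17]))
```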

  8. Evaluating the Efficacy of the Cloud for Cluster Computation

    Science.gov (United States)

    Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom

    2012-01-01

    Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Cloud Compute (EC2) that promises improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.

  9. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the Deutsche Forschungsgemeinschaft (DFG) approved the Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program. Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance. Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges. Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  10. Real-time scheduling of software tasks

    International Nuclear Information System (INIS)

    Hoff, L.T.

    1995-01-01

    When designing real-time systems, it is often desirable to schedule execution of software tasks based on the occurrence of events. The events may be clock ticks, interrupts from a hardware device, or software signals from other software tasks. If the nature of the events is well understood, this scheduling is normally a static part of the system design. If the nature of the events is not completely understood, or is expected to change over time, it may be necessary to provide a mechanism for adjusting the scheduling of the software tasks. RHIC front-end computers (FECs) provide such a mechanism. The goals in designing this mechanism were to be as independent as possible of the underlying operating system, to allow for future expansion of the mechanism to handle new types of events, and to allow easy configuration. Some considerations which steered the design were programming paradigm (object oriented vs. procedural), programming language, and whether events are merely interesting moments in time, or whether they intrinsically have data associated with them. The design also needed to address performance and robustness tradeoffs involving shared task contexts, task priorities, and use of interrupt service routine (ISR) contexts vs. task contexts. This paper will explore these considerations and tradeoffs.

  11. Real-time multi-task operators support system

    International Nuclear Information System (INIS)

    Wang He; Peng Minjun; Wang Hao; Cheng Shouyu

    2005-01-01

    The development of computer software and hardware technology and information processing, as well as the accumulation of design experience and feedback from Nuclear Power Plant (NPP) operation, created a good opportunity to develop an integrated Operator Support System. The Real-time Multi-task Operator Support System (RMOSS) has been built to support the operator's decision making process during normal and abnormal operations. RMOSS consists of five subtasks: the Data Collection and Validation Task (DCVT), Operation Monitoring Task (OMT), Fault Diagnostic Task (FDT), Operation Guideline Task (OGT) and Human Machine Interface Task (HMIT). RMOSS uses a rule-based expert system and an Artificial Neural Network (ANN). The rule-based expert system is used to identify predefined events in static conditions and track the operation guideline through data processing. In dynamic conditions, a Back-Propagation Neural Network, trained with a Genetic Algorithm, is adopted for fault diagnosis. The embedded real-time operating system VxWorks and its integrated environment Tornado II are used for RMOSS software cross-development. VxGUI is used to design the HMI. All of the task programs are written in the C language. The task tests and function evaluation of RMOSS have been done in a real-time full scope simulator. Evaluation results show that each task of RMOSS is capable of accomplishing its functions. (authors)

  12. The dependence of human reliability upon task information content

    International Nuclear Information System (INIS)

    Hermanson, E.M.; Golay, M.W.

    1994-09-01

    The role of human error in safety mishaps is an important factor in system design. As systems become increasingly complex, the capacity of the human to deal with the added complexity is diminished. It is therefore crucial to understand the relationship between system complexity and human reliability so that systems may be built in such a way as to minimize human error. One way of understanding this relationship is to quantify system complexity and then measure the human reaction in response to situations of varying complexity. The quantification of system complexity may be performed by determining the information content present in the tasks that the human must execute. The purpose of this work is therefore to build and perform a consistent experiment which will determine the extent to which human reliability depends upon task information content. Two main conclusions may be drawn from this work. The first is that human reliability depends upon task information content. Specifically, as the information content contained in a task increases, the capacity of a human to deal successfully with the task decreases monotonically. Here the definition of total success is the ability to complete the task at hand fully and correctly. Furthermore, there exists a value of information content below which a human can deal with the task successfully, but above which the success of an individual decreases monotonically with increasing information. These ideas should be generalizable to any model where system complexity can be clearly and consistently defined.
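
    The report's exact definition of task information content is not given in the abstract; as a simple surrogate, the sketch below sums the Shannon information (log2 of the number of equally likely options) over the decision points of a task, which is one common way to quantify such content. The example step counts are hypothetical.

```python
import math

def task_information_content(choices_per_step):
    """Bits of information a task conveys if each step presents a set of
    equally likely options (a simple Shannon-style surrogate, not the
    report's exact measure)."""
    return sum(math.log2(n) for n in choices_per_step if n > 0)

simple_task  = [2, 2, 2]          # three binary decisions -> 3 bits
complex_task = [8, 4, 16, 2]      # more options per step   -> 10 bits
print(task_information_content(simple_task), task_information_content(complex_task))
```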

  13. A method to quantify mechanobiologic forces during zebrafish cardiac development using 4-D light sheet imaging and computational modeling.

    Directory of Open Access Journals (Sweden)

    Vijay Vedula

    2017-10-01

    Blood flow and mechanical forces in the ventricle are implicated in cardiac development and trabeculation. However, the mechanisms of mechanotransduction remain elusive. This is due in part to the challenges associated with accurately quantifying mechanical forces in the developing heart. We present a novel computational framework to simulate cardiac hemodynamics in developing zebrafish embryos by coupling 4-D light sheet imaging with a stabilized finite element flow solver, and extract time-dependent mechanical stimuli data. We employ deformable image registration methods to segment the motion of the ventricle from high resolution 4-D light sheet image data. This results in a robust and efficient workflow, as segmentation need only be performed at one cardiac phase, while wall position in the other cardiac phases is found by image registration. Ventricular hemodynamics are then quantified by numerically solving the Navier-Stokes equations in the moving wall domain with our validated flow solver. We demonstrate the applicability of the workflow in wild type zebrafish and three treated fish types that disrupt trabeculation: (a) chemical treatment using AG1478, an ErbB2 signaling inhibitor that inhibits proliferation and differentiation of cardiac trabeculation; (b) injection of gata1a morpholino oligomer (gata1aMO), suppressing hematopoiesis and resulting in attenuated trabeculation; (c) the weak-atrium m58 mutant (wea), with inhibited atrial contraction leading to a highly undeveloped ventricle and poor cardiac function. Our simulations reveal elevated wall shear stress (WSS) in wild type and AG1478 compared to gata1aMO and wea. High oscillatory shear index (OSI) in the grooves between trabeculae, compared to lower values on the ridges, in the wild type suggests oscillatory forces as a possible regulatory mechanism of cardiac trabeculation development. The framework has broad applicability for future cardiac developmental studies focused on quantitatively
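
    The flow solver itself is not reproduced here; the sketch below only shows how the two reported wall metrics, time-averaged wall shear stress (WSS) and the oscillatory shear index (OSI), are conventionally computed from a time series of WSS vectors at a single wall point. The synthetic shear history is made up for illustration.

```python
import numpy as np

def wss_metrics(tau):
    """tau: (n_timesteps, 3) wall shear stress vectors at one wall point over a
    cardiac cycle (uniform time steps). Returns time-averaged WSS magnitude and
    OSI = 0.5 * (1 - |time-mean vector| / time-mean magnitude)."""
    mean_vec_mag = np.linalg.norm(tau.mean(axis=0))
    mean_mag = np.linalg.norm(tau, axis=1).mean()
    return mean_mag, 0.5 * (1.0 - mean_vec_mag / mean_mag)

# Hypothetical shear history: mostly unidirectional with a reversing component.
t = np.linspace(0, 1, 200)
tau = np.column_stack([1.0 + 0.5 * np.sin(2 * np.pi * t),
                       0.3 * np.sin(4 * np.pi * t),
                       np.zeros_like(t)])
print(wss_metrics(tau))
```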

  14. Can the learning of laparoscopic skills be quantified by the measurements of skill parameters performed in a virtual reality simulator?

    Directory of Open Access Journals (Sweden)

    Natascha Silva Sandy

    2013-06-01

    Purpose: To ensure patient safety and surgical efficiency, much emphasis has been placed on the training of laparoscopic skills using virtual reality simulators. The purpose of this study was to determine whether laparoscopic skills can be objectively quantified by measuring specific skill parameters during training in a virtual reality surgical simulator (VRSS). Materials and Methods: Ten medical students (with no laparoscopic experience) and ten urology residents (PGY3-5, with limited laparoscopic experience) were recruited to participate in a ten-week training course in basic laparoscopic skills (camera, cutting, peg transfer and clipping skills) on a VRSS. Data were collected from the training sessions. The time that individuals took to complete each task and the errors that they made were analyzed independently. Results: The mean time that individuals took to complete tasks was significantly different between the groups (p < 0.05), with the residents being faster than the medical students. The residents' group also completed the tasks with fewer errors. The majority of the subjects in both groups exhibited a significant improvement in their task completion time and error rate. Conclusion: The findings in this study demonstrate that laparoscopic skills can be objectively measured in a VRSS based on quantified skill parameters, including the time spent to complete skill tasks and the associated error rate. We conclude that a VRSS is a feasible tool for training and assessing basic laparoscopic skills.

  15. Self-Associations Influence Task-Performance through Bayesian Inference.

    Science.gov (United States)

    Bengtsson, Sara L; Penny, Will D

    2013-01-01

    The way we think about ourselves greatly impacts our behavior. This paper describes a behavioral study and a computational model that shed new light on this important area. Participants were primed "clever" and "stupid" using a scrambled sentence task, and we measured the effect on response time and error rate in a rule-association task. First, we observed a confirmation bias effect in that associations to being "stupid" led to a gradual decrease in performance, whereas associations to being "clever" did not. Second, we observed that the activated self-concepts selectively modified attention toward one's performance. There was an early-to-late double dissociation in RTs, in that priming "clever" resulted in an RT increase following error responses, whereas priming "stupid" resulted in an RT increase following correct responses. We propose a computational model of subjects' behavior based on the logic of the experimental task that involves two processes: memory for rules and the integration of rules with subsequent visual cues. The model incorporates an adaptive decision threshold based on Bayes' rule, whereby decision thresholds are increased if integration was inferred to be faulty. Fitting the computational model to experimental data confirmed our hypothesis that priming affects the memory process. This model explains both the confirmation bias and double dissociation effects and demonstrates that Bayesian inferential principles can be used to study the effect of self-concepts on behavior.

  16. Pixel-Level Deep Segmentation: Artificial Intelligence Quantifies Muscle on Computed Tomography for Body Morphometric Analysis.

    Science.gov (United States)

    Lee, Hyunkwang; Troschel, Fabian M; Tajmir, Shahein; Fuchs, Georg; Mario, Julia; Fintelmann, Florian J; Do, Synho

    2017-08-01

    Pretreatment risk stratification is key for personalized medicine. While many physicians rely on an "eyeball test" to assess whether patients will tolerate major surgery or chemotherapy, "eyeballing" is inherently subjective and difficult to quantify. The concept of morphometric age derived from cross-sectional imaging has been found to correlate well with outcomes such as length of stay, morbidity, and mortality. However, the determination of the morphometric age is time intensive and requires highly trained experts. In this study, we propose a fully automated deep learning system for the segmentation of skeletal muscle cross-sectional area (CSA) on an axial computed tomography image taken at the third lumbar vertebra. We utilized a fully automated deep segmentation model derived from an extended implementation of a fully convolutional network with weight initialization of an ImageNet pre-trained model, followed by post processing to eliminate intramuscular fat for a more accurate analysis. This experiment was conducted by varying window level (WL), window width (WW), and bit resolutions in order to better understand the effects of the parameters on the model performance. Our best model, fine-tuned on 250 training images and ground truth labels, achieves 0.93 ± 0.02 Dice similarity coefficient (DSC) and 3.68 ± 2.29% difference between predicted and ground truth muscle CSA on 150 held-out test cases. Ultimately, the fully automated segmentation system can be embedded into the clinical environment to accelerate the quantification of muscle and expanded to volume analysis of 3D datasets.
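
    The deep segmentation network is not reproduced here; the sketch below only shows the Dice similarity coefficient used to report its performance, evaluated on toy binary masks standing in for predicted and ground-truth muscle segmentations.

```python
import numpy as np

def dice_coefficient(pred, truth, eps=1e-8):
    """Dice similarity coefficient between two binary segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

# Toy 2D masks standing in for muscle segmentations on a CT slice.
truth = np.zeros((8, 8), dtype=int); truth[2:6, 2:6] = 1
pred  = np.zeros((8, 8), dtype=int); pred[3:6, 2:7]  = 1
print(f"DSC = {dice_coefficient(pred, truth):.3f}")
```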

  17. Collaborative drawing with interactive table in physics: Groups’ regulation and task interpretation

    NARCIS (Netherlands)

    Mykkanen, A.; Gijlers, Aaltje H.; Jarvenoja, H.; Jarvela, S.; Bollen, Lars

    2015-01-01

    This study explores the relationship between group members' task interpretation and individual and group level regulation among secondary school students (N=36, nine groups) during a collaborative computer-supported drawing task. Furthermore, it investigates how these factors are related to students

  18. Video and computer-based interactive exercises are safe and improve task-specific balance in geriatric and neurological rehabilitation: a randomised trial

    Directory of Open Access Journals (Sweden)

    Maayken van den Berg

    2016-01-01

    Question: Does adding video/computer-based interactive exercises to inpatient geriatric and neurological rehabilitation improve mobility outcomes? Is it feasible and safe? Design: Randomised trial. Participants: Fifty-eight rehabilitation inpatients. Intervention: Physiotherapist-prescribed, tailored, video/computer-based interactive exercises for 1 hour on weekdays, mainly involving stepping and weight-shifting exercises. Outcome measures: The primary outcome was the Short Physical Performance Battery (0 to 3) at 2 weeks. Secondary outcomes were: Maximal Balance Range (mm); Step Test (step count); Rivermead Mobility Index (0 to 15); activity levels; Activity Measure for Post Acute Care Basic Mobility (18 to 72) and Daily Activity (15 to 60); Falls Efficacy Scale (10 to 40); EQ-5D utility score (0 to 1); Reintegration to Normal Living Index (0 to 100); System Usability Scale (0 to 100); and Physical Activity Enjoyment Scale (0 to 126). Safety was determined from adverse events during the intervention. Results: At 2 weeks the between-group difference in the primary outcome (0.1, 95% CI -0.2 to 0.3) was not statistically significant. The intervention group performed significantly better than usual care for Maximal Balance Range (38 mm difference after baseline adjustment, 95% CI 6 to 69). Other secondary outcomes were not statistically significant. Fifty-eight (55%) of the eligible patients agreed to participate, 25/29 (86%) completed the intervention and 10 (39%) attended > 70% of sessions, with a mean of 5.6 sessions (SD 3.3) attended and an overall average duration of 4.5 hours (SD 3.1). Average scores were 62 (SD 21) for the System Usability Scale and 62 (SD 8) for the Physical Activity Enjoyment Scale. There were no adverse events. Conclusion: The addition of video/computer-based interactive exercises to usual rehabilitation is a safe and feasible way to increase exercise dose, but is not suitable for all. Adding the exercises to usual rehabilitation resulted in task

  19. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    Science.gov (United States)

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that improves programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈ 43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves similar performance to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k size matrices, respectively.

  20. Operational testing of a figure of merit for overall task performance

    Science.gov (United States)

    Lemay, Moira

    1990-01-01

    An overall indicator, or figure of merit (FOM), for the quality of pilot performance is needed to define optimal workload levels, predict system failure, measure the impact of new automation in the cockpit, and define the relative contributions of subtasks to overall task performance. A normative FOM was developed based on the calculation of a standard score for each component of a complex task. It reflected some effects, detailed in an earlier study, of the introduction of new data link technology into the cockpit. Since the technique showed promise, further testing was done. A new set of data was obtained using the recently developed Multi-Attribute Task Battery, a complex battery consisting of four tasks whose demand can be varied and on which performance measures can be obtained. This battery was presented to 12 subjects in a 20-minute trial at each of three levels of workload or task demand, and performance measures were collected on all four tasks. The NASA-TLX workload rating scale was presented at minutes 6, 12, and 18 of each trial. A figure of merit was then obtained for each run of the battery by calculating a mean, SD, and standard score for each task. Each task contributed its own proportion to the overall FOM, and the relative contributions changed with increasing workload. Thus, the FOM shows the effect of task changes, not only on the individual task that is changed, but also on the performance of other tasks and of the whole task. The cost to other tasks of maintaining constant performance on an individual task can be quantified.
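
    The abstract describes the FOM only in outline; one plausible reading, sketched below, is to standardise each subtask's score against group norms and average the resulting z-scores into a single figure of merit. The subtask names, group means/SDs and equal weighting are assumptions, not the study's exact method.

```python
import numpy as np

def figure_of_merit(scores, norms):
    """Combine subtask scores into one normative figure of merit.
    scores: {task: value for one run}; norms: {task: (group_mean, group_sd)}.
    Each score is converted to a standard (z) score and the z-scores are
    averaged -- an illustrative reading of the FOM idea."""
    z = [(scores[t] - m) / sd for t, (m, sd) in norms.items()]
    return float(np.mean(z))

norms = {"tracking": (0.80, 0.10), "monitoring": (0.70, 0.15),
         "comms": (0.90, 0.05), "fuel": (0.60, 0.20)}      # hypothetical group norms
run = {"tracking": 0.75, "monitoring": 0.85, "comms": 0.92, "fuel": 0.55}
print(f"FOM = {figure_of_merit(run, norms):.2f}")
```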

  1. Quantifier spreading: children misled by ostensive cues

    Directory of Open Access Journals (Sweden)

    Katalin É. Kiss

    2017-04-01

    This paper calls attention to a methodological problem of acquisition experiments. It shows that the economy of the stimulus employed in child language experiments may lend an increased ostensive effect to the message communicated to the child. Thus, when the visual stimulus in a sentence-picture matching task is a minimal model abstracting away from the details of the situation, children often regard all the elements of the stimulus as ostensive clues to be represented in the corresponding sentence. The use of such minimal stimuli is mistaken when the experiment aims to test whether or not a certain element of the stimulus is relevant for the linguistic representation or interpretation. The paper illustrates this point by an experiment involving quantifier spreading. It is claimed that children find a universally quantified sentence like 'Every girl is riding a bicycle' to be a false description of a picture showing three girls riding bicycles and a solo bicycle because they are misled to believe that all the elements in the visual stimulus are relevant, hence all of them are to be represented by the corresponding linguistic description. When the iconic drawings were replaced by photos taken in a natural environment rich in accidental details, the occurrence of quantifier spreading was radically reduced. It is shown that an extra object in the visual stimulus can lead to the rejection of the sentence also in the case of sentences involving no quantification, which gives further support to the claim that the source of the problem is not (or not only) the grammatical or cognitive difficulty of quantification but the unintended ostensive effect of the extra object. This article is part of the special collection: Acquisition of Quantification

  2. Parallel processing using an optical delay-based reservoir computer

    Science.gov (United States)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

    Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many different stand-alone components, which leads to bulky, non-monolithic systems lacking long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs). Semiconductor ring lasers are semiconductor lasers where the laser cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions, between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be simultaneously computed using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time series prediction and classification for nonlinear channel equalization. We take advantage of the different directional modes to process the individual tasks: each directional mode processes one task, to mitigate possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained even with noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].

  3. Imaging gait analysis: An fMRI dual task study.

    Science.gov (United States)

    Bürki, Céline N; Bridenbaugh, Stephanie A; Reinhardt, Julia; Stippich, Christoph; Kressig, Reto W; Blatow, Maria

    2017-08-01

    In geriatric clinical diagnostics, gait analysis with cognitive-motor dual tasking is used to predict fall risk and cognitive decline. To date, the neural correlates of cognitive-motor dual tasking processes are not fully understood. To investigate these underlying neural mechanisms, we designed an fMRI paradigm to reproduce the gait analysis. We tested the fMRI paradigm's feasibility in a substudy with fifteen young adults and assessed 31 healthy older adults in the main study. First, gait speed and variability were quantified using the GAITRite© electronic walkway. Then, participants lying in the MRI-scanner were stepping on pedals of an MRI-compatible stepping device used to imitate gait during functional imaging. In each session, participants performed cognitive and motor single tasks as well as cognitive-motor dual tasks. Behavioral results showed that the parameters of both gait analyses, GAITRite© and fMRI, were significantly positively correlated. FMRI results revealed significantly reduced brain activation during dual task compared to single task conditions. Functional ROI analysis showed that activation in the superior parietal lobe (SPL) decreased less from single to dual task condition than activation in primary motor cortex and in supplementary motor areas. Moreover, SPL activation was increased during dual tasks in subjects exhibiting lower stepping speed and lower executive control. We were able to simulate walking during functional imaging with valid results that reproduce those from the GAITRite© gait analysis. On the neural level, SPL seems to play a crucial role in cognitive-motor dual tasking and to be linked to divided attention processes, particularly when motor activity is involved.

  4. Quantum computing with trapped ions

    International Nuclear Information System (INIS)

    Haeffner, H.; Roos, C.F.; Blatt, R.

    2008-01-01

    Quantum computers hold the promise of solving certain computational tasks much more efficiently than classical computers. We review recent experimental advances towards a quantum computer with trapped ions. In particular, various implementations of qubits, quantum gates and some key experiments are discussed. Furthermore, we review some implementations of quantum algorithms such as a deterministic teleportation of quantum information and an error correction scheme

  5. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  6. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the foreign scientific and educational literature over the last decade, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as a tool of such education. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  7. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. ""Quantum computing"" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  8. Quantifying Leg Movement Activity During Sleep.

    Science.gov (United States)

    Ferri, Raffaele; Fulda, Stephany

    2016-12-01

    Currently, 2 sets of similar rules for recording and scoring leg movement (LM) exist, including periodic LM during sleep (PLMS) and periodic LM during wakefulness. The former were published in 2006 by a task force of the International Restless Legs Syndrome Study Group, and the second in 2007 by the American Academy of Sleep Medicine. This article reviews the basic recording methods, scoring rules, and computer-based programs for PLMS. Less frequent LM activities, such as alternating leg muscle activation, hypnagogic foot tremor, high-frequency LMs, and excessive fragmentary myoclonus are briefly described. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. ExM:System Support for Extreme-Scale, Many-Task Applications

    Energy Technology Data Exchange (ETDEWEB)

    Katz, Daniel S

    2011-05-31

    The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem solving methods and application classes. In this document, we report on combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at University of Chicago, over the first 8 months (through April 30, 2011)

  10. The Dark Side of Micro-Task Marketplaces: Characterizing Fiverr and Automatically Detecting Crowdturfing

    OpenAIRE

    Lee, Kyumin; Webb, Steve; Ge, Hancheng

    2014-01-01

    As human computation on crowdsourcing systems has become popular and powerful for performing tasks, malicious users have started misusing these systems by posting malicious tasks, propagating manipulated contents, and targeting popular web services such as online social networks and search engines. Recently, these malicious users moved to Fiverr, a fast-growing micro-task marketplace, where workers can post crowdturfing tasks (i.e., astroturfing campaigns run by crowd workers) and malicious c...

  11. Safety analysis of patient transfers and handling tasks.

    Science.gov (United States)

    Vieira, Er; Kumar, S

    2009-10-01

    Low-back disorders are related to biomechanical demands, and nurses are among the professionals with the highest rates. Quantification of risk factors is important for safety assessment and reduction of low-back disorders. This study aimed to quantify physical demands of frequent nursing tasks and provide evidence-based recommendations to increase low-back safety. Thirty-six volunteer female nurses participated in a cross-sectional study of nine nursing tasks. Lumbar range of motion (ROM) and motion during nursing tasks were measured. Compression and shear forces at L5/S1, ligament strain and percentage of population without sufficient torso strength to perform 14 phases of nine nursing tasks were estimated. Peak flexions during trolley-to-bed, bed-to-chair and chair-to-bed transfers reached the maximum flexion ROM of the nurses. Average lumbar flexion during trolley-to-bed transfers was >50% of flexion ROM, being higher than during all other tasks. Mean (SD) compression at L5/S1 (4754 N (437 N)) and population without sufficient torso strength (37% (9%)) were highest during the pushing phase of bed-to-trolley transfers. Shear force (487 N (40 N)) and ligament strain (14% (5%)) were highest during the pulling phase of trolley-to-bed transfers. Nursing tasks impose high biomechanical demands on the lumbar spine. Excessive lumbar flexion and forces are critical aspects of manual transfers requiring most of the nurses' capabilities. Evidence-based recommendations to improve low-back safety in common nursing tasks were provided. Fitness to work, job modifications and training programs can now be designed and assessed based on the results.

  12. Computer research in teaching geometry future bachelors

    Directory of Open Access Journals (Sweden)

    Aliya V. Bukusheva

    2017-12-01

    Full Text Available The article is devoted to the problem of using educational studies and experiments in the geometric education of IT specialists. We consider a research method applied in teaching Computer Geometry to Bachelor's students in `Mathematics and Computer Science` (programme 02.03.01). Examples of educational and research geometric problems that require computer tools in order to be solved are given. These tasks are treated as variants of educational research tasks: they pose problems that demand experiments with dynamic models of mathematical objects in order to be solved.

  13. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    Science.gov (United States)

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  14. Women and Computers: Effects of Stereotype Threat on Attribution of Failure

    Science.gov (United States)

    Koch, Sabine C.; Muller, Stephanie M.; Sieverding, Monika

    2008-01-01

    This study investigated whether stereotype threat can influence women's attributions of failure in a computer task. Male and female college-age students (n = 86, 16-21 years old) from Germany were asked to work on a computer task and were given a hint beforehand that in this task, either (a) men usually perform better than women do (negative threat…

  15. Why Don't All Professors Use Computers?

    Science.gov (United States)

    Drew, David Eli

    1989-01-01

    Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…

  16. Quantifying Postural Control during Exergaming Using Multivariate Whole-Body Movement Data: A Self-Organizing Maps Approach.

    Directory of Open Access Journals (Sweden)

    Mike van Diest

    Full Text Available Exergames are becoming an increasingly popular tool for training balance ability, thereby preventing falls in older adults. Automatic, real time, assessment of the user's balance control offers opportunities in terms of providing targeted feedback and dynamically adjusting the gameplay to the individual user, yet algorithms for quantification of balance control remain to be developed. The aim of the present study was to identify movement patterns, and variability therein, of young and older adults playing a custom-made weight-shifting (ice-skating) exergame. Twenty older adults and twenty young adults played a weight-shifting exergame under five conditions of varying complexity, while multi-segmental whole-body movement data were captured using Kinect. Movement coordination patterns expressed during gameplay were identified using Self Organizing Maps (SOM), an artificial neural network, and variability in these patterns was quantified by computing Total Trajectory Variability (TTvar). Additionally a k Nearest Neighbor (kNN) classifier was trained to discriminate between young and older adults based on the SOM features. Results showed that TTvar was significantly higher in older adults than in young adults, when playing the exergame under complex task conditions. The kNN classifier showed a classification accuracy of 65.8%. Older adults display more variable sway behavior than young adults, when playing the exergame under complex task conditions. The SOM features characterizing movement patterns expressed during exergaming allow for discriminating between young and older adults with limited accuracy. Our findings contribute to the development of algorithms for quantification of balance ability during home-based exergaming for balance training.
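
    The pipeline described above (SOM feature extraction followed by a kNN classifier) can be prototyped in a few lines of Python. The sketch below is only an illustration under assumed data shapes: it uses the third-party minisom and scikit-learn packages, random placeholder arrays in place of Kinect recordings, and a crude dispersion measure as a stand-in for the paper's TTvar definition.

    ```python
    # Hedged sketch (not the authors' code): map whole-body movement samples onto a
    # self-organizing map and classify young vs. older players with kNN.
    import numpy as np
    from minisom import MiniSom
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Hypothetical Kinect data: 40 subjects x 1000 frames x 20 joint coordinates.
    X = rng.normal(size=(40, 1000, 20))
    y = np.array([0] * 20 + [1] * 20)      # 0 = young, 1 = older (assumed labels)

    som = MiniSom(10, 10, 20, sigma=1.5, learning_rate=0.5, random_seed=0)
    som.train_random(X.reshape(-1, 20), num_iteration=5000)

    def trajectory_features(frames):
        """Map each frame to its best-matching SOM unit and summarize the trajectory."""
        winners = np.array([som.winner(f) for f in frames])      # (n_frames, 2)
        # Crude trajectory-variability proxy: dispersion of the winning units.
        ttvar = winners.std(axis=0).sum()
        occupancy = np.bincount(winners[:, 0] * 10 + winners[:, 1], minlength=100)
        return np.concatenate(([ttvar], occupancy / len(frames)))

    features = np.array([trajectory_features(subject) for subject in X])
    knn = KNeighborsClassifier(n_neighbors=3)
    print("CV accuracy:", cross_val_score(knn, features, y, cv=5).mean())
    ```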

  17. Aplikasi Penjadwalan Tugas Berbasis Mobile Device Didukung Google Task Dan Google Drive

    OpenAIRE

    Anggraini, Elisa Yuni; Wibowo, Adi; Dewi, Lily Puspa

    2017-01-01

    As work productivity increases, many task scheduling applications are emerging. Each task scheduling application has its own advantages over similar competitors. These applications help the user remember when a task is approaching its deadline, and store data on activities. However, to complete tasks, we also need a container that stores important files in a safe place. In recent years, the use of Cloud Computing has been growing because the data is safely stored. In the applications mention...

  18. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  19. EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks.

    Science.gov (United States)

    Berka, Chris; Levendowski, Daniel J; Lumicao, Michelle N; Yau, Alan; Davis, Gene; Zivkovic, Vladimir T; Olmstead, Richard E; Tremoulet, Patrice D; Craven, Patrick L

    2007-05-01

    The ability to continuously and unobtrusively monitor levels of task engagement and mental workload in an operational environment could be useful in identifying more accurate and efficient methods for humans to interact with technology. This information could also be used to optimize the design of safer, more efficient work environments that increase motivation and productivity. The present study explored the feasibility of monitoring electroencephalographic (EEG) indices of engagement and workload acquired unobtrusively and quantified during performance of cognitive tests. EEG was acquired from 80 healthy participants with a wireless sensor headset (F3-F4,C3-C4,Cz-POz,F3-Cz,Fz-C3,Fz-POz) during tasks including: multi-level forward/backward-digit-span, grid-recall, trails, mental-addition, 20-min 3-Choice Vigilance, and image-learning and memory tests. EEG metrics for engagement and workload were calculated for each 1-s of EEG. Across participants, engagement but not workload decreased over the 20-min vigilance test. Engagement and workload were significantly increased during the encoding period of verbal and image-learning and memory tests when compared with the recognition/recall period. Workload but not engagement increased linearly as level of difficulty increased in forward and backward-digit-span, grid-recall, and mental-addition tests. EEG measures correlated with both subjective and objective performance metrics. These data in combination with previous studies suggest that EEG engagement reflects information-gathering, visual processing, and allocation of attention. EEG workload increases with increasing working memory load and during problem solving, integration of information, analytical reasoning, and may be more reflective of executive functions. Inspection of EEG on a second-by-second timescale revealed associations between workload and engagement levels when aligned with specific task events providing preliminary evidence that second
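
    The exact engagement and workload metrics used with the wireless headset are proprietary, so the following sketch only illustrates the general idea of per-second EEG indices using conventional band-power ratios (beta/(alpha+theta) for engagement, theta/alpha as a workload proxy). The sampling rate and the synthetic signal are assumptions, and SciPy's Welch estimator stands in for the vendor's processing.

    ```python
    # Hedged sketch: per-1-second band-power indices as stand-ins for the
    # proprietary engagement/workload metrics described in the abstract.
    import numpy as np
    from scipy.signal import welch

    FS = 256                                    # assumed sampling rate (Hz)
    rng = np.random.default_rng(1)
    eeg = rng.normal(size=FS * 60)              # hypothetical 60 s single-channel EEG

    def band_power(x, fs, lo, hi):
        f, pxx = welch(x, fs=fs, nperseg=fs)
        return pxx[(f >= lo) & (f < hi)].sum()

    def per_second_indices(x, fs):
        out = []
        for start in range(0, len(x) - fs + 1, fs):     # one value per 1-s epoch
            seg = x[start:start + fs]
            theta = band_power(seg, fs, 4, 8)
            alpha = band_power(seg, fs, 8, 13)
            beta = band_power(seg, fs, 13, 30)
            engagement = beta / (alpha + theta)          # classic engagement index
            workload = theta / alpha                     # simple workload proxy
            out.append((engagement, workload))
        return np.array(out)

    print(per_second_indices(eeg, FS)[:5])
    ```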

  20. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    Science.gov (United States)

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that this process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, the habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experiences. However, goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects some kinds of conflict including a hypothetical cost-conflict one. The simulation results address existing theories about two event-related potentials, namely error related negativity (ERN) and feedback related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
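
    As a rough illustration of the dual-controller idea (not the paper's actual model), the sketch below mixes a model-free Q-learner with a simple model-based estimate on a two-choice probabilistic task; the reward probabilities, learning rate, and fixed mixing weight are all assumed values, and the conflict-monitoring component is omitted.

    ```python
    # Hedged sketch: a minimal dual-controller agent on a two-choice probabilistic task.
    import numpy as np

    rng = np.random.default_rng(2)
    p_reward = np.array([0.8, 0.2])   # hypothetical reward probabilities per action
    q_mf = np.zeros(2)                # model-free values
    counts = np.ones((2, 2))          # model-based outcome counts (action x outcome)
    alpha, beta, w = 0.1, 5.0, 0.5    # learning rate, inverse temperature, MB weight

    for trial in range(500):
        q_mb = counts[:, 1] / counts.sum(axis=1)        # estimated P(reward | action)
        q = w * q_mb + (1 - w) * q_mf                   # blended controller
        p_choice = np.exp(beta * q) / np.exp(beta * q).sum()
        a = rng.choice(2, p=p_choice)
        r = float(rng.random() < p_reward[a])
        q_mf[a] += alpha * (r - q_mf[a])                # model-free update
        counts[a, int(r)] += 1                          # model-based update

    q_mb = counts[:, 1] / counts.sum(axis=1)
    print("final blended values:", w * q_mb + (1 - w) * q_mf)
    ```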

  1. Shakespeare and other English Renaissance authors as characterized by Information Theory complexity quantifiers

    Science.gov (United States)

    Rosso, Osvaldo A.; Craig, Hugh; Moscato, Pablo

    2009-03-01

    We introduce novel Information Theory quantifiers in a computational linguistic study that involves a large corpus of English Renaissance literature. The 185 texts studied (136 plays and 49 poems in total), with first editions that range from 1580 to 1640, form a representative set of its period. Our data set includes 30 texts unquestionably attributed to Shakespeare; in addition we also included A Lover’s Complaint, a poem which generally appears in Shakespeare collected editions but whose authorship is currently in dispute. Our statistical complexity quantifiers combine the power of Jensen-Shannon’s divergence with the entropy variations as computed from a probability distribution function of the observed word use frequencies. Our results show, among other things, that for a given entropy poems display higher complexity than plays, that Shakespeare’s work falls into two distinct clusters in entropy, and that his work is remarkable for its homogeneity and for its closeness to overall means.
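
    The two ingredients named in the abstract, word-frequency entropy and the Jensen-Shannon divergence, are straightforward to compute; the sketch below shows them on two toy text fragments. How the authors combine them into their statistical complexity quantifier is not reproduced here.

    ```python
    # Hedged sketch: word-frequency entropy and Jensen-Shannon divergence between texts.
    import numpy as np
    from collections import Counter

    def word_distribution(text, vocab):
        counts = Counter(text.lower().split())
        p = np.array([counts[w] for w in vocab], dtype=float)
        return p / p.sum() if p.sum() else p

    def entropy(p):
        nz = p[p > 0]
        return -(nz * np.log2(nz)).sum()

    def jensen_shannon(p, q):
        m = 0.5 * (p + q)
        return entropy(m) - 0.5 * (entropy(p) + entropy(q))

    text_a = "to be or not to be that is the question"
    text_b = "shall i compare thee to a summers day thou art more lovely"
    vocab = sorted(set(text_a.split()) | set(text_b.split()))
    p, q = word_distribution(text_a, vocab), word_distribution(text_b, vocab)
    print("H(a) =", entropy(p), "JSD(a,b) =", jensen_shannon(p, q))
    ```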

  2. A Rich Assessment Task as a Window into Students' Multiplicative Reasoning

    Science.gov (United States)

    Downton, Ann; Wright, Vince

    2016-01-01

    This study explored the potential of a rich assessment task to reveal students' multiplicative thinking with respect to a hypothetical learning trajectory. Thirty pairs of students in grades 5 and 6 attempted the task. Twenty-two pairs applied multiplicative structure to find the number of items in arrays. However, counting and computational errors…

  3. Quantifying performance on an outdoor agility drill using foot-mounted inertial measurement units.

    Directory of Open Access Journals (Sweden)

    Antonia M Zaferiou

    Full Text Available Running agility is required for many sports and other physical tasks that demand rapid changes in body direction. Quantifying agility skill remains a challenge because measuring rapid changes of direction and quantifying agility skill from those measurements are difficult to do in ways that replicate real task/game play situations. The objectives of this study were to define and to measure agility performance for a (five-cone) agility drill used within a military obstacle course using data harvested from two foot-mounted inertial measurement units (IMUs). Thirty-two recreational athletes ran an agility drill while wearing two IMUs secured to the tops of their athletic shoes. The recorded acceleration and angular rates yield estimates of the trajectories, velocities and accelerations of both feet as well as an estimate of the horizontal velocity of the body mass center. Four agility performance metrics were proposed and studied including: (1) agility drill time, (2) horizontal body speed, (3) foot trajectory turning radius, and (4) tangential body acceleration. Additionally, the average horizontal ground reaction during each footfall was estimated. We hypothesized that shorter agility drill performance time would be observed with small turning radii and large tangential acceleration ranges and body speeds. Kruskal-Wallis and mean rank post-hoc statistical analyses revealed that shorter agility drill performance times were observed with smaller turning radii and larger tangential acceleration ranges and body speeds, as hypothesized. Moreover, measurements revealed the strategies that distinguish high versus low performers. Relative to low performers, high performers used sharper turns, larger changes in body speed (larger tangential acceleration ranges), and shorter duration footfalls that generated larger horizontal ground reactions during the turn phases. Overall, this study advances the use of foot-mounted IMUs to quantify agility performance in
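
    The four metrics listed above can be computed from any horizontal trajectory estimate. The sketch below does so for a hypothetical planar body-center path sampled at 100 Hz, using the standard planar curvature and tangential-acceleration formulas; in the study itself these quantities come from foot-mounted IMU trajectory estimates rather than a synthetic path.

    ```python
    # Hedged sketch: drill time, body speed, turning radius, tangential acceleration
    # from a hypothetical planar body-center trajectory.
    import numpy as np

    FS = 100.0
    t = np.arange(0, 8, 1 / FS)
    xy = np.column_stack([t, np.sin(1.5 * t)])     # hypothetical serpentine path

    v = np.gradient(xy, 1 / FS, axis=0)            # horizontal velocity
    a = np.gradient(v, 1 / FS, axis=0)             # horizontal acceleration
    speed = np.linalg.norm(v, axis=1)
    tangential_acc = np.einsum('ij,ij->i', a, v) / np.clip(speed, 1e-6, None)
    cross = v[:, 0] * a[:, 1] - v[:, 1] * a[:, 0]  # planar cross product v x a
    turning_radius = speed ** 3 / np.clip(np.abs(cross), 1e-6, None)

    print("drill time (s):", t[-1])
    print("mean body speed (m/s):", speed.mean())
    print("median turning radius (m):", np.median(turning_radius))
    print("tangential acceleration range (m/s^2):",
          tangential_acc.max() - tangential_acc.min())
    ```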

  4. An Interaction of Screen Colour and Lesson Task in CAL

    Science.gov (United States)

    Clariana, Roy B.

    2004-01-01

    Colour is a common feature in computer-aided learning (CAL), though the instructional effects of screen colour are not well understood. This investigation considers the effects of different CAL study tasks with feedback on posttest performance and on posttest memory of the lesson colour scheme. Graduate students (n=68) completed a computer-based…

  5. Self-associations influence task-performance through Bayesian inference

    Directory of Open Access Journals (Sweden)

    Sara L Bengtsson

    2013-08-01

    Full Text Available The way we think about ourselves impacts greatly on our behaviour. This paper describes a behavioural study and a computational model that shed new light on this important area. Participants were primed 'clever' and 'stupid' using a scrambled sentence task, and we measured the effect on response time and error-rate on a rule-association task. First, we observed a confirmation bias effect in that associations to being 'stupid' led to a gradual decrease in performance, whereas associations to being 'clever' did not. Second, we observed that the activated self-concepts selectively modified attention towards one's performance. There was an early to late double dissociation in RTs in that primed 'clever' resulted in RT increase following error responses, whereas primed 'stupid' resulted in RT increase following correct responses. We propose a computational model of subjects' behaviour based on the logic of the experimental task that involves two processes: memory for rules and the integration of rules with subsequent visual cues. The model also incorporates an adaptive decision threshold based on Bayes rule, whereby decision thresholds are increased if integration was inferred to be faulty. Fitting the computational model to experimental data confirmed our hypothesis that priming affects the memory process. This model explains both the confirmation bias and double dissociation effects and demonstrates that Bayesian inferential principles can be used to study the effect of self-concepts on behaviour.

  6. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are "universal," in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics.

  7. Robust visual tracking via multi-task sparse learning

    KAUST Repository

    Zhang, Tianzhu

    2012-06-01

    In this paper, we formulate object tracking in a particle filter framework as a multi-task sparse learning problem, which we denote as Multi-Task Tracking (MTT). Since we model particles as linear combinations of dictionary templates that are updated dynamically, learning the representation of each particle is considered a single task in MTT. By employing popular sparsity-inducing ℓ_{p,q} mixed norms (p ∈ {2, ∞} and q = 1), we regularize the representation problem to enforce joint sparsity and learn the particle representations together. As compared to previous methods that handle particles independently, our results demonstrate that mining the interdependencies between particles improves tracking performance and overall computational complexity. Interestingly, we show that the popular L1 tracker [15] is a special case of our MTT formulation (denoted as the L11 tracker) when p = q = 1. The learning problem can be efficiently solved using an Accelerated Proximal Gradient (APG) method that yields a sequence of closed form updates. As such, MTT is computationally attractive. We test our proposed approach on challenging sequences involving heavy occlusion, drastic illumination changes, and large pose variations. Experimental results show that MTT methods consistently outperform state-of-the-art trackers. © 2012 IEEE.
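
    A toy analogue of the joint-sparsity idea is an ℓ2,1-regularized least-squares problem solved by proximal gradient descent, as sketched below. This is not the paper's accelerated (APG) solver or its tracking pipeline; the dictionary, particle observations, and regularization weight are random placeholders.

    ```python
    # Hedged sketch: joint sparsity across "particles" via an l2,1 mixed-norm penalty,
    # minimized with a plain proximal-gradient (ISTA) loop.
    import numpy as np

    rng = np.random.default_rng(7)
    D = rng.normal(size=(50, 30))        # dictionary of templates (features x atoms)
    Y = rng.normal(size=(50, 10))        # 10 particle observations to represent jointly
    lam = 0.5
    step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1 / Lipschitz constant of the gradient

    def prox_l21(Z, t):
        """Row-wise group soft-thresholding: proximal operator of t*||Z||_{2,1}."""
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
        return Z * scale

    X = np.zeros((30, 10))
    for _ in range(300):                 # proximal gradient descent
        grad = D.T @ (D @ X - Y)
        X = prox_l21(X - step * grad, step * lam)

    print("atoms used jointly by all particles:",
          int((np.linalg.norm(X, axis=1) > 1e-8).sum()))
    ```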

  8. The appropriateness of TACOM for a task complexity measure for emergency operating procedures of nuclear power plants - A comparison with OPAS scores

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea

    2007-01-01

    It is well known that complicated procedures frequently cause human performance related problems that can result in a serious consequence. Unfortunately a systematic framework to evaluate the complexity of procedures is very rare. For this reason Park et al. suggested a measure called TACOM (Task Complexity) which is able to quantify the complexity of tasks stipulated in procedures. In addition, it was observed that there is a significant correlation between averaged task performance time data and estimated TACOM scores. In this study, for an additional verification activity, TACOM scores are compared with operators' performance data that are measured by Operator Performance Assessment System (OPAS). As a result, it is believed that TACOM scores seem to be meaningfully correlated with OPAS scores. Thus, it is reasonable to expect that the result of this study can be regarded as a supplementary evidence for supporting the fact that TACOM measure is applicable for quantifying the complexity of tasks to be done by operators

  9. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    Science.gov (United States)

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of task node as affected by the constraint relations among task nodes, and the task node list is generated by the different priority value. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first as the completion time of task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective.
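
    A simplified reading of the two strategies (predecessor-layer priority plus a longest-path tie-break) can be sketched on a small task DAG as below. The graph, costs, and the reduction to a static list schedule are illustrative assumptions, not the authors' implementation.

    ```python
    # Hedged sketch: layer-based priority with an "essential path" tie-break on a toy DAG.
    from functools import lru_cache

    # Hypothetical task graph: node -> (computation cost, {successor: communication cost})
    graph = {
        'A': (2, {'B': 1, 'C': 2}),
        'B': (3, {'D': 1}),
        'C': (1, {'D': 2}),
        'D': (2, {}),
    }

    @lru_cache(maxsize=None)
    def layer(node):
        """Predecessor-layer priority: 0 for entry nodes, else 1 + max over parents."""
        parents = [p for p, (_, succ) in graph.items() if node in succ]
        return 0 if not parents else 1 + max(layer(p) for p in parents)

    @lru_cache(maxsize=None)
    def essential_path(node):
        """Longest downstream computation+communication path from this node."""
        cost, succ = graph[node]
        return cost + max((c + essential_path(s) for s, c in succ.items()), default=0)

    # Schedule earlier layers first; within a layer, longer essential paths go first.
    order = sorted(graph, key=lambda n: (layer(n), -essential_path(n)))
    print("schedule order:", order)   # ['A', 'B', 'C', 'D'] for this graph
    ```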

  10. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  11. Quantifying multiple trace elements in uranium ore concentrates. An interlaboratory comparison

    International Nuclear Information System (INIS)

    Buerger, S.; Boulyga, S.F.; Penkin, M.V.; Jovanovic, S.; Lindvall, R.; Rasmussen, G.; Riciputi, L.

    2014-01-01

    An intercomparison was organized, with six laboratories tasked to quantify sixty-nine impurities in two uranium materials. The main technique employed for analysis was inductively coupled plasma mass spectrometry in combination with matrix-matched external calibration. The results presented highlight the current state-of-the-practice; lessons learned include previously unaccounted polyatomic interferences, issues related to sample dissolution, blank correction and calibration, and the challenge of estimating measurement uncertainties. The exercise yielded consensus values for the two analysed materials, suitable for use as laboratory standards to partially fill a gap in the availability of uranium reference materials characterized for impurities. (author)

  12. Overcoming the Obstacle of Poor Knowledge in Proving Geometry Tasks

    Directory of Open Access Journals (Sweden)

    Zlatan Magajna

    2013-12-01

    Full Text Available Proving in school geometry is not just about validating the truth of a claim. In the school setting, the main function of the proof is to convince someone that a claim is true by providing an explanation. Students consider proving to be difficult; in fact, they find the very concept of proof demanding. Proving a claim in planar geometry involves several processes, the most salient being visual observation and deductive argumentation. These two processes are interwoven, but often poor observation hinders deductive argumentation. In the present article, we consider the possibility of overcoming the obstacle of a student’s poor observation by making use of computer-aided observation with appropriate software. We present the results of two small-scale research projects, both of which indicate that students are able to work out considerably more deductions if computer-aided observation is used. Not all students use computer-aided observation effectively in proving tasks: some find an exhaustive computer-provided list of properties confusing and are not able to choose the properties that are relevant to the task.

  13. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  14. Computer vision camera with embedded FPGA processing

    Science.gov (United States)

    Lecerf, Antoine; Ouellet, Denis; Arias-Estrada, Miguel

    2000-03-01

    Traditional computer vision is based on a camera-computer system in which the image understanding algorithms are embedded in the computer. To circumvent the computational load of vision algorithms, low-level processing and imaging hardware can be integrated in a single compact module where a dedicated architecture is implemented. This paper presents a Computer Vision Camera based on an open architecture implemented in an FPGA. The system is targeted to real-time computer vision tasks where low level processing and feature extraction tasks can be implemented in the FPGA device. The camera integrates a CMOS image sensor, an FPGA device, two memory banks, and an embedded PC for communication and control tasks. The FPGA device is a medium size one equivalent to 25,000 logic gates. The device is connected to two high speed memory banks, an IS interface, and an imager interface. The camera can be accessed for architecture programming, data transfer, and control through an Ethernet link from a remote computer. A hardware architecture can be defined in a Hardware Description Language (like VHDL), simulated and synthesized into digital structures that can be programmed into the FPGA and tested on the camera. The architecture of a classical multi-scale edge detection algorithm based on a Laplacian of Gaussian convolution has been developed to show the capabilities of the system.
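
    A software reference for the multi-scale Laplacian-of-Gaussian edge detection mentioned above might look like the sketch below, using SciPy's gaussian_laplace and zero-crossing detection on a synthetic frame; on the camera itself this stage would be a fixed-point convolution pipeline synthesized into the FPGA rather than Python.

    ```python
    # Hedged sketch: multi-scale Laplacian-of-Gaussian zero-crossing edge detection.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(3)
    image = rng.random((128, 128))                    # hypothetical sensor frame

    def log_edges(img, sigma):
        """Zero-crossings of the Laplacian of Gaussian at one scale."""
        log = ndimage.gaussian_laplace(img, sigma=sigma)
        signs = np.sign(log)
        zc = np.zeros_like(img, dtype=bool)
        # Mark a pixel if the LoG changes sign with its right or lower neighbour.
        zc[:, :-1] |= signs[:, :-1] != signs[:, 1:]
        zc[:-1, :] |= signs[:-1, :] != signs[1:, :]
        return zc

    edges = [log_edges(image, s) for s in (1.0, 2.0, 4.0)]   # three scales
    print([int(e.sum()) for e in edges])
    ```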

  15. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  16. Computing shifts to monitor ATLAS distributed computing infrastructure and operations

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00068610; The ATLAS collaboration; Barberis, Dario; Crepe-Renaudin, Sabine Chrystel; De, Kaushik; Fassi, Farida; Stradling, Alden; Svatos, Michal; Vartapetian, Armen; Wolters, Helmut

    2017-01-01

    The ATLAS Distributed Computing (ADC) group established a new Computing Run Coordinator (CRC) shift at the start of LHC Run 2 in 2015. The main goal was to rely on a person with a good overview of the ADC activities to ease the ADC experts’ workload. The CRC shifter keeps track of ADC tasks related to their fields of expertise and responsibility. At the same time, the shifter maintains a global view of the day-to-day operations of the ADC system. During Run 1, this task was accomplished by a person of the expert team called the ADC Manager on Duty (AMOD), a position that was removed during the shutdown period due to the reduced number and availability of ADC experts foreseen for Run 2. The CRC position was proposed to cover some of the AMODs former functions, while allowing more people involved in computing to participate. In this way, CRC shifters help with the training of future ADC experts. The CRC shifters coordinate daily ADC shift operations, including tracking open issues, reporting, and representing...

  17. Computing shifts to monitor ATLAS distributed computing infrastructure and operations

    CERN Document Server

    Adam Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS Distributed Computing (ADC) group established a new Computing Run Coordinator (CRC) shift at the start of LHC Run2 in 2015. The main goal was to rely on a person with a good overview of the ADC activities to ease the ADC experts' workload. The CRC shifter keeps track of ADC tasks related to their fields of expertise and responsibility. At the same time, the shifter maintains a global view of the day-to-day operations of the ADC system. During Run1, this task was accomplished by the ADC Manager on Duty (AMOD), a position that was removed during the shutdown period due to the reduced number and availability of ADC experts foreseen for Run2. The CRC position was proposed to cover some of the AMOD’s former functions, while allowing more people involved in computing to participate. In this way, CRC shifters help train future ADC experts. The CRC shifters coordinate daily ADC shift operations, including tracking open issues, reporting, and representing ADC in relevant meetings. The CRC also facilitates ...

  18. Supporting collaborative computing and interaction

    International Nuclear Information System (INIS)

    Agarwal, Deborah; McParland, Charles; Perry, Marcia

    2002-01-01

    To enable collaboration on the daily tasks involved in scientific research, collaborative frameworks should provide lightweight and ubiquitous components that support a wide variety of interaction modes. We envision a collaborative environment as one that provides a persistent space within which participants can locate each other, exchange synchronous and asynchronous messages, share documents and applications, share workflow, and hold videoconferences. We are developing the Pervasive Collaborative Computing Environment (PCCE) as such an environment. The PCCE will provide integrated tools to support shared computing and task control and monitoring. This paper describes the PCCE and the rationale for its design

  19. Enhanced computational methods for quantifying the effect of geographic and environmental isolation on genetic differentiation

    NARCIS (Netherlands)

    Botta, Filippo; Eriksen, Casper; Fontaine, Michael Christophe; Guillot, Gilles

    2015-01-01

    In a recent paper, Bradburd et al. (2013) proposed a model to quantify the relative effect of geographic and environmental distance on genetic differentiation. Here, we enhance this method in several ways. 1. We modify the covariance model so as to fit better with mainstream geostatistical models and

  20. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    Science.gov (United States)

    Howard, Mark; Campbell, Earl

    2017-03-03

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas (the most general synthesis scenario), then the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.

  1. Private quantum computation: an introduction to blind quantum computing and related protocols

    Science.gov (United States)

    Fitzsimons, Joseph F.

    2017-06-01

    Quantum technologies hold the promise of not only faster algorithmic processing of data, via quantum computation, but also of more secure communications, in the form of quantum cryptography. In recent years, a number of protocols have emerged which seek to marry these concepts for the purpose of securing computation rather than communication. These protocols address the task of securely delegating quantum computation to an untrusted device while maintaining the privacy, and in some instances the integrity, of the computation. We present a review of the progress to date in this emerging area.

  2. Effects of an attention demanding task on dynamic stability during treadmill walking

    Directory of Open Access Journals (Sweden)

    Troy Karen L

    2008-04-01

    Full Text Available Abstract Background People exhibit increased difficulty balancing when they perform secondary attention-distracting tasks while walking. However, a previous study by Grabiner and Troy (J. Neuroengineering Rehabil., 2005) found that young healthy subjects performing a concurrent Stroop task while walking on a motorized treadmill exhibited decreased step width variability. However, measures of variability do not directly quantify how a system responds to perturbations. This study re-analyzed data from Grabiner and Troy (2005) to determine if performing the concurrent Stroop task directly affected the dynamic stability of walking in these same subjects. Methods Thirteen healthy volunteers walked on a motorized treadmill at their self-selected constant speed for 10 minutes both while performing the Stroop test and during undisturbed walking. This Stroop test consisted of projecting images of the name of one color, printed in text of a different color, onto a wall and asking subjects to verbally identify the color of the text. Three-dimensional motions of a marker attached to the base of the neck (C5/T1) were recorded. Marker velocities were calculated over 3 equal intervals of 200 sec each in each direction. Mean variability was calculated for each time series as the average standard deviation across all strides. Both "local" and "orbital" dynamic stability were quantified for each time series using previously established methods. These measures directly quantify how quickly small perturbations grow or decay, either continuously in real time (local) or discretely from one cycle to the next (orbital). Differences between Stroop and Control trials were evaluated using a 2-factor repeated measures ANOVA. Results Mean variability of trunk movements was significantly reduced during the Stroop tests compared to normal walking. Conversely, local and orbital stability results were mixed: some measures showed slight increases, while others showed slight decreases
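
    "Local" dynamic stability is commonly operationalized as a short-term local divergence exponent estimated from a delay-embedded trunk signal (Rosenstein-style); the sketch below shows one such estimate on a synthetic series. The embedding dimension, delay, and fitting horizon are assumed values, and the orbital (Floquet-multiplier) analysis is not shown.

    ```python
    # Hedged sketch: Rosenstein-style short-term local divergence exponent.
    import numpy as np

    def local_divergence_exponent(x, dim=5, delay=10, horizon=50):
        n = len(x) - (dim - 1) * delay
        # Delay-embed the series into a dim-dimensional state space.
        state = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
        # Pairwise distances; exclude temporally adjacent points as neighbour candidates.
        d = np.linalg.norm(state[:, None, :] - state[None, :, :], axis=2)
        for i in range(n):
            d[i, max(0, i - delay):i + delay + 1] = np.inf
        nn = d.argmin(axis=1)
        # Mean log distance between each trajectory and its neighbour, k steps ahead.
        div = []
        for k in range(horizon):
            idx = np.arange(n)
            valid = (idx + k < n) & (nn + k < n)
            dist = np.linalg.norm(state[idx[valid] + k] - state[nn[valid] + k], axis=1)
            div.append(np.log(dist[dist > 0]).mean())
        # The slope of the divergence curve is the divergence exponent (per sample).
        return np.polyfit(np.arange(horizon), div, 1)[0]

    rng = np.random.default_rng(4)
    trunk = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.05 * rng.normal(size=500)
    print("short-term divergence exponent:", local_divergence_exponent(trunk))
    ```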

  3. Masticatory muscle activity during deliberately performed oral tasks

    International Nuclear Information System (INIS)

    Farella, M; Palla, S; Erni, S; Gallo, L M; Michelotti, A

    2008-01-01

    The aim of this study was to investigate masticatory muscle activity during deliberately performed functional and non-functional oral tasks. Electromyographic (EMG) surface activity was recorded unilaterally from the masseter, anterior temporalis and suprahyoid muscles in 11 subjects (5 men, 6 women; age = 34.6 ± 10.8 years), who were accurately instructed to perform 30 different oral tasks under computer guidance using task markers. Data were analyzed by descriptive statistics, repeated measurements analysis of variance (ANOVA) and hierarchical cluster analysis. The maximum EMG amplitude of the masseter and anterior temporalis muscles was more often found during hard chewing tasks than during maximum clenching tasks. The relative contribution of masseter and anterior temporalis changed across the tasks examined (F ≥ 5.2; p ≤ 0.001). The masseter muscle was significantly (p ≤ 0.05) more active than the anterior temporalis muscle during tasks involving incisal biting, jaw protrusion, laterotrusion and jaw cupping, the difference being statistically significant (p ≤ 0.05). The anterior temporalis muscle was significantly (p ≤ 0.01) more active than the masseter muscle during tasks performed in intercuspal position, during tooth grinding, and during hard chewing on the working side. Based upon the relative contribution of the masseter, anterior temporalis, and suprahyoid muscles, the investigated oral tasks could be grouped into six separate clusters. The findings provided further insight into muscle- and task-specific EMG patterns during functional and non-functional oral behaviors

  4. Showing a model's eye movements in examples does not improve learning of problem-solving tasks

    NARCIS (Netherlands)

    van Marlen, Tim; van Wermeskerken, Margot; Jarodzka, Halszka; van Gog, Tamara

    2016-01-01

    Eye movement modeling examples (EMME) are demonstrations of a computer-based task by a human model (e.g., a teacher), with the model's eye movements superimposed on the task to guide learners' attention. EMME have been shown to enhance learning of perceptual classification tasks; however, it is an

  5. Initiating an ergonomic analysis. A process for jobs with highly variable tasks.

    Science.gov (United States)

    Conrad, K M; Lavender, S A; Reichelt, P A; Meyer, F T

    2000-09-01

    Occupational health nurses play a vital role in addressing ergonomic problems in the workplace. Describing and documenting exposure to ergonomic risk factors is a relatively straightforward process in jobs in which the work is repetitive. In other types of work, the analysis becomes much more challenging because tasks may be repeated infrequently, or at irregular time intervals, or under different environmental and temporal conditions, thereby making it difficult to observe a "representative" sample of the work performed. This article describes a process used to identify highly variable job tasks for ergonomic analyses. The identification of tasks for ergonomic analysis was a two step process involving interviews and a survey of firefighters and paramedics from a consortium of 14 suburban fire departments. The interviews were used to generate a list of frequently performed, physically strenuous job tasks and to capture clear descriptions of those tasks and associated roles. The goals of the survey were to confirm the interview findings across the entire target population and to quantify the frequency and degree of strenuousness of each task. In turn, the quantitative results from the survey were used to prioritize job tasks for simulation. Although this process was used to study firefighters and paramedics, the approach is likely to be suitable for many other types of occupations in which the tasks are highly variable in content and irregular in frequency.

  6. Interactive computer-enhanced remote viewing system

    Energy Technology Data Exchange (ETDEWEB)

    Tourtellott, J.A.; Wagner, J.F. [Mechanical Technology Incorporated, Latham, NY (United States)

    1995-10-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically.

  7. Abstract quantum computing machines and quantum computational logics

    Science.gov (United States)

    Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto

    2016-06-01

    Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.

  8. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
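
    The core of the localized fitting step is an ordinary straight-line regression of the I-V points near short-circuit, whose covariance yields an uncertainty for Isc. The sketch below illustrates this on synthetic data and shows how the reported uncertainty changes with the size of the fit window; the paper's objective Bayesian window selection is not reproduced here.

    ```python
    # Hedged sketch: straight-line fit of I-V points near short-circuit and the
    # resulting 1-sigma uncertainty in Isc, for fit windows of increasing size.
    import numpy as np

    rng = np.random.default_rng(5)
    v = np.linspace(0.0, 0.1, 21)                  # hypothetical voltages near 0 V
    i_true = 5.0 - 0.8 * v                         # hypothetical linear I-V near Isc
    i_meas = i_true + rng.normal(scale=0.002, size=v.size)

    def isc_fit(v, i):
        """Fit I = a + b*V; return Isc = a and its standard uncertainty from OLS."""
        X = np.column_stack([np.ones_like(v), v])
        coef, res, *_ = np.linalg.lstsq(X, i, rcond=None)
        sigma2 = res[0] / (len(i) - 2)
        cov = sigma2 * np.linalg.inv(X.T @ X)
        return coef[0], np.sqrt(cov[0, 0])

    for n_points in (5, 11, 21):                   # grow the fit window
        isc, u_isc = isc_fit(v[:n_points], i_meas[:n_points])
        print(f"{n_points:2d} points: Isc = {isc:.4f} A, u(Isc) = {u_isc:.4f} A")
    ```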

  9. Computer aided training system development

    International Nuclear Information System (INIS)

    Midkiff, G.N.

    1987-01-01

    The first three phases of Training System Development (TSD) -- job and task analysis, curriculum design, and training material development -- are time consuming and labor intensive. The use of personal computers with a combination of commercial and custom-designed software resulted in a significant reduction in the man-hours required to complete these phases for a Health Physics Technician Training Program at a nuclear power station. This paper reports that each step in the training program project involved the use of personal computers: job survey data were compiled with a statistical package, task analysis was performed with custom software designed to interface with a commercial database management program. Job Performance Measures (tests) were generated by a custom program from data in the task analysis database, and training materials were drafted, edited, and produced using commercial word processing software

  10. Perceived control in rhesus monkeys (Macaca mulatta) - Enhanced video-task performance

    Science.gov (United States)

    Washburn, David A.; Hopkins, William D.; Rumbaugh, Duane M.

    1991-01-01

    This investigation was designed to determine whether perceived control effects found in humans extend to rhesus monkeys (Macaca mulatta) tested in a video-task format, using a computer-generated menu program, SELECT. Choosing one of the options in SELECT resulted in presentation of five trials of a corresponding task and subsequent return to the menu. In Experiments 1-3, the animals exhibited stable, meaningful response patterns in this task (i.e., they made choices). In Experiment 4, performance on tasks that were selected by the animals significantly exceeded performance on identical tasks when assigned by the experimenter under comparable conditions (e.g., time of day, order, variety). The reliable and significant advantage for performance on selected tasks, typically found in humans, suggests that rhesus monkeys were able to perceive the availability of choices.

  11. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  12. Computer Aided Measurement Laser (CAML): technique to quantify post-mastectomy lymphoedema

    International Nuclear Information System (INIS)

    Trombetta, Chiara; Abundo, Paolo; Felici, Antonella; Ljoka, Concetta; Foti, Calogero; Cori, Sandro Di; Rosato, Nicola

    2012-01-01

    Lymphoedema can be a side effect of cancer treatment. Even though several methods for assessing lymphoedema are used in clinical practice, an objective quantification of lymphoedema has been problematic. The aim of the study was to determine the objectivity, reliability and repeatability of the computer aided measurement laser (CAML) technique. The CAML technique is based on computer aided design (CAD) methods and requires an infrared laser scanner. Measurements are scanned and the information describing the size and shape of the limb makes it possible to design the model using the CAD software. The objectivity and repeatability were first established using a phantom. Subsequently, a group of subjects presenting post-breast cancer lymphoedema was evaluated, using the contralateral limb as a control. Results confirmed that in clinical settings the CAML technique is easy to perform, rapid, and provides meaningful data for assessing lymphoedema. Future research will include a comparison of the upper limb CAML technique between healthy subjects and patients with known lymphoedema.

  13. Prospects of a mathematical theory of human behavior in complex man-machine systems tasks. [time sharing computer analogy of automobile driving

    Science.gov (United States)

    Johannsen, G.; Rouse, W. B.

    1978-01-01

    A hierarchy of human activities is derived by analyzing automobile driving in general terms. A structural description leads to a block diagram and a time-sharing computer analogy. The range of applicability of existing mathematical models is considered with respect to the hierarchy of human activities in actual complex tasks. Other mathematical tools so far not often applied to man machine systems are also discussed. The mathematical descriptions at least briefly considered here include utility, estimation, control, queueing, and fuzzy set theory as well as artificial intelligence techniques. Some thoughts are given as to how these methods might be integrated and how further work might be pursued.

  14. Computing bubble-points of CO

    NARCIS (Netherlands)

    Ramdin, M.; Balaji, S.P.; Vicent Luna, J.M.; Torres-Knoop, A; Chen, Q.; Dubbeldam, D.; Calero, S; de Loos, T.W.; Vlugt, T.J.H.

    2016-01-01

    Computing bubble-points of multicomponent mixtures using Monte Carlo simulations is a non-trivial task. A new method is used to compute gas compositions from a known temperature, bubble-point pressure, and liquid composition. Monte Carlo simulations are used to calculate the bubble-points of

  15. Cognitive Approaches for Medicine in Cloud Computing.

    Science.gov (United States)

    Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia

    2018-03-03

    This paper will present the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibilities of using the meaning approach to data description and analysis will be proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing are aimed at supporting the processes of protecting data against unauthorised takeover, and they serve to enhance the data management processes. The accomplishment of the proposed tasks will be the definition of algorithms for the execution of meaning data interpretation processes in safe Cloud Computing. • We proposed cognitive methods for data description. • We proposed techniques for securing data in Cloud Computing. • The application of cognitive approaches to medicine was described.

  16. Engineering and Computing Portal to Solve Environmental Problems

    Science.gov (United States)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes architecture and services of the Engineering and Computing Portal, which is considered to be a complex solution that provides access to high-performance computing resources, enables to carry out computational experiments, teach parallel technologies and solve computing tasks, including technogenic safety ones.

  17. Learning Universal Computations with Spikes

    Science.gov (United States)

    Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin

    2016-01-01

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves as substrates for powerful general purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this makes it possible to learn even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them. PMID:27309381

  18. Soil structure characterized using computed tomographic images

    Science.gov (United States)

    Zhanqi Cheng; Stephen H. Anderson; Clark J. Gantzer; J. W. Van Sambeek

    2003-01-01

    Fractal analysis of soil structure is a relatively new method for quantifying the effects of management systems on soil properties and quality. The objective of this work was to explore several methods of studying images to describe and quantify structure of soils under forest management. This research uses computed tomography and a topological method called Multiple...

  19. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
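
    The Reservoir Computing principle summarized above can be illustrated in software with a tiny echo-state network: a fixed random recurrent reservoir whose only trained part is a linear readout. The sketch below (NumPy, trained by ridge regression to recall an input delayed by a few steps) is a conceptual analogue under assumed parameters, not the optical delay-loop implementation reported in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n_res, T, delay = 100, 1000, 3

        # Random input and recurrent weights; only the readout is trained.
        u = rng.uniform(-1, 1, T)
        W_in = rng.uniform(-0.5, 0.5, n_res)
        W = rng.standard_normal((n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius below 1

        states = np.zeros((T, n_res))
        x = np.zeros(n_res)
        for t in range(T):
            x = np.tanh(W @ x + W_in * u[t])              # nonlinear reservoir update
            states[t] = x

        # Train a linear readout (ridge regression) to recall the input delayed by 3 steps.
        target = np.roll(u, delay)
        X, y = states[delay:], target[delay:]
        ridge = 1e-6
        W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

        pred = X @ W_out
        print("correlation with delayed input:", np.corrcoef(pred, y)[0, 1])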

  20. A memory-array architecture for computer vision

    Energy Technology Data Exchange (ETDEWEB)

    Balsara, P.T.

    1989-01-01

    With the fast advances in the area of computer vision and robotics there is a growing need for machines that can understand images at a very high speed. A conventional von Neumann computer is not suited for this purpose because it takes a tremendous amount of time to solve most typical image processing problems. Exploiting the inherent parallelism present in various vision tasks can significantly reduce the processing time. Fortunately, parallelism is increasingly affordable as hardware gets cheaper. Thus it is now imperative to study computer vision in a parallel processing framework. The approach is first to design a computational structure well suited for a wide range of vision tasks and then to develop parallel algorithms that can run efficiently on this structure. Recent advances in VLSI technology have led to several proposals for parallel architectures for computer vision. In this thesis the author demonstrates that a memory array architecture with efficient local and global communication capabilities can be used for high-speed execution of a wide range of computer vision tasks. This architecture, called the Access Constrained Memory Array Architecture (ACMAA), is efficient for VLSI implementation because of its modular structure, simple interconnect and limited global control. Several parallel vision algorithms have been designed for this architecture. The choice of vision problems demonstrates the versatility of ACMAA for a wide range of vision tasks. These algorithms were simulated on a high-level ACMAA simulator running on the Intel iPSC/2 hypercube, a parallel architecture. The results of this simulation are compared with those of sequential algorithms running on a single hypercube node. Details of the ACMAA processor architecture are also presented.

  1. Quantifying the association between white matter integrity changes and subconcussive head impact exposure from a single season of youth and high school football using 3D convolutional neural networks

    Science.gov (United States)

    Saghafi, Behrouz; Murugesan, Gowtham; Davenport, Elizabeth; Wagner, Ben; Urban, Jillian; Kelley, Mireille; Jones, Derek; Powers, Alexander; Whitlow, Christopher; Stitzel, Joel; Maldjian, Joseph; Montillo, Albert

    2018-02-01

    The effect of subconcussive head impact exposure during contact sports, including American football, on brain health is poorly understood, particularly in young and adolescent players, who may be more vulnerable to brain injury during periods of rapid brain maturation. This study aims to quantify the association between the cumulative head impact exposure from a single season of football and white matter (WM) integrity as measured with diffusion MRI. The study targets football players aged 9-18 years old. All players were imaged pre- and post-season with structural MRI and diffusion tensor MRI (DTI). Fractional Anisotropy (FA) maps, shown to be closely correlated with WM integrity, were computed for each subject, co-registered and subtracted to compute the change in FA per subject. Biomechanical metrics were collected at every practice and game using helmet mounted accelerometers. Each head impact was converted into a risk of concussion, and the risk of concussion-weighted cumulative exposure (RWE) was computed for each player for the season. Athletes with high and low RWE were selected for a two-category classification task. This task was addressed by developing a 3D Convolutional Neural Network (CNN) to automatically classify players into high and low impact exposure groups from the change in FA maps. Using the proposed model, high classification performance was achieved, including an ROC area under the curve of 85.71% and an F1 score of 83.33%. This work adds to the growing body of evidence for the presence of detectable neuroimaging brain changes in white matter integrity from a single season of contact sports play, even in the absence of a clinically diagnosed concussion.
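
    To make the classification setup concrete, the following is a minimal 3D CNN sketch in PyTorch that maps a change-in-FA volume to two logits (high vs. low risk-weighted exposure). The layer sizes, input resolution and all other details are illustrative assumptions, not the architecture reported in the study.

        import torch
        import torch.nn as nn

        class FAChangeClassifier(nn.Module):
            """Toy 3D CNN: input is a change-in-FA volume, output is 2 logits
            (high vs. low risk-weighted cumulative exposure)."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool3d(2),
                    nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool3d(2),
                    nn.AdaptiveAvgPool3d(1),   # global pooling -> (N, 16, 1, 1, 1)
                )
                self.classifier = nn.Linear(16, 2)

            def forward(self, x):
                h = self.features(x).flatten(1)
                return self.classifier(h)

        # One hypothetical delta-FA volume: batch of 1, single channel, 64^3 voxels.
        model = FAChangeClassifier()
        logits = model(torch.randn(1, 1, 64, 64, 64))
        print(logits.shape)  # torch.Size([1, 2])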

  2. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in the last decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact of uninterrupted work with computer screens or displays on visual health have attracted researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to considerable effort, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of these are blurred vision, visual fatigue and Dry Eye Syndrome (DES) due to inadequate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to computer displays, with the main purpose of comparing the differences in temperature variation of healthy ocular surfaces.

  3. Distinguishing bias from sensitivity effects in multialternative detection tasks.

    Science.gov (United States)

    Sridharan, Devarajan; Steinmetz, Nicholas A; Moore, Tirin; Knudsen, Eric I

    2014-08-21

    Studies investigating the neural bases of cognitive phenomena increasingly employ multialternative detection tasks that seek to measure the ability to detect a target stimulus or changes in some target feature (e.g., orientation or direction of motion) that could occur at one of many locations. In such tasks, it is essential to distinguish the behavioral and neural correlates of enhanced perceptual sensitivity from those of increased bias for a particular location or choice (choice bias). However, making such a distinction is not possible with established approaches. We present a new signal detection model that decouples the behavioral effects of choice bias from those of perceptual sensitivity in multialternative (change) detection tasks. By formulating the perceptual decision in a multidimensional decision space, our model quantifies the respective contributions of bias and sensitivity to multialternative behavioral choices. With a combination of analytical and numerical approaches, we demonstrate an optimal, one-to-one mapping between model parameters and choice probabilities even for tasks involving arbitrarily large numbers of alternatives. We validated the model with published data from two ternary choice experiments: a target-detection experiment and a length-discrimination experiment. The results of this validation provided novel insights into perceptual processes (sensory noise and competitive interactions) that can accurately and parsimoniously account for observers' behavior in each task. The model will find important application in identifying and interpreting the effects of behavioral manipulations (e.g., cueing attention) or neural perturbations (e.g., stimulation or inactivation) in a variety of multialternative tasks of perception, attention, and decision-making. © 2014 ARVO.

  4. Microcomputers, desk calculators and process computers for use in radiation protection

    International Nuclear Information System (INIS)

    Burgkhardt, B.; Nolte, G.; Schollmeier, W.; Rau, G.

    1983-01-01

    The goals achievable, or to be pursued, in radiation protection measurement and evaluation by using computers are explained. As there is a large variety of computers available, offering a likewise large variety of performance levels, the use of a computer is justified even for minor measuring and evaluation tasks. The subdivision into microcomputers as an installed part of measuring equipment, measuring and evaluation systems with desk calculators, and measuring and evaluation systems with process computers is made to explain the importance and extent of the measuring or evaluation tasks and the computing devices suitable for the various purposes. The special requirements to be met in order to fulfill the different tasks are discussed, both in terms of hardware and software and in terms of skill and knowledge of the personnel, and are illustrated by an example showing the usefulness of computers in radiation protection. (orig./HP) [de

  5. Method and Apparatus for Performance Optimization Through Physical Perturbation of Task Elements

    Science.gov (United States)

    Prinzel, Lawrence J., III (Inventor); Pope, Alan T. (Inventor); Palsson, Olafur S. (Inventor); Turner, Marsha J. (Inventor)

    2016-01-01

    The invention is an apparatus and method of biofeedback training for attaining a physiological state optimally consistent with the successful performance of a task, wherein the probability of successfully completing the task is made inversely proportional to a physiological difference value, computed as the absolute value of the difference between at least one physiological signal optimally consistent with the successful performance of the task and at least one corresponding measured physiological signal of a trainee performing the task. The probability of successfully completing the task is made inversely proportional to the physiological difference value by making one or more measurable physical attributes of the environment in which the task is performed, and upon which completion of the task depends, vary in inverse proportion to the physiological difference value.

  6. Successfully Carrying out Complex Learning-Tasks through Guiding Teams' Qualitative and Quantitative Reasoning

    Science.gov (United States)

    Slof, B.; Erkens, G.; Kirschner, P. A.; Janssen, J.; Jaspers, J. G. M.

    2012-01-01

    This study investigated whether and how scripting learners' use of representational tools in a computer supported collaborative learning (CSCL)-environment fostered their collaborative performance on a complex business-economics task. Scripting the problem-solving process sequenced and made its phase-related part-task demands explicit, namely…

  7. Task exposures in an office environment: a comparison of methods.

    Science.gov (United States)

    Van Eerd, Dwayne; Hogg-Johnson, Sheilah; Mazumder, Anjali; Cole, Donald; Wells, Richard; Moore, Anne

    2009-10-01

    Task-related factors such as frequency and duration are associated with musculoskeletal disorders in office settings. The primary objective was to compare various task recording methods as measures of exposure in an office workplace. A total of 41 workers from different jobs were recruited from a large urban newspaper (71% female, mean age 41 years, SD 9.6). Questionnaire, task diaries, direct observation and video methods were used to record tasks. A common set of task codes was used across methods. Different estimates of task duration, number of tasks and task transitions arose from the different methods. Self-report methods did not consistently result in longer task duration estimates. Methodological issues could explain some of the differences in estimates seen between methods. It was concluded that different task recording methods result in different estimates of exposure, likely due to the different exposure constructs they measure. This work addresses issues of exposure measurement in office environments. It is of relevance to ergonomists/researchers interested in how to best assess the risk of injury among office workers. The paper discusses the trade-offs between precision, accuracy and burden in the collection of computer task-based exposure measures and the different underlying constructs captured by each method.

  8. Review on Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    P. Sumithra

    2017-03-01

    Computational electromagnetics (CEM) is applied to model the interaction of electromagnetic fields with objects such as antennas, waveguides and aircraft, and with their environment, using Maxwell's equations. In this paper the strengths and weaknesses of various computational electromagnetic techniques are discussed. The performance of the various techniques in terms of accuracy, memory and computational time for application-specific tasks, such as modeling RCS (radar cross section), space applications, thin wires and antenna arrays, is presented.

  9. Optimizing the number of steps in learning tasks for complex skills.

    Science.gov (United States)

    Nadolski, Rob J; Kirschner, Paul A; van Merriënboer, Jeroen J G

    2005-06-01

    Carrying out whole tasks is often too difficult for novice learners attempting to acquire complex skills. The common solution is to split up the tasks into a number of smaller steps. The number of steps must be optimized for efficient and effective learning. The aim of the study is to investigate the relation between the number of steps provided to learners and the quality of their learning of complex skills. It is hypothesized that students receiving an optimized number of steps will learn better than those receiving either the whole task in only one step or those receiving a large number of steps. Participants were 35 sophomore law students studying at Dutch universities, mean age=22.8 years (SD=3.5), 63% were female. Participants were randomly assigned to 1 of 3 computer-delivered versions of a multimedia programme on how to prepare and carry out a law plea. The versions differed only in the number of learning steps provided. Videotaped plea-performance results were determined, various related learning measures were acquired and all computer actions were logged and analyzed. Participants exposed to an intermediate (i.e. optimized) number of steps outperformed all others on the compulsory learning task. No differences in performance on a transfer task were found. A high number of steps proved to be less efficient for carrying out the learning task. An intermediate number of steps is the most effective, proving that the number of steps can be optimized for improving learning.

  10. Neural basis for generalized quantifier comprehension.

    Science.gov (United States)

    McMillan, Corey T; Clark, Robin; Moore, Peachie; Devita, Christian; Grossman, Murray

    2005-01-01

    Generalized quantifiers like "all cars" are semantically well understood, yet we know little about their neural representation. Our model of quantifier processing includes a numerosity device, operations that combine number elements and working memory. Semantic theory posits two types of quantifiers: first-order quantifiers identify a number state (e.g. "at least 3") and higher-order quantifiers additionally require maintaining a number state actively in working memory for comparison with another state (e.g. "less than half"). We used BOLD fMRI to test the hypothesis that all quantifiers recruit inferior parietal cortex associated with numerosity, while only higher-order quantifiers recruit prefrontal cortex associated with executive resources like working memory. Our findings showed that first-order and higher-order quantifiers both recruit right inferior parietal cortex, suggesting that a numerosity component contributes to quantifier comprehension. Moreover, only probes of higher-order quantifiers recruited right dorsolateral prefrontal cortex, suggesting involvement of executive resources like working memory. We also observed activation of thalamus and anterior cingulate that may be associated with selective attention. Our findings are consistent with a large-scale neural network centered in frontal and parietal cortex that supports comprehension of generalized quantifiers.
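
    The contrast between first-order and higher-order quantifiers can be made concrete with a small illustrative sketch (not from the paper): verifying "at least 3" only needs a running count, whereas "less than half" requires holding that count for comparison against a second quantity, which is the extra working-memory demand the study associates with prefrontal recruitment.

        def at_least_3(items, predicate):
            """First-order: succeed as soon as a fixed number state (3) is reached."""
            count = 0
            for x in items:
                if predicate(x):
                    count += 1
                    if count >= 3:
                        return True
            return False

        def less_than_half(items, predicate):
            """Higher-order: the count must be maintained and compared with
            another quantity (the total), i.e. a second number state."""
            items = list(items)
            count = sum(1 for x in items if predicate(x))
            return count < len(items) / 2

        cars = ["red", "blue", "red", "green", "red"]
        print(at_least_3(cars, lambda c: c == "red"))       # True
        print(less_than_half(cars, lambda c: c == "blue"))  # True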

  11. A dual-task investigation of automaticity in visual word processing

    Science.gov (United States)

    McCann, R. S.; Remington, R. W.; Van Selst, M.

    2000-01-01

    An analysis of activation models of visual word processing suggests that frequency-sensitive forms of lexical processing should proceed normally while unattended. This hypothesis was tested by having participants perform a speeded pitch discrimination task followed by lexical decisions or word naming. As the stimulus onset asynchrony between the tasks was reduced, lexical-decision and naming latencies increased dramatically. Word-frequency effects were additive with the increase, indicating that frequency-sensitive processing was subject to postponement while attention was devoted to the other task. Either (a) the same neural hardware shares responsibility for lexical processing and central stages of choice reaction time task processing and cannot perform both computations simultaneously, or (b) lexical processing is blocked in order to optimize performance on the pitch discrimination task. Either way, word processing is not as automatic as activation models suggest.

  12. Evaluation of subjective image quality in relation to diagnostic task for cone beam computed tomography with different fields of view.

    Science.gov (United States)

    Lofthag-Hansen, Sara; Thilander-Klang, Anne; Gröndahl, Kerstin

    2011-11-01

    To evaluate subjective image quality for two diagnostic tasks, periapical diagnosis and implant planning, for cone beam computed tomography (CBCT) using different exposure parameters and fields of view (FOVs). Examinations were performed in the posterior part of the jaws on a skull phantom with 3D Accuitomo (FOV 3 cm×4 cm) and 3D Accuitomo FPD (FOVs 4 cm×4 cm and 6 cm×6 cm). All combinations of 60, 65, 70, 75, 80 kV and 2, 4, 6, 8, 10 mA with a rotation of 180° and 360° were used. Dose-area product (DAP) value was determined for each combination. The images were presented, displaying the object in axial, cross-sectional and sagittal views, without scanning data, in a random order for each FOV and jaw. Seven observers assessed image quality on a six-point rating scale. Intra-observer agreement was good (κw=0.76) and inter-observer agreement moderate (κw=0.52). Stepwise logistic regression showed kV, mA and diagnostic task to be the most important variables. Periapical diagnosis, regardless of the jaw, required higher exposure parameters compared to implant planning. Implant planning in the lower jaw required higher exposure parameters compared to the upper jaw. Overall ranking of FOVs gave 4 cm×4 cm, 6 cm×6 cm followed by 3 cm×4 cm. This study has shown that exposure parameters should be adjusted according to diagnostic task. For this particular CBCT brand a rotation of 180° gave good subjective image quality, hence a substantial dose reduction can be achieved without loss of diagnostic information. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  13. Evaluation of subjective image quality in relation to diagnostic task for cone beam computed tomography with different fields of view

    International Nuclear Information System (INIS)

    Lofthag-Hansen, Sara; Thilander-Klang, Anne; Groendahl, Kerstin

    2011-01-01

    Aims: To evaluate subjective image quality for two diagnostic tasks, periapical diagnosis and implant planning, for cone beam computed tomography (CBCT) using different exposure parameters and fields of view (FOVs). Materials and methods: Examinations were performed in the posterior part of the jaws on a skull phantom with 3D Accuitomo (FOV 3 cm × 4 cm) and 3D Accuitomo FPD (FOVs 4 cm × 4 cm and 6 cm × 6 cm). All combinations of 60, 65, 70, 75, 80 kV and 2, 4, 6, 8, 10 mA with a rotation of 180° and 360° were used. Dose-area product (DAP) value was determined for each combination. The images were presented, displaying the object in axial, cross-sectional and sagittal views, without scanning data, in a random order for each FOV and jaw. Seven observers assessed image quality on a six-point rating scale. Results: Intra-observer agreement was good (κw = 0.76) and inter-observer agreement moderate (κw = 0.52). Stepwise logistic regression showed kV, mA and diagnostic task to be the most important variables. Periapical diagnosis, regardless of the jaw, required higher exposure parameters compared to implant planning. Implant planning in the lower jaw required higher exposure parameters compared to the upper jaw. Overall ranking of FOVs gave 4 cm × 4 cm, 6 cm × 6 cm followed by 3 cm × 4 cm. Conclusions: This study has shown that exposure parameters should be adjusted according to diagnostic task. For this particular CBCT brand a rotation of 180° gave good subjective image quality, hence a substantial dose reduction can be achieved without loss of diagnostic information.

  14. Empowering Middle School Teachers with Portable Computers.

    Science.gov (United States)

    Weast, Jerry D.; And Others

    1993-01-01

    A Sioux Falls (South Dakota) project that supplied middle school teachers with Macintosh computers and training to use them showed gratifying results. Easy access to portable notebook computers made teachers more active computer users, increased teacher interaction and collaboration, enhanced teacher productivity regarding management tasks and…

  15. Longitudinal effects of bilingualism on dual-tasking.

    Science.gov (United States)

    Sörman, Daniel Eriksson; Josefsson, Maria; Marsh, John E; Hansson, Patrik; Ljungberg, Jessica K

    2017-01-01

    An ongoing debate surrounds whether bilinguals outperform monolinguals in tests of executive processing. The aim of this study was to investigate if there are long-term (10 year) bilingual advantages in executive processing, as indexed by dual-task performance, in a sample that were 40-65 years at baseline. The bilingual (n = 24) and monolingual (n = 24) participants were matched on age, sex, education, fluid intelligence, and study sample. Participants performed free-recall for a 12-item list in three dual-task settings wherein they sorted cards either during encoding, retrieval, or during both encoding and retrieval of the word-list. Free recall without card sorting was used as a reference to compute dual-task costs. The results showed that bilinguals significantly outperformed monolinguals when they performed card-sorting during both encoding and retrieval of the word-list, the condition that presumably placed the highest demands on executive functioning. However, dual-task costs increased over time for bilinguals relative to monolinguals, a finding that is possibly influenced by retirement age and limited use of second language in the bilingual group.

  16. Task distribution mechanism for effective collaboration in virtual environments

    International Nuclear Information System (INIS)

    Khalid, S.; Ullah, S.; Alam, A.

    2016-01-01

    Collaborative Virtual Environments (CVEs) are computer-generated worlds where two or more users can simultaneously interact with synthetic objects to perform a task. User performance is one of the main issues and is impaired by loose coordination, limited awareness or poor communication among collaborating users. In this paper, a new model for task distribution is proposed, in which the task distribution strategy among multiple users in CVEs is defined. The model assigns the task to collaborating users in CVEs on either a static or a dynamic basis. In static distribution users are loosely dependent on one another and require less communication during task realization, whereas in dynamic distribution users are more dependent on each other and thus require more communication. In order to study the effect of static and dynamic task distribution strategies on users' performance in CVEs, a collaborative virtual environment was developed in which twenty-four (24) teams (each consisting of two users) performed a task in collaboration under both strategies (static and dynamic). Results reveal that static distribution is more effective and increases users' performance in CVEs. The outcome of this work will help the development of effective CVEs in the fields of virtual assembly, repair, education and entertainment. (author)

  17. Comparison of continuously acquired resting state and extracted analogues from active tasks.

    Science.gov (United States)

    Ganger, Sebastian; Hahn, Andreas; Küblböck, Martin; Kranz, Georg S; Spies, Marie; Vanicek, Thomas; Seiger, René; Sladky, Ronald; Windischberger, Christian; Kasper, Siegfried; Lanzenberger, Rupert

    2015-10-01

    Functional connectivity analysis of brain networks has become an important tool for investigation of human brain function. Although functional connectivity computations are usually based on resting-state data, the application to task-specific fMRI has received growing attention. Three major methods for extraction of resting-state data from task-related signal have been proposed: (1) usage of unmanipulated task data for functional connectivity; (2) regression against task effects, subsequently using the residuals; and (3) concatenation of baseline blocks located in-between task blocks. Despite widespread application in current research, consensus on which method best resembles resting-state seems to be missing. We, therefore, evaluated these techniques in a sample of 26 healthy controls measured at 7 Tesla. In addition to continuous resting-state, two different task paradigms were assessed (emotion discrimination and right finger-tapping) and five well-described networks were analyzed (default mode, thalamus, cuneus, sensorimotor, and auditory). Investigating the similarity to continuous resting-state (Dice, Intraclass correlation coefficient (ICC), R(2)) showed that regression against task effects yields functional connectivity networks most alike to resting-state. However, all methods exhibited significant differences when compared to continuous resting-state, and similarity metrics were lower than test-retest of two resting-state scans. Omitting global signal regression did not change these findings. Visually, the networks are highly similar, but through further investigation marked differences can be found. Therefore, our data do not support referring to resting-state when extracting signals from task designs, although functional connectivity computed from task-specific data may indeed yield interesting information. © 2015 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
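
    A minimal numerical sketch of method (2) above, regressing task effects out of each region's time series and computing functional connectivity on the residuals; the variable names, block design and data are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)
        n_timepoints, n_regions = 200, 5
        Y = rng.standard_normal((n_timepoints, n_regions))                   # region time series
        task_regressor = (np.arange(n_timepoints) % 40 < 20).astype(float)   # toy block design

        # Design matrix: task regressor plus an intercept.
        X = np.column_stack([task_regressor, np.ones(n_timepoints)])

        # Ordinary least squares fit per region; residuals = task-cleaned signal.
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        residuals = Y - X @ beta

        # Functional connectivity as pairwise correlation of the residuals.
        fc_matrix = np.corrcoef(residuals, rowvar=False)
        print(fc_matrix.shape)   # (5, 5)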

  18. Robust visual tracking via structured multi-task sparse learning

    KAUST Repository

    Zhang, Tianzhu

    2012-11-09

    In this paper, we formulate object tracking in a particle filter framework as a structured multi-task sparse learning problem, which we denote as Structured Multi-Task Tracking (S-MTT). Since we model particles as linear combinations of dictionary templates that are updated dynamically, learning the representation of each particle is considered a single task in Multi-Task Tracking (MTT). By employing popular sparsity-inducing ℓp,q mixed norms (specifically p ∈ {2, ∞} and q = 1), we regularize the representation problem to enforce joint sparsity and learn the particle representations together. As compared to previous methods that handle particles independently, our results demonstrate that mining the interdependencies between particles improves tracking performance and overall computational complexity. Interestingly, we show that the popular L1 tracker (Mei and Ling, IEEE Trans Pattern Anal Mach Intel 33(11):2259-2272, 2011) is a special case of our MTT formulation (denoted as the L11 tracker) when p = q = 1. Under the MTT framework, some of the tasks (particle representations) are often more closely related and more likely to share common relevant covariates than other tasks. Therefore, we extend the MTT framework to take into account pairwise structural correlations between particles (e.g. spatial smoothness of representation) and denote the novel framework as S-MTT. The problem of learning the regularized sparse representation in MTT and S-MTT can be solved efficiently using an Accelerated Proximal Gradient (APG) method that yields a sequence of closed form updates. As such, S-MTT and MTT are computationally attractive. We test our proposed approach on challenging sequences involving heavy occlusion, drastic illumination changes, and large pose variations. Experimental results show that S-MTT is much better than MTT, and both methods consistently outperform state-of-the-art trackers. © 2012 Springer Science+Business Media New York.
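
    To make the mixed-norm regularization concrete, the following small sketch (not the authors' code) shows the ℓ2,1 proximal operator, i.e. the row-wise group soft-thresholding step that an APG-style solver applies to the matrix of particle representations when p = 2 and q = 1.

        import numpy as np

        def prox_l21(X, lam):
            """Proximal operator of lam * ||X||_{2,1}: shrink each row of X
            towards zero by lam in Euclidean norm (group soft-thresholding)."""
            norms = np.linalg.norm(X, axis=1, keepdims=True)
            scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
            return X * scale

        X = np.array([[3.0, 4.0],    # row norm 5    -> shrunk to norm 4
                      [0.1, 0.1]])   # row norm ~0.14 -> set to zero
        print(prox_l21(X, lam=1.0))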

  19. Integration of active pauses and pattern of muscular activity during computer work.

    Science.gov (United States)

    St-Onge, Nancy; Samani, Afshin; Madeleine, Pascal

    2017-09-01

    Submaximal isometric muscle contractions have been reported to increase variability of muscle activation during computer work; however, other types of active contractions may be more beneficial. Our objective was to determine which type of active pause vs. rest is more efficient in changing muscle activity pattern during a computer task. Asymptomatic regular computer users performed a standardised 20-min computer task four times, integrating a different type of pause: sub-maximal isometric contraction, dynamic contraction, postural exercise and rest. Surface electromyographic (SEMG) activity was recorded bilaterally from five neck/shoulder muscles. Root-mean-square decreased with isometric pauses in the cervical paraspinals, upper trapezius and middle trapezius, whereas it increased with rest. Variability in the pattern of muscular activity was not affected by any type of pause. Overall, no detrimental effects on the level of SEMG during active pauses were found suggesting that they could be implemented without a cost on activation level or variability. Practitioner Summary: We aimed to determine which type of active pause vs. rest is best in changing muscle activity pattern during a computer task. Asymptomatic computer users performed a standardised computer task integrating different types of pauses. Muscle activation decreased with isometric pauses in neck/shoulder muscles, suggesting their implementation during computer work.

  20. Interactive computer-enhanced remote viewing system

    International Nuclear Information System (INIS)

    Tourtellott, J.A.; Wagner, J.F.

    1995-01-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically.

  1. Multi-task learning with group information for human action recognition

    Science.gov (United States)

    Qian, Li; Wu, Song; Pu, Nan; Xu, Shulin; Xiao, Guoqiang

    2018-04-01

    Human action recognition is an important and challenging task in computer vision research, due to the variations in human motion performance, interpersonal differences and recording settings. In this paper, we propose a novel multi-task learning framework with group information (MTL-GI) for accurate and efficient human action recognition. Specifically, we first obtain group information by calculating the mutual information reflecting the latent relationship between Gaussian components and action categories, and by clustering similar action categories into the same group through affinity propagation clustering. Additionally, in order to explore the relationships between related tasks, we incorporate group information into multi-task learning. Experimental results evaluated on two popular benchmarks (UCF50 and HMDB51 datasets) demonstrate the superiority of our proposed MTL-GI framework.
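
    A rough sketch of the grouping step described above: clustering action categories with affinity propagation from a pairwise similarity matrix. The mutual-information computation and the multi-task learner itself are omitted, and the synthetic block-structured similarities below are assumptions for illustration.

        import numpy as np
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(0)
        n_categories = 10

        # Stand-in for a mutual-information-based similarity between action categories:
        # two latent groups (0-4 and 5-9) with higher within-group similarity.
        S = rng.uniform(0.0, 0.2, (n_categories, n_categories))
        S[:5, :5] += 0.6
        S[5:, 5:] += 0.6
        S = (S + S.T) / 2.0          # symmetrise
        np.fill_diagonal(S, 1.0)

        # Affinity propagation on the precomputed similarity matrix.
        ap = AffinityPropagation(affinity="precomputed", random_state=0)
        group_labels = ap.fit_predict(S)
        print(group_labels)          # group index per action category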

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  3. Quantitative evaluation of muscle synergy models: a single-trial task decoding approach.

    Science.gov (United States)

    Delis, Ioannis; Berret, Bastien; Pozzo, Thierry; Panzeri, Stefano

    2013-01-01

    Muscle synergies, i.e., invariant coordinated activations of groups of muscles, have been proposed as building blocks that the central nervous system (CNS) uses to construct the patterns of muscle activity utilized for executing movements. Several efficient dimensionality reduction algorithms that extract putative synergies from electromyographic (EMG) signals have been developed. Typically, the quality of synergy decompositions is assessed by computing the Variance Accounted For (VAF). Yet, little is known about the extent to which the combination of those synergies encodes task-discriminating variations of muscle activity in individual trials. To address this question, here we conceive and develop a novel computational framework to evaluate muscle synergy decompositions in task space. Unlike previous methods considering the total variance of muscle patterns (VAF based metrics), our approach focuses on variance discriminating execution of different tasks. The procedure is based on single-trial task decoding from muscle synergy activation features. The task decoding based metric evaluates quantitatively the mapping between synergy recruitment and task identification and automatically determines the minimal number of synergies that captures all the task-discriminating variability in the synergy activations. In this paper, we first validate the method on plausibly simulated EMG datasets. We then show that it can be applied to different types of muscle synergy decomposition and illustrate its applicability to real data by using it for the analysis of EMG recordings during an arm pointing task. We find that time-varying and synchronous synergies with similar number of parameters are equally efficient in task decoding, suggesting that in this experimental paradigm they are equally valid representations of muscle synergies. Overall, these findings stress the effectiveness of the decoding metric in systematically assessing muscle synergy decompositions in task space.
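
    As a concrete reference point for the VAF metric mentioned above, here is a brief sketch (not the authors' pipeline) that extracts synchronous synergies from a stand-in EMG matrix with non-negative matrix factorization and computes the variance accounted for by the reconstruction.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        emg = np.abs(rng.standard_normal((100, 12)))   # 100 samples x 12 muscles (stand-in data)

        n_synergies = 4
        model = NMF(n_components=n_synergies, init="nndsvda", max_iter=500, random_state=0)
        activations = model.fit_transform(emg)         # synergy activation coefficients
        synergies = model.components_                  # muscle weightings per synergy

        reconstruction = activations @ synergies
        vaf = 1.0 - np.sum((emg - reconstruction) ** 2) / np.sum(emg ** 2)
        print(f"VAF with {n_synergies} synergies: {vaf:.3f}")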

  4. Convolutional Deep Belief Networks for Single-Cell/Object Tracking in Computational Biology and Computer Vision.

    Science.gov (United States)

    Zhong, Bineng; Pan, Shengnan; Zhang, Hongbo; Wang, Tian; Du, Jixiang; Chen, Duansheng; Cao, Liujuan

    2016-01-01

    In this paper, we propose deep architecture to dynamically learn the most discriminative features from data for both single-cell and object tracking in computational biology and computer vision. Firstly, the discriminative features are automatically learned via a convolutional deep belief network (CDBN). Secondly, we design a simple yet effective method to transfer features learned from CDBNs on the source tasks for generic purpose to the object tracking tasks using only limited amount of training data. Finally, to alleviate the tracker drifting problem caused by model updating, we jointly consider three different types of positive samples. Extensive experiments validate the robustness and effectiveness of the proposed method.

  5. Hybrid glowworm swarm optimization for task scheduling in the cloud environment

    Science.gov (United States)

    Zhou, Jing; Dong, Shoubin

    2018-06-01

    In recent years many heuristic algorithms have been proposed to solve task scheduling problems in the cloud environment owing to their optimization capability. This article proposes a hybrid glowworm swarm optimization (HGSO) based on glowworm swarm optimization (GSO), which uses a technique of evolutionary computation, a strategy of quantum behaviour based on the principle of neighbourhood, offspring production and random walk, to achieve more efficient scheduling with reasonable scheduling costs. The proposed HGSO reduces the redundant computation and the dependence on the initialization of GSO, accelerates the convergence and more easily escapes from local optima. The conducted experiments and statistical analysis showed that in most cases the proposed HGSO algorithm outperformed previous heuristic algorithms to deal with independent tasks.
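
    For readers unfamiliar with the underlying metaheuristic, the following is a simplified sketch of the basic GSO building blocks (luciferin update and probabilistic movement towards brighter neighbours) on a toy maximization problem. The quantum-behaviour, offspring-production and random-walk extensions that define HGSO are not included, and all parameter values are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def fitness(x):                       # toy objective to maximise
            return -np.sum((x - 1.0) ** 2, axis=-1)

        n, dim, iters = 30, 2, 100
        step, rho, gamma, radius = 0.05, 0.4, 0.6, 1.0

        pos = rng.uniform(-3, 3, size=(n, dim))
        luciferin = np.full(n, 5.0)

        for _ in range(iters):
            luciferin = (1 - rho) * luciferin + gamma * fitness(pos)
            new_pos = pos.copy()
            for i in range(n):
                dist = np.linalg.norm(pos - pos[i], axis=1)
                neighbours = np.where((dist < radius) & (luciferin > luciferin[i]))[0]
                if neighbours.size == 0:
                    continue
                # Move towards a brighter neighbour, chosen with probability
                # proportional to the luciferin difference.
                p = luciferin[neighbours] - luciferin[i]
                j = rng.choice(neighbours, p=p / p.sum())
                direction = (pos[j] - pos[i]) / (dist[j] + 1e-12)
                new_pos[i] = pos[i] + step * direction
            pos = new_pos

        print("best position:", pos[np.argmax(fitness(pos))])   # approaches (1, 1)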

  6. Emergence of motor synergy in vertical reaching task via tacit learning.

    Science.gov (United States)

    Hayashibe, Mitsuhiro; Shimoda, Shingo

    2013-01-01

    The dynamics of multijoint limbs often causes complex dynamic interaction torques, which are the inertial effects of the motion of other joints. It is known that the cerebellum plays an important role in motor learning by developing internal models. In this paper, we propose a novel computational control paradigm for a vertical reaching task that involves the management of interaction torques and gravitational effects. The obtained results demonstrate that the proposed method is valid for acquiring motor synergy in a system with actuation redundancy and yields energy-efficient solutions. It is highlighted that tacit learning in the vertical reaching task can provide computational adaptability and optimality with a model-free and cost-function-free approach, in contrast to previous studies.

  7. Mental workload during brain-computer interface training.

    Science.gov (United States)

    Felton, Elizabeth A; Williams, Justin C; Vanderheiden, Gregg C; Radwin, Robert G

    2012-01-01

    It is not well understood how people perceive the difficulty of performing brain-computer interface (BCI) tasks, which specific aspects of mental workload contribute the most, and whether there is a difference in perceived workload between participants who are able-bodied and disabled. This study evaluated mental workload using the NASA Task Load Index (TLX), a multi-dimensional rating procedure with six subscales: Mental Demands, Physical Demands, Temporal Demands, Performance, Effort, and Frustration. Able-bodied and motor disabled participants completed the survey after performing EEG-based BCI Fitts' law target acquisition and phrase spelling tasks. The NASA-TLX scores were similar for able-bodied and disabled participants. For example, overall workload scores (range 0-100) for 1D horizontal tasks were 48.5 (SD = 17.7) and 46.6 (SD 10.3), respectively. The TLX can be used to inform the design of BCIs that will have greater usability by evaluating subjective workload between BCI tasks, participant groups, and control modalities. Mental workload of brain-computer interfaces (BCI) can be evaluated with the NASA Task Load Index (TLX). The TLX is an effective tool for comparing subjective workload between BCI tasks, participant groups (able-bodied and disabled), and control modalities. The data can inform the design of BCIs that will have greater usability.
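
    For reference, an overall workload figure on the 0-100 scale can be derived from the six subscales in the unweighted "raw TLX" manner sketched below; the study does not state whether weighted or raw scoring was used, and the ratings shown are hypothetical, not study data.

        # Six NASA-TLX subscale ratings on a 0-100 scale (hypothetical respondent).
        tlx = {
            "mental_demand": 70, "physical_demand": 20, "temporal_demand": 55,
            "performance": 40, "effort": 65, "frustration": 41,
        }

        raw_tlx = sum(tlx.values()) / len(tlx)   # unweighted (raw) overall workload
        print(f"Overall workload (raw TLX): {raw_tlx:.1f}")  # 48.5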

  8. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and there are now several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify the different GPU general-purpose computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA and present the benefits of the CUDA programming model. We also compare the two main approaches, CUDA and AMD APP (Stream), with the new framework, OpenCL, which tries to unify the GPGPU computing models.

  9. Productivity associated with visual status of computer users.

    Science.gov (United States)

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
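
    The cost-benefit figure cited above follows from simple arithmetic on the numbers given in the abstract: a 2.5% productivity gain on a 25,000 dollar annual salary set against a 268 dollar correction cost, as in the short sketch below.

        salary = 25_000.0          # annual salary (dollars)
        productivity_gain = 0.025  # conservative 2.5% improvement with proper correction
        correction_cost = 268.0    # total cost of computer-specific eyewear (dollars)

        benefit = salary * productivity_gain          # 625 dollars per year
        cost_benefit_ratio = benefit / correction_cost
        print(f"benefit = {benefit:.0f}, ratio = {cost_benefit_ratio:.1f}")  # ratio ≈ 2.3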

  10. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use, whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new system concepts made possible by the hardware development are explained. (orig./AK) [de

  11. An application for multi-person task synchronization

    Science.gov (United States)

    Brown, Robert L.; Doyle, Dee

    1990-01-01

    Computer applications are studied that will enable a group of people to synchronize their actions when following a predefined task sequence. It is assumed that the people involved only have computer workstations available to them for communication. Hence, the approach is to study how the computer can be used to help a group remain synchronized. A series of applications were designed and developed that can be used as vehicles for experimentation. An example of how this technique can be used for a remote coaching capability is explained in a report describing an experiment that simulated a Life Sciences experiment on-board Space Station Freedom, with a ground based principal investigator providing the expertise by coaching the on-orbit mission specialist.

  12. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  13. PWR FLECHT SEASET 21-rod bundle flow blockage task. Task plan report. FLECHT SEASET Program report No. 5

    International Nuclear Information System (INIS)

    Hochreiter, L.E.; Basel, R.A.; Dennis, R.J.; Lee, N.; Massie, H.W. Jr.; Loftus, M.J.; Rosal, E.R.; Valkovic, M.M.

    1980-10-01

    This report presents a descriptive plan of tests for the 21-Rod Bundle Flow Blockage Task of the Full-Length Emergency Cooling Heat Transfer Separate Effects and Systems Effects Test Program (FLECHT SEASET). This task will consist of forced and gravity reflooding tests utilizing electrical heater rods to simulate PWR nuclear core fuel rod arrays. All tests will be performed with a cosine axial power profile. These tests are planned to be used to determine effects of various flow blockage configurations (shapes and distributions) on reflooding behavior, to aid in development/assessment of computational models in predicting reflooding behavior of flow blockage configurations, and to screen flow blockage configurations for future 161-rod flow blockage bundle tests

  14. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment

    Directory of Open Access Journals (Sweden)

    Qi Liu

    2016-08-01

    Distributed computing has developed tremendously since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurately estimating the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data are collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data on each task are examined and a detailed analysis report is provided. According to the results, the prediction accuracy for the execution time of concurrent tasks can be improved, in particular for some regular jobs.
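
    As an illustration of the kind of estimator the abstract describes, here is a minimal two-phase (piecewise linear) regression sketch that scans candidate breakpoints and fits one line per phase. The features, phases and data used by the authors are not specified here, so everything below is an assumed toy version.

        import numpy as np

        def two_phase_fit(x, y):
            """Fit y = a1*x + b1 before a breakpoint and y = a2*x + b2 after it,
            choosing the breakpoint that minimises the total squared error."""
            best = None
            for k in range(2, len(x) - 2):                 # candidate breakpoints
                fits, sse = [], 0.0
                for xs, ys in ((x[:k], y[:k]), (x[k:], y[k:])):
                    coef = np.polyfit(xs, ys, 1)
                    fits.append(coef)
                    sse += np.sum((ys - np.polyval(coef, xs)) ** 2)
                if best is None or sse < best[0]:
                    best = (sse, x[k], fits)
            return best   # (sse, breakpoint, [phase-1 coef, phase-2 coef])

        # Toy progress-vs-time data: a task whose rate changes halfway through.
        x = np.arange(20, dtype=float)
        y = np.where(x < 10, 2.0 * x, 20.0 + 5.0 * (x - 10)) \
            + np.random.default_rng(0).normal(0, 0.5, 20)
        sse, bp, (phase1, phase2) = two_phase_fit(x, y)
        print(f"breakpoint near x = {bp:.0f}, phase slopes = {phase1[0]:.2f}, {phase2[0]:.2f}")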

  15. Operating and maintenance experience with computer-based systems in nuclear power plants - A report by the PWG-1 Task Group on Computer-based Systems Important to Safety

    International Nuclear Information System (INIS)

    1998-01-01

    This report was prepared by the Task Group on Computer-based Systems Important to Safety of the Principal Working Group No. 1. Canada had a leading role in this study. Operating and maintenance experience with computer-based systems in nuclear power plants is essential for improving and upgrading against potential failures. The present report summarises the observations and findings related to the use of digital technology in nuclear power plants. It also makes recommendations for future activities in Member Countries. Continued expansion of digital technology in nuclear power reactors has resulted in new safety and licensing issues, since the existing licensing review criteria were mainly based on the analogue devices used when the plants were designed. On the industry side, a consensus approach is needed to help stabilise and standardise the treatment of digital installations and upgrades while ensuring safety and reliability. On the regulatory side, new guidelines and regulatory requirements are needed to assess digital upgrades. Upgrades or new installation issues always involve potential for system failures. They are addressed specifically in the 'hazard' or 'failure' analysis, and it is in this context that they ultimately are resolved in the design and addressed in licensing. Failure analysis is normally performed in parallel with the design, verification and validation (V and V), and implementation activities of the upgrades. Current standards and guidelines in France, U.S. and Canada recognise the importance of failure analysis in computer-based system design. Thus failure analysis is an integral part of the design and implementation process and is aimed at evaluating potential failure modes and causes of system failures. In this context, it is essential to define 'System' as the plant system affected by the upgrade, not the 'Computer' system. The identified failures would provide input to the design process in the form of design requirements or design

  16. A Reverse Stroop Task with Mouse Tracking

    Science.gov (United States)

    Yamamoto, Naohide; Incera, Sara; McLennan, Conor T.

    2016-01-01

    In a reverse Stroop task, observers respond to the meaning of a color word irrespective of the color in which the word is printed; for example, the word red may be printed in the congruent color (red), an incongruent color (e.g., blue), or a neutral color (e.g., white). Although reading of color words in this task is often thought to be neither facilitated by congruent print colors nor interfered with by incongruent print colors, this interference has been detected by using a response method that does not give any bias in favor of processing of word meanings or processing of print colors. On the other hand, evidence for the presence of facilitation in this task has been scarce, even though this facilitation is theoretically possible. By modifying the task such that participants respond to a stimulus color word by pointing to a corresponding response word on a computer screen with a mouse, the present study investigated the possibility that not only interference but also facilitation would take place in a reverse Stroop task. Importantly, in this study, participants’ responses were dynamically tracked by recording the entire trajectories of the mouse. Arguably, this method provided richer information about participants’ performance than traditional measures such as reaction time and accuracy, allowing for more detailed (and thus potentially more sensitive) investigation of facilitation and interference in the reverse Stroop task. These trajectories showed that the mouse’s approach toward correct response words was significantly delayed by incongruent print colors but not affected by congruent print colors, demonstrating that only interference, not facilitation, was present in the current task. Implications of these findings are discussed within a theoretical framework in which the strength of association between a task and its response method plays a critical role in determining how word meanings and print colors interact in reverse Stroop tasks. PMID:27199881
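
    One simple example of the richer information such trajectories provide is the maximum deviation of the cursor path from the straight line joining its start and end points; the short sketch below computes it for a hypothetical trajectory and is not the study's analysis code.

        import numpy as np

        def max_deviation(trajectory):
            """Largest perpendicular distance of a 2-D mouse trajectory from the
            straight line joining its first and last points."""
            traj = np.asarray(trajectory, dtype=float)
            start, end = traj[0], traj[-1]
            line = end - start
            line_len = np.linalg.norm(line)
            # Perpendicular distance via the 2-D cross product.
            rel = traj - start
            dists = np.abs(rel[:, 0] * line[1] - rel[:, 1] * line[0]) / line_len
            return dists.max()

        # Hypothetical trajectory that bows towards a competing response before settling.
        trajectory = [(0, 0), (10, 30), (40, 80), (90, 95), (100, 100)]
        print(f"maximum deviation: {max_deviation(trajectory):.1f} px")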

  17. Game-like tasks for comparative research: leveling the playing field

    Science.gov (United States)

    Washburn, D. A.; Gulledge, J. P.; Rumbaugh, D. M. (Principal Investigator)

    1995-01-01

    Game-like computer tasks offer many benefits for psychological research. In this paper, the usefulness of such tasks to bridge population differences (e.g., age, intelligence, species) is discussed and illustrated. A task called ALVIN was used to assess humans' and monkeys' working memory for sequences of colors with or without tones. Humans repeated longer lists than did the monkeys, and only humans benefited when the visual stimuli were accompanied by auditory cues. However, the monkeys did recall sequences at levels comparable to those reported elsewhere for children. Comparison of similarities and differences between the species is possible because the two groups were tested with exactly the same game-like paradigm.

  18. Stimulus-response compatibility and affective computing: A review

    NARCIS (Netherlands)

    Lemmens, P.M.C.; Haan, A. de; Galen, G.P. van; Meulenbroek, R.G.J.

    2007-01-01

    Affective computing, a human–factors effort to investigate the merits of emotions while people are working with human–computer interfaces, is gaining momentum. Measures to quantify affect (or its influences) range from EEG, to measurements of autonomic–nervous–system responses (e.g., heart rate,

  19. A comparison of symptoms after viewing text on a computer screen and hardcopy.

    Science.gov (United States)

    Chu, Christina; Rosenfield, Mark; Portello, Joan K; Benzoni, Jaclyn A; Collier, Juanita D

    2011-01-01

    Computer vision syndrome (CVS) is a complex of eye and vision problems experienced during or related to computer use. Ocular symptoms may include asthenopia, accommodative and vergence difficulties and dry eye. CVS occurs in up to 90% of computer workers, and given the almost universal use of these devices, it is important to identify whether these symptoms are specific to computer operation, or are simply a manifestation of performing a sustained near-vision task. This study compared ocular symptoms immediately following a sustained near task. 30 young, visually-normal subjects read text aloud either from a desktop computer screen or a printed hardcopy page at a viewing distance of 50 cm for a continuous 20 min period. Identical text was used in the two sessions, which was matched for size and contrast. Target viewing angle and luminance were similar for the two conditions. Immediately following completion of the reading task, subjects completed a written questionnaire asking about their level of ocular discomfort during the task. When comparing the computer and hardcopy conditions, significant differences in median symptom scores were reported with regard to blurred vision during the task (t = 147.0; p = 0.03) and the mean symptom score (t = 102.5; p = 0.04). In both cases, symptoms were higher during computer use. Symptoms following sustained computer use were significantly worse than those reported after hard copy fixation under similar viewing conditions. A better understanding of the physiology underlying CVS is critical to allow more accurate diagnosis and treatment. This will allow practitioners to optimize visual comfort and efficiency during computer operation.

  20. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  1. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de

  2. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  3. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  5. ImagePAD, a novel counting application for the Apple iPad, used to quantify axons in the mouse optic nerve.

    Science.gov (United States)

    Templeton, Justin P; Struebing, Felix L; Lemmon, Andrew; Geisert, Eldon E

    2014-11-01

    The present article introduces a new and easy to use counting application for the Apple iPad. The application "ImagePAD" takes advantage of the advanced user interface features offered by the Apple iOS platform, simplifying the rather tedious task of quantifying features in anatomical studies. For example, the image under analysis can be easily panned and zoomed using iOS-supported multi-touch gestures without losing the spatial context of the counting task, which is extremely important for ensuring count accuracy. This application allows one to quantify up to 5 different types of objects in a single field and output the data in a tab-delimited format for subsequent analysis. We describe two examples of the use of the application: quantifying axons in the optic nerve of the C57BL/6J mouse and determining the percentage of cells labeled with NeuN or ChAT in the retinal ganglion cell layer. For the optic nerve, contiguous images at 60× magnification were taken and transferred onto an Apple iPad. Axons were counted by tapping on the touch-sensitive screen using ImagePAD. Nine optic nerves were sampled and the number of axons in the nerves ranged from 38,872 axons to 50,196 axons with an average of 44,846 axons per nerve (SD = 3980 axons). Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Reusability Framework for Cloud Computing

    OpenAIRE

    Singh, Sukhpal; Singh, Rishideep

    2012-01-01

    Cloud-based development is a challenging task for several software engineering projects, especially for those which need development with reusability. The present era of cloud computing is enabling new professional models for software development. Cloud computing is expected to be the coming trend in computing because of its speed of application deployment, shorter time to market, and lower cost of operation. Until the Cloud Computing Reusability Model is considered a fundamen...

  7. Blink rate, incomplete blinks and computer vision syndrome.

    Science.gov (United States)

    Portello, Joan K; Rosenfield, Mark; Chu, Christina A

    2013-05-01

    Computer vision syndrome (CVS), a highly prevalent condition, is frequently associated with dry eye disorders. Furthermore, a reduced blink rate has been observed during computer use. The present study examined whether post-task ocular and visual symptoms are associated with either a decreased blink rate or a higher prevalence of incomplete blinks. An additional trial tested whether increasing the blink rate would reduce CVS symptoms. Subjects (N = 21) were required to perform a continuous 15-minute reading task on a desktop computer at a viewing distance of 50 cm. Subjects were videotaped during the task to determine their blink rate and amplitude. Immediately after the task, subjects completed a questionnaire regarding ocular symptoms experienced during the trial. In a second session, the blink rate was increased by means of an audible tone that sounded every 4 seconds, with subjects being instructed to blink on hearing the tone. The mean blink rate during the task without the audible tone was 11.6 blinks per minute (SD, 7.84). The percentage of blinks deemed incomplete for each subject ranged from 0.9 to 56.5%, with a mean of 16.1% (SD, 15.7). A significant positive correlation was observed between the total symptom score and the percentage of incomplete blinks during the task (p = 0.002). Furthermore, a significant negative correlation was noted between the blink score and symptoms (p = 0.035). Increasing the mean blink rate to 23.5 blinks per minute by means of the audible tone did not produce a significant change in the symptom score. Whereas CVS symptoms are associated with a reduced blink rate, the completeness of the blink may be equally significant. Because instructing a patient to increase his or her blink rate may be ineffective or impractical, actions to achieve complete corneal coverage during blinking may be more helpful in alleviating symptoms during computer operation.

  8. Quantifying human exposure to air pollution - moving from static monitoring to spatio-temporally resolved personal exposure assessment

    DEFF Research Database (Denmark)

    Steinle, Susanne; Reis, Stefan; Sabel, Clive E

    2013-01-01

    Highlights: …exposure studies to accurately assess human health risks. • We discuss potential and shortcomings of methods and tools with a focus on how their development influences study design. • We propose a novel conceptual model for integrated health impact assessment of human exposure to air pollutants. • We… - Abstract: Quantifying human exposure to air pollutants is a challenging task. Ambient concentrations of air pollutants at potentially harmful levels are ubiquitous in urban areas and subject to high spatial and temporal variability. At the same time, every individual has unique activity-patterns. Exposure results from multifaceted relationships and interactions between environmental and human systems, adding complexity to the assessment process. Traditionally, approaches to quantify human exposure have relied on pollutant concentrations from fixed air quality network sites and static population…

  9. Development of an algorithm for quantifying extremity biological tissue

    International Nuclear Information System (INIS)

    Pavan, Ana L.M.; Miranda, Jose R.A.; Pina, Diana R. de

    2013-01-01

    Computed radiography (CR) has become the most widely used system for image acquisition and production since its introduction in the 1980s. The detection and early diagnosis obtained via CR are important for the successful treatment of diseases such as arthritis, metabolic bone diseases, tumors, infections and fractures. However, the standards used for optimization of these images are based on international protocols. Therefore, it is necessary to compose radiographic techniques for the CR system that provide a secure medical diagnosis, with doses as low as reasonably achievable. To this end, the aim of this work is to develop a tissue-quantifying algorithm, allowing the construction of a homogeneous phantom used to compose such techniques. A database of computed tomography images of the hand and wrist of adult patients was developed. Using the Matlab® software, a computational algorithm was developed that quantifies the average thickness of the soft tissue and bones present in the anatomical region under study, as well as the corresponding thickness in simulator materials (aluminium and lucite). This was possible through the application of masks and a Gaussian-removal technique applied to the histograms. As a result, an average soft-tissue thickness of 18.97 mm and an average bone-tissue thickness of 6.15 mm were obtained, with equivalents in simulator materials of 23.87 mm of acrylic and 1.07 mm of aluminum. The results agreed with the mean thickness of the biological tissues of a standard patient's hand, enabling the construction of a homogeneous phantom
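    A minimal sketch of the kind of histogram/threshold-based tissue quantification described above, written in Python rather than the authors' Matlab code; the Hounsfield-unit ranges, pixel spacing, and the synthetic slice are illustrative assumptions, not values from the study.

```python
# Hedged sketch (not the authors' implementation): segment a CT slice into
# soft tissue and bone by Hounsfield-unit thresholding, then estimate the mean
# thickness of each tissue along the image columns.
import numpy as np

def mean_tissue_thickness(ct_slice_hu, pixel_spacing_mm=0.5,
                          soft_range=(-100, 150), bone_min=300):
    """Return (mean soft-tissue thickness, mean bone thickness) in mm."""
    soft_mask = (ct_slice_hu >= soft_range[0]) & (ct_slice_hu <= soft_range[1])
    bone_mask = ct_slice_hu >= bone_min

    # Thickness per image column: number of tissue pixels times the pixel size.
    soft_cols = soft_mask.sum(axis=0) * pixel_spacing_mm
    bone_cols = bone_mask.sum(axis=0) * pixel_spacing_mm

    # Average only over columns that contain any tissue at all.
    body = (soft_cols + bone_cols) > 0
    return soft_cols[body].mean(), bone_cols[body].mean()

# Synthetic example slice; a real application would load CT data instead.
rng = np.random.default_rng(0)
slice_hu = rng.normal(0, 200, size=(256, 256))
soft_mm, bone_mm = mean_tissue_thickness(slice_hu)
print(f"soft tissue ~ {soft_mm:.1f} mm, bone ~ {bone_mm:.1f} mm")
```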

  10. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
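    A minimal sketch of the straight-line fit near short circuit discussed above, using plain least-squares on synthetic I-V data rather than the objective Bayesian regression of the preprint; it shows how the choice of data window changes the estimated Isc and its quoted fit uncertainty.

```python
# Hedged sketch: estimate Isc as the intercept of a straight-line fit to I-V
# points near V = 0 and report its statistical uncertainty for two windows.
import numpy as np

rng = np.random.default_rng(0)
v = np.linspace(0.0, 0.2, 21)                                    # voltages near Isc
i = 8.0 - 0.5 * v - 2.0 * v**2 + rng.normal(0, 0.005, v.size)    # noisy current

for n_points in (5, 21):                          # narrow vs. wide fit window
    coef, cov = np.polyfit(v[:n_points], i[:n_points], 1, cov=True)
    isc = np.polyval(coef, 0.0)                   # intercept = Isc estimate
    u_isc = np.sqrt(cov[1, 1])                    # standard uncertainty of intercept
    print(f"window = {n_points:2d} points: Isc = {isc:.4f} A, u(Isc) = {u_isc:.4f} A")
```

    With the wider window the fit uncertainty shrinks, but the curvature of the synthetic I-V curve introduces model discrepancy that the fit uncertainty alone does not capture, which is the issue the preprint addresses.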

  11. Women and computers: effects of stereotype threat on attribution of failure

    OpenAIRE

    Koch, Sabine C.; Müller, Stephanie M.; Sieverding, Monika

    2008-01-01

    This study investigated whether stereotype threat can influence women's attributions of failure in a computer task. Male and female college-age students (n = 86, 16–21 years old) from Germany were asked to work on a computer task and were given a hint beforehand that in this task, either (a) men usually perform better than women do (negative threat condition), or (b) women usually perform better than men do (positive condition), or (c) they received no threat or gender-related information (contr...

  12. Northeast Artificial Intelligence Consortium (NAIC) Review of Technical Tasks. Volume 2, Part 2.

    Science.gov (United States)

    1987-07-01

  13. Computational-physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out

  14. ELASTIC CLOUD COMPUTING ARCHITECTURE AND SYSTEM FOR HETEROGENEOUS SPATIOTEMPORAL COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Shi

    2017-10-01

    Full Text Available Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kinds of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  15. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    Science.gov (United States)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kinds of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  16. Computer Aided Battery Engineering Consortium

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, Ahmad

    2016-06-07

    A multi-national lab collaborative team was assembled that includes experts from academia and industry to enhance the recently developed Computer-Aided Battery Engineering for Electric Drive Vehicles (CAEBAT)-II battery crush modeling tools and to develop microstructure models for electrode design, both computationally efficient. Task 1. The new Multi-Scale Multi-Domain model framework (GH-MSMD) provides 100x to 1,000x computation speed-up in battery electrochemical/thermal simulation while retaining modularity of particles and electrode-, cell-, and pack-level domains. The increased speed enables direct use of the full model in parameter identification. Task 2. Mechanical-electrochemical-thermal (MECT) models for mechanical abuse simulation were simultaneously coupled, enabling simultaneous modeling of electrochemical reactions during the short circuit when necessary. The interactions between mechanical failure and battery cell performance were studied, and the flexibility of the model for various battery structures and loading conditions was improved. Model validation is ongoing to compare with test data from Sandia National Laboratories. The ABDT tool was established in ANSYS. Task 3. Microstructural modeling was conducted to enhance next-generation electrode designs. This 3-year project will validate models for a variety of electrodes, complementing Advanced Battery Research programs. Prototype tools have been developed for electrochemical simulation and geometric reconstruction.

  17. Mental workload and cognitive task automaticity: an evaluation of subjective and time estimation metrics.

    Science.gov (United States)

    Liu, Y; Wickens, C D

    1994-11-01

    The evaluation of mental workload is becoming increasingly important in system design and analysis. The present study examined the structure and assessment of mental workload in performing decision and monitoring tasks by focusing on two mental workload measurements: subjective assessment and time estimation. The task required the assignment of a series of incoming customers to the shortest of three parallel service lines displayed on a computer monitor. The subject was either in charge of the customer assignment (manual mode) or was monitoring an automated system performing the same task (automatic mode). In both cases, the subjects were required to detect the non-optimal assignments that they or the computer had made. Time pressure was manipulated by the experimenter to create fast and slow conditions. The results revealed a multi-dimensional structure of mental workload and a multi-step process of subjective workload assessment. The results also indicated that subjective workload was more influenced by the subject's participatory mode than by the factor of task speed. The time estimation intervals produced while performing the decision and monitoring tasks had significantly greater length and larger variability than those produced while either performing no other tasks or performing a well practised customer assignment task. This result seemed to indicate that time estimation was sensitive to the presence of perceptual/cognitive demands, but not to response related activities to which behavioural automaticity has developed.

  18. Quantifying coordination among the rearfoot, midfoot, and forefoot segments during running.

    Science.gov (United States)

    Takabayashi, Tomoya; Edama, Mutsuaki; Yokoyama, Erika; Kanaya, Chiaki; Kubo, Masayoshi

    2018-03-01

    Because previous studies have suggested that there is a relationship between injury risk and inter-segment coordination, quantifying coordination between the segments is essential. Even though the midfoot and forefoot segments play important roles in dynamic tasks, previous studies have mostly focused on coordination between the shank and rearfoot segments. This study aimed to quantify coordination among rearfoot, midfoot, and forefoot segments during running. Eleven healthy young men ran on a treadmill. The coupling angle, representing inter-segment coordination, was calculated using a modified vector coding technique. The coupling angle was categorised into four coordination patterns. During the absorption phase, rearfoot-midfoot coordination in the frontal planes was mostly in-phase (rearfoot and midfoot eversion with similar amplitudes). The present study found that the eversion of the midfoot with respect to the rearfoot was comparable in magnitude to the eversion of the rearfoot with respect to the shank. A previous study has suggested that disruption of the coordination between the internal rotation of the shank and eversion of the rearfoot leads to running injuries such as anterior knee pain. Thus, these data might be used in the future to compare to individuals with foot deformities or running injuries.
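    A rough sketch of the modified vector coding idea mentioned above: the coupling angle is the orientation of the frame-to-frame vector in the proximal-distal angle plane, later binned into coordination patterns. The binning boundaries and the example angle series below are illustrative assumptions, not the study's data or exact conventions.

```python
# Hedged sketch of a modified vector coding calculation (assumed details).
import numpy as np

def coupling_angles(proximal_deg, distal_deg):
    """Coupling angle (0-360 deg) between two segment-angle time series."""
    dp = np.diff(proximal_deg)
    dd = np.diff(distal_deg)
    return np.degrees(np.arctan2(dd, dp)) % 360.0

def classify(gamma):
    """Bin a coupling angle into one of four coordination patterns."""
    if 22.5 <= gamma < 67.5 or 202.5 <= gamma < 247.5:
        return "in-phase"          # both segments rotate in the same direction
    if 112.5 <= gamma < 157.5 or 292.5 <= gamma < 337.5:
        return "anti-phase"        # segments rotate in opposite directions
    if gamma < 22.5 or gamma >= 337.5 or 157.5 <= gamma < 202.5:
        return "proximal-phase"    # motion dominated by the proximal segment
    return "distal-phase"          # motion dominated by the distal segment

rearfoot = np.array([2.0, 3.5, 5.0, 5.5, 5.0])   # frontal-plane eversion (deg)
midfoot = np.array([1.0, 2.6, 4.1, 4.8, 4.4])
print([classify(g) for g in coupling_angles(rearfoot, midfoot)])
```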

  19. Computer control of fuel handling activities at FFTF

    International Nuclear Information System (INIS)

    Romrell, D.M.

    1985-03-01

    The Fast Flux Test Facility near Richland, Washington, utilizes computer control for reactor refueling and other related core component handling and processing tasks. The computer controlled tasks described in this paper include core component transfers within the reactor vessel, core component transfers into and out of the reactor vessel, remote duct measurements of irradiated core components, remote duct cutting, and finally, transferring irradiated components out of the reactor containment building for off-site shipments or to long term storage. 3 refs., 16 figs

  20. Manual asymmetries in bimanual isochronous tapping tasks in children.

    Science.gov (United States)

    Faria, Inês; Diniz, Ana; Barreiros, João

    2017-01-01

    Tapping tasks have been investigated throughout the years, with variations in features such as the complexity of the task, the use of one or both hands, the employ of auditory or visual stimuli, and the characteristics of the subjects. The evaluation of lateral asymmetries in tapping tasks in children offers an insight into the structure of rhythmic movements and handedness at early stages of development. The current study aims to investigate the ability of children (aged six and seven years-old) to maintain a rhythm, in a bimanual tapping task at two different target frequencies, as well as the manual asymmetries displayed while doing so. The analyzed data in this work are the series of the time intervals between successive taps. We suggest several profiles of behavior, regarding the overall performance of children in both tempo conditions. We also propose a new method of quantifying the variability of the performance and the asymmetry of the hands, based on ellipses placed on scatter plots of the non-dominant-dominant series versus the dominant-non-dominant series. We then use running correlations to identify changes of coordination tendencies over time. The main results show that variability is larger in the task with the longer target interval. Furthermore, most children evidence lateral asymmetries, but in general they show the capacity to maintain the mean of consecutive intertap intervals of both hands close to the target interval. Finally, we try to interpret our findings in the light of existing models and timing modes. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Quantifying Transmission.

    Science.gov (United States)

    Woolhouse, Mark

    2017-07-01

    Transmissibility is the defining characteristic of infectious diseases. Quantifying transmission matters for understanding infectious disease epidemiology and designing evidence-based disease control programs. Tracing individual transmission events can be achieved by epidemiological investigation coupled with pathogen typing or genome sequencing. Individual infectiousness can be estimated by measuring pathogen loads, but few studies have directly estimated the ability of infected hosts to transmit to uninfected hosts. Individuals' opportunities to transmit infection are dependent on behavioral and other risk factors relevant given the transmission route of the pathogen concerned. Transmission at the population level can be quantified through knowledge of risk factors in the population or phylogeographic analysis of pathogen sequence data. Mathematical model-based approaches require estimation of the per capita transmission rate and basic reproduction number, obtained by fitting models to case data and/or analysis of pathogen sequence data. Heterogeneities in infectiousness, contact behavior, and susceptibility can have substantial effects on the epidemiology of an infectious disease, so estimates of only mean values may be insufficient. For some pathogens, super-shedders (infected individuals who are highly infectious) and super-spreaders (individuals with more opportunities to transmit infection) may be important. Future work on quantifying transmission should involve integrated analyses of multiple data sources.
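    A compact illustration of the model-based quantification mentioned above: in the simplest SIR formulation the basic reproduction number follows directly from the per capita transmission and recovery rates. The parameter values below are illustrative only, not estimates from the paper.

```python
# Hedged sketch: R0 = beta / gamma in a minimal SIR model, plus a toy simulation.
import numpy as np

def simulate_sir(beta, gamma, n=10000, i0=10, days=120, dt=0.1):
    s, i, r = n - i0, i0, 0
    infectious = []
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt        # new infections this step
        new_rec = gamma * i * dt               # new recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        infectious.append(i)
    return np.array(infectious)

beta, gamma = 0.4, 0.2                         # per-capita transmission / recovery rates
print("R0 =", beta / gamma)                    # basic reproduction number
print("peak infectious:", round(simulate_sir(beta, gamma).max()))
```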

  2. Computing With Quantum Mechanical Oscillators

    National Research Council Canada - National Science Library

    Parks, A

    1991-01-01

    Despite the obvious practical considerations (e.g., stability, controllability), certain quantum mechanical systems seem to naturally lend themselves in a theoretical sense to the task of performing computations...

  3. Multi-task feature selection in microarray data by binary integer programming.

    Science.gov (United States)

    Lan, Liang; Vucetic, Slobodan

    2013-12-20

    A major challenge in microarray classification is that the number of features is typically orders of magnitude larger than the number of examples. In this paper, we propose a novel feature filter algorithm to select the feature subset with maximal discriminative power and minimal redundancy by solving a quadratic objective function with binary integer constraints. To improve the computational efficiency, the binary integer constraints are relaxed and a low-rank approximation to the quadratic term is applied. The proposed feature selection algorithm was extended to solve multi-task microarray classification problems. We compared the single-task version of the proposed feature selection algorithm with 9 existing feature selection methods on 4 benchmark microarray data sets. The empirical results show that the proposed method achieved the most accurate predictions overall. We also evaluated the multi-task version of the proposed algorithm on 8 multi-task microarray datasets. The multi-task feature selection algorithm resulted in significantly higher accuracy than when using the single-task feature selection methods.
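    A simplified stand-in for the relaxed optimization described above (not the paper's exact quadratic program or its low-rank approximation): the binary selection indicators are relaxed to the interval [0, 1], a relevance-minus-redundancy objective is optimized, and the features with the largest weights are kept.

```python
# Hedged sketch: continuous relaxation of a binary feature-selection problem.
import numpy as np
from scipy.optimize import minimize

def select_features(X, y, k=10):
    X = (X - X.mean(0)) / (X.std(0) + 1e-12)
    y = (y - y.mean()) / (y.std() + 1e-12)
    relevance = np.abs(X.T @ y) / len(y)                 # |corr(feature, label)|
    redundancy = np.abs(np.corrcoef(X, rowvar=False))    # |corr(feature_i, feature_j)|

    def objective(w):                                    # minimize the negative score
        return -(relevance @ w) + 0.5 * w @ redundancy @ w

    d = X.shape[1]
    res = minimize(objective, x0=np.full(d, 0.5),
                   bounds=[(0.0, 1.0)] * d, method="L-BFGS-B")
    return np.argsort(res.x)[::-1][:k]                   # indices of selected features

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
y = X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.1, size=100)
print(select_features(X, y, k=5))                        # should rank features 0 and 3 highly
```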

  4. Implications of Ubiquitous Computing for the Social Studies Curriculum

    Science.gov (United States)

    van Hover, Stephanie D.; Berson, Michael J.; Bolick, Cheryl Mason; Swan, Kathleen Owings

    2004-01-01

    In March 2002, members of the National Technology Leadership Initiative (NTLI) met in Charlottesville, Virginia to discuss the potential effects of ubiquitous computing on the field of education. Ubiquitous computing, or "on-demand availability of task-necessary computing power," involves providing every student with a handheld computer--a…

  5. Multi-task transfer learning deep convolutional neural network: application to computer-aided diagnosis of breast cancer on mammograms

    Science.gov (United States)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Helvie, Mark A.; Cha, Kenny H.; Richter, Caleb D.

    2017-12-01

    Transfer learning in deep convolutional neural networks (DCNNs) is an important step in its application to medical imaging tasks. We propose a multi-task transfer learning DCNN with the aim of translating the ‘knowledge’ learned from non-medical images to medical diagnostic tasks through supervised training and increasing the generalization capabilities of DCNNs by simultaneously learning auxiliary tasks. We studied this approach in an important application: classification of malignant and benign breast masses. With Institutional Review Board (IRB) approval, digitized screen-film mammograms (SFMs) and digital mammograms (DMs) were collected from our patient files and additional SFMs were obtained from the Digital Database for Screening Mammography. The data set consisted of 2242 views with 2454 masses (1057 malignant, 1397 benign). In single-task transfer learning, the DCNN was trained and tested on SFMs. In multi-task transfer learning, SFMs and DMs were used to train the DCNN, which was then tested on SFMs. N-fold cross-validation with the training set was used for training and parameter optimization. On the independent test set, the multi-task transfer learning DCNN was found to have significantly (p  =  0.007) higher performance compared to the single-task transfer learning DCNN. This study demonstrates that multi-task transfer learning may be an effective approach for training DCNN in medical imaging applications when training samples from a single modality are limited.
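    A schematic PyTorch sketch of the kind of shared-backbone, multi-head arrangement described above; the backbone choice (an untrained ResNet-18 stand-in), head sizes, and task names are assumptions for illustration, not the paper's architecture or training procedure.

```python
# Hedged sketch: one shared feature extractor with a separate classification
# head per mammography modality, so SFM and DM patches can be trained jointly.
import torch
import torch.nn as nn
from torchvision import models

class MultiTaskMassClassifier(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        backbone = models.resnet18()                  # transfer learning would load weights
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        self.head_sfm = nn.Linear(512, n_classes)     # screen-film mammograms
        self.head_dm = nn.Linear(512, n_classes)      # digital mammograms

    def forward(self, x, task):
        f = self.features(x).flatten(1)               # shared representation
        return self.head_sfm(f) if task == "sfm" else self.head_dm(f)

model = MultiTaskMassClassifier()
patch = torch.randn(4, 3, 224, 224)                   # a batch of mass patches
print(model(patch, task="sfm").shape)                 # torch.Size([4, 2])
```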

  6. Towards Better Computational Models of the Balance Scale Task: A Reply to Shultz and Takane

    Science.gov (United States)

    van der Maas, Han L. J.; Quinlan, Philip T.; Jansen, Brenda R. J.

    2007-01-01

    In contrast to Shultz and Takane [Shultz, T.R., & Takane, Y. (2007). Rule following and rule use in the balance-scale task. "Cognition", in press, doi:10.1016/j.cognition.2006.12.004.] we do not accept that the traditional Rule Assessment Method (RAM) of scoring responses on the balance scale task has advantages over latent class analysis (LCA):…

  7. Quantifying vertical stress transmission and compaction-induced soil structure using sensor mat and X-ray computed tomography

    DEFF Research Database (Denmark)

    Naveed, Muhammad; Schjønning, Per; Keller, Thomas

    2016-01-01

    Accurate estimation of stress transmission in soil and quantification of compaction-induced soil pore structure is important for efficient soil use and management. Continuum mechanics have so far mostly been applied for agricultural soils, even if topsoil structure is aggregated due to regular tillage. In this study, partially confined uniaxial compression tests were carried out on intact topsoil columns placed on subsoil columns. Two methods were employed for estimation of stress transmission in soil: (i) soil deformation patterns were quantified using X-ray CT and converted to stress distributions, and (ii) a tactile sensor mat was employed for measuring stresses at the interface of the topsoil and subsoil columns. The resulting soil pore structure under applied stresses was quantified using X-ray CT and by air-permeability measurements. In topsoil discrete stress transmission patterns were…

  8. Data structures, computer graphics, and pattern recognition

    CERN Document Server

    Klinger, A; Kunii, T L

    1977-01-01

    Data Structures, Computer Graphics, and Pattern Recognition focuses on the computer graphics and pattern recognition applications of data structures methodology.This book presents design related principles and research aspects of the computer graphics, system design, data management, and pattern recognition tasks. The topics include the data structure design, concise structuring of geometric data for computer aided design, and data structures for pattern recognition algorithms. The survey of data structures for computer graphics systems, application of relational data structures in computer gr

  9. Unique sensor fusion system for coordinate-measuring machine tasks

    Science.gov (United States)

    Nashman, Marilyn; Yoshimi, Billibon; Hong, Tsai Hong; Rippey, William G.; Herman, Martin

    1997-09-01

    This paper describes a real-time hierarchical system that fuses data from vision and touch sensors to improve the performance of a coordinate measuring machine (CMM) used for dimensional inspection tasks. The system consists of sensory processing, world modeling, and task decomposition modules. It uses the strengths of each sensor -- the precision of the CMM scales and the analog touch probe and the global information provided by the low resolution camera -- to improve the speed and flexibility of the inspection task. In the experiment described, the vision module performs all computations in image coordinate space. The part's boundaries are extracted during an initialization process and then the probe's position is continuously updated as it scans and measures the part surface. The system fuses the estimated probe velocity and distance to the part boundary in image coordinates with the estimated velocity and probe position provided by the CMM controller. The fused information provides feedback to the monitor controller as it guides the touch probe to scan the part. We also discuss integrating information from the vision system and the probe to autonomously collect data for 2-D to 3-D calibration, and work to register computer aided design (CAD) models with images of parts in the workplace.

  10. Neurofeedback training improves the dual-task performance ability in stroke patients.

    Science.gov (United States)

    Lee, Young-Shin; Bae, Sea-Hyun; Lee, Sung-Hee; Kim, Kyung-Yoon

    2015-05-01

    Owing to the reduced capacity for information processing following a stroke, patients commonly present with difficulties in performing activities of daily living that combine two or more tasks. To address this problem, in the present study, we investigated the effects of neurofeedback training on the abilities of stroke patients to perform dual motor tasks. We randomly assigned 20 patients who had sustained a stroke within the preceding 6 months to either a pseudo-neurofeedback (n = 10) or neurofeedback (n = 10) group. Both groups participated in a general exercise intervention for 8 weeks, three times a week for 30 min per session, under the same conditions. An electrode was secured to the scalp over the region of the central lobe (Cz), in compliance with the International 10-20 System. The electrode was inactive for the pseudo-training group. Participants in the neurofeedback training group received the 30-min neurofeedback training per session for reinforcing the sensorimotor rhythm. Electroencephalographic activity of the two groups was compared. In addition, selected parameters of gait (velocity, cadence [step/min], stance phase [%], and foot pressure) were analyzed using a 10-m walk test, attention-demanding task, walk task and quantified by the SmartStep system. The neurofeedback group showed significantly improved regulation of the sensorimotor rhythm (p < …). … neurofeedback training is effective in improving the dual-task performance of stroke patients.

  11. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    Science.gov (United States)

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  12. A task-based parallelism and vectorized approach to 3D Method of Characteristics (MOC) reactor simulation for high performance computing architectures

    Science.gov (United States)

    Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.

    2016-05-01

    In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
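    A toy sketch of the two ideas highlighted above, dynamic assignment of independent tracks to worker threads plus a vectorized inner loop over energy groups; the track data, group count, and attenuation rule are invented for illustration and are not the proxy applications' physics.

```python
# Hedged sketch: dynamic (load-balanced) distribution of MOC-like track sweeps
# across threads, with the inner loop over energy groups vectorized by NumPy.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

N_GROUPS = 64                      # energy groups handled by the vectorized inner loop
rng = np.random.default_rng(1)

def sweep_track(segments):
    """Attenuate a per-group angular flux along one track (all groups at once)."""
    psi = np.ones(N_GROUPS)
    for sigma_t, length in segments:            # loop over the track's segments
        psi *= np.exp(-sigma_t * length)        # vectorized over all energy groups
    return psi.sum()

# Each synthetic track is a list of (per-group total cross section, segment length).
tracks = [[(rng.uniform(0.1, 1.0, N_GROUPS), rng.uniform(0.1, 2.0))
           for _ in range(rng.integers(5, 20))]
          for _ in range(200)]

# Whichever worker thread is free pulls the next track, which balances the load
# when tracks contain different numbers of segments.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(sweep_track, tracks))
print(total)
```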

  13. Identifying objective criterion to determine a complicated task – A comparative study

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea

    2015-01-01

    Highlights: • Reliable estimation of the likelihood of human error is very critical. • Still there is no clear and objective criterion for a complicated task. • Subjective difficulty scores rated by 75 high-speed train drivers are collected. • Collected difficulty scores are compared with the associated TACOM scores. • Criteria for the task complexity level seem to be determined by the TACOM measure. - Abstract: A reliable estimation of the likelihood of human error is very critical for evaluating the safety of a large process control system such as NPPs (Nuclear Power Plants). In this regard, one of the determinants is to decide the level of an important PSF (Performance Shaping Factor) in a clear and objective manner along with the context of a given task. Unfortunately, it seems that there are no such decision criteria for certain PSFs, including the complexity of a task. Therefore, the feasibility of the TACOM (Task Complexity) measure in providing objective criteria that are helpful for distinguishing the level of task complexity is investigated in this study. To this end, subjective difficulty scores rated by 75 high-speed train drivers are collected for 38 tasks. After that, the subjective difficulty scores are compared with the associated TACOM scores quantified for these tasks. As a result, it is observed that there is a significant correlation between the subjective difficulty scores rated by high-speed train drivers and the associated TACOM scores. Accordingly, it is promising to expect that the TACOM measure can be used as an objective tool to identify the level of task complexity in terms of an HRA (Human Reliability Analysis)

  14. A comprehensive study of task coalescing for selecting parallelism granularity in a two-stage bidiagonal reduction

    KAUST Repository

    Haidar, Azzam

    2012-05-01

    We present new high performance numerical kernels combined with advanced optimization techniques that significantly increase the performance of parallel bidiagonal reduction. Our approach is based on developing efficient fine-grained computational tasks as well as reducing overheads associated with their high-level scheduling during the so-called bulge chasing procedure that is an essential phase of a scalable bidiagonalization procedure. In essence, we coalesce multiple tasks in a way that reduces the time needed to switch execution context between the scheduler and useful computational tasks. At the same time, we maintain the crucial information about the tasks and their data dependencies between the coalescing groups. This is the necessary condition to preserve numerical correctness of the computation. We show our annihilation strategy based on multiple applications of single orthogonal reflectors. Despite non-trivial characteristics in computational complexity and memory access patterns, our optimization approach smoothly applies to the annihilation scenario. The coalescing positively influences another equally important aspect of the bulge chasing stage: the memory reuse. For the tasks within the coalescing groups, the data is retained in high levels of the cache hierarchy and, as a consequence, operations that are normally memory-bound increase their ratio of computation to off-chip communication and become compute-bound which renders them amenable to efficient execution on multicore architectures. The performance for the new two-stage bidiagonal reduction is staggering. Our implementation results in up to 50-fold and 12-fold improvement (∼130 Gflop/s) compared to the equivalent routines from LAPACK V3.2 and Intel MKL V10.3, respectively, on an eight socket hexa-core AMD Opteron multicore shared-memory system with a matrix size of 24000 x 24000. Last but not least, we provide a comprehensive study on the impact of the coalescing group size in terms of cache
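    For readers unfamiliar with the building block being chained during the bulge chasing described above, the following sketch applies a single Householder reflector to annihilate all entries of a vector below the first; it illustrates only the elementary operation, not the coalesced task scheduling of the paper.

```python
# Hedged sketch: one orthogonal (Householder) reflector used as an annihilation step.
import numpy as np

def householder(x):
    """Build v, beta so that (I - beta * v v^T) zeroes every entry of x below the first."""
    v = x.astype(float).copy()
    v[0] += np.copysign(np.linalg.norm(x), x[0])
    beta = 2.0 / (v @ v)
    return v, beta

x = np.array([3.0, 1.0, 2.0])
v, beta = householder(x)
y = x - beta * v * (v @ x)        # apply the reflector without forming the matrix
print(np.round(y, 6))             # approximately [-3.741657  0.  0.]
```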

  15. Physiological evidence of interpersonal dynamics in a cooperative production task

    DEFF Research Database (Denmark)

    Mønster, Dan; Håkonsson, Dorthe Døjbak; Eskildsen, Jacob Kjær

    2016-01-01

    Recent research suggests that shared behavioral dynamics during interpersonal interaction are indicative of subjective and objective outcomes of the interaction, such as feelings of rapport and success of performance. The role of shared physiological dynamics to quantify interpersonal interaction… production task. Moreover, high team synchrony is found indicative of team cohesion, while low team synchrony is found indicative of a team's decision to adopt a new behavior across multiple production sessions. We conclude that team-level measures of synchrony offer new and complementary information...

  16. Cloud Computing Security Latest Issues amp Countermeasures

    OpenAIRE

    Shelveen Pandey; Mohammed Farik

    2015-01-01

    Abstract: Cloud computing describes effective computing services provided by a third-party organization, known as the cloud service provider, for organizations to perform different tasks over the internet for a fee. Cloud service providers' computing resources are dynamically reallocated per demand, and their infrastructure, platform, software, and other resources are shared by multiple corporate and private clients. With the steady increase in the number of cloud computing subscribers of these shar...

  17. ImagePAD, a Novel Counting Application for the Apple iPad®, Used to Quantify Axons in the Mouse Optic Nerve

    Science.gov (United States)

    Templeton, Justin P.; Struebing, Felix L.; Lemmon, Andrew; Geisert, Eldon E.

    2014-01-01

    The present article introduces a new and easy to use counting application for the Apple iPad. The application “ImagePAD” takes advantage of the advanced user interface features offered by the Apple iOS® platform, simplifying the rather tedious task of quantifying features in anatomical studies. For example, the image under analysis can be easily panned and zoomed using iOS-supported multi-touch gestures without losing the spatial context of the counting task, which is extremely important for ensuring count accuracy. This application allows one to quantify up to 5 different types of objects in a single field and output the data in a tab-delimited format for subsequent analysis. We describe two examples of the use of the application: quantifying axons in the optic nerve of the C57BL/6J mouse and determining the percentage of cells labeled with NeuN or ChAT in the retinal ganglion cell layer. For the optic nerve, contiguous images at 60× magnification were taken and transferred onto an Apple iPad®. Axons were counted by tapping on the touch-sensitive screen using ImagePAD. Nine optic nerves were sampled and the number of axons in the nerves ranged from 38872 axons to 50196 axons with an average of 44846 axons per nerve (SD = 3980 axons). PMID:25281829

  18. The effect of physical and psychosocial loads on the trapezius muscle activity during computer keying tasks and rest periods

    DEFF Research Database (Denmark)

    Blangsted, Anne Katrine; Søgaard, Karen; Christensen, Hanne

    2004-01-01

    The overall aim was to investigate the effect of psychosocial loads on trapezius muscle activity during computer keying work and during short and long breaks. In 12 female subjects, surface electromyography (EMG) was recorded bilaterally from the upper trapezius muscle during a standardized one-hand keying task, interspaced with short (30 s) and long (4 min) breaks, in sessions with and without a combination of cognitive and emotional stressors. Adding psychosocial loads to the same physical work did not increase the activity of the trapezius muscle on either the keying or the control side … resting level. During both short and long breaks, exposure to psychosocial loads also did not increase the activity of the trapezius muscle either on the side of the keying or the control hand. Of note is that during long breaks the muscle activity of the keying side as well as that of the control side…

  19. Segment Fixed Priority Scheduling for Self Suspending Real Time Tasks

    Science.gov (United States)

    2016-08-11

    a compute-intensive system such as a self-driving car that we have recently developed [28]. Such systems run computation-demanding algorithms … leveraging GPU can be modeled using a multi-segment self-suspending real-time task model. For example, a planning algorithm for autonomous driving can…

  20. A new algorithm for histopathological diagnosis of periprosthetic infection using CD15 focus score and computer program CD15 Quantifier

    Directory of Open Access Journals (Sweden)

    V. Krenn

    2015-01-01

    Full Text Available Introduction. A simple microscopic diagnostic quantification system for neutrophile granulocytes (NG) was developed evaluating a single focal point (CD15 focus score), which enables the detection of bacterial infection in SLIM (synovial-like interface membrane). Additionally, a diagnostic algorithm is proposed for how to use the CD15 focus score and the quantification software (CD15 Quantifier). Methods. 91 SLIM removed during revision surgery for histopathological diagnosis (hip: n=59 and knee: n=32) underwent histopathological classification according to the SLIM-consensus classification. NG were identified immunohistochemically by means of a CD15-specific monoclonal antibody exhibiting an intense granular cytoplasmic staining pattern. This pattern is different from CD15 expression in macrophages, which show a pale and homogenous expression in mononuclear cells. The quantitative evaluation of CD15-positive neutrophil granulocytes (CD15NG) used the principle of maximum focal infiltration (focus) together with an assessment of a single focal point (approximately 0.3 mm2). These immunohistochemical data made it possible to develop the CD15 Quantifier software, which automatically quantifies CD15NG. Results. SLIM cases with positive microbiological diagnosis (n=47) have significantly (p<0.001, Mann-Whitney U test) more CD15NG/focal point than cases with negative microbiological diagnosis (n=44). 50 CD15NG/focal point were identified as the optimum threshold when diagnosing infection of periprosthetic joints using the CD15 focus score. If the microbiological findings are used as a 'gold standard', the diagnostic sensitivity is 0.83 and the specificity is 0.864 (PPV: 0.87; NPV: 0.83; accuracy: 0.846; AUC: 0.878). The evaluation findings for the preparations using the CD15 Quantifier (n=31) deviated by an average of 12 cells from the histopathological evaluation findings (CD15 focus score). For cell counts greater than 62, CD15 Quantifier needs on average 32 seconds less than the
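    As an illustration of the threshold-based decision rule reported above (a CD15 focus score of 50 or more cells per focal point, evaluated against microbiology as the gold standard), the following sketch computes the usual diagnostic metrics; the counts below are synthetic, not the study's data.

```python
# Hedged sketch: sensitivity, specificity, PPV, NPV and accuracy for a simple
# count threshold compared with a gold-standard label.
def diagnostic_metrics(cd15_counts, micro_positive, threshold=50):
    tp = fp = tn = fn = 0
    for count, positive in zip(cd15_counts, micro_positive):
        predicted = count >= threshold          # CD15 focus score >= 50 => infection
        if predicted and positive:
            tp += 1
        elif predicted and not positive:
            fp += 1
        elif not predicted and positive:
            fn += 1
        else:
            tn += 1
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

counts = [120, 65, 48, 10, 3, 90, 55, 20, 5, 70]    # CD15+ cells per focal point
micro = [True, True, True, False, False, True, True, False, False, True]
print(diagnostic_metrics(counts, micro))
```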

  1. Training improves laparoscopic tasks performance and decreases operator workload.

    Science.gov (United States)

    Hu, Jesse S L; Lu, Jirong; Tan, Wee Boon; Lomanto, Davide

    2016-05-01

    It has been postulated that increased operator workload during task performance may increase fatigue and surgical errors. The National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is a validated tool for self-assessment of workload. Our study aims to assess the relationship between workload and performance of novices in simulated laparoscopic tasks of different complexity levels before and after training. Forty-seven novices without prior laparoscopic experience were recruited in a trial to investigate whether training improves task performance as well as mental workload. The participants were tested on three standard tasks (ring transfer, precision cutting and intracorporeal suturing) of increasing complexity based on the Fundamentals of Laparoscopic Surgery (FLS) curriculum. Following a period of training and rest, participants were tested again. Test scores were computed from time taken and time penalties for precision errors. Test scores and NASA-TLX scores were recorded pre- and post-training and analysed using paired t tests. One-way repeated measures ANOVA was used to analyse differences in NASA-TLX scores between the three tasks. NASA-TLX score was lowest with ring transfer and highest with intracorporeal suturing. This was statistically significant in both pre-training (p < …). … NASA-TLX scores mirror the changes in test scores for the three tasks. Workload scores decreased significantly after training for all three tasks (ring transfer = 2.93, p < …). … NASA-TLX score is an accurate reflection of the complexity of simulated laparoscopic tasks in the FLS curriculum. This also correlates with the relationship of test scores between the three tasks. Simulation training improves both performance score and workload score across the tasks.

  2. Application of a fast skyline computation algorithm for serendipitous searching problems

    Science.gov (United States)

    Koizumi, Kenichi; Hiraki, Kei; Inaba, Mary

    2018-02-01

    Skyline computation is a method of extracting interesting entries from a large population with multiple attributes. These entries, called skyline or Pareto optimal entries, are known to have extreme characteristics that cannot be found by outlier detection methods. Skyline computation is an important task for characterizing large amounts of data and selecting interesting entries with extreme features. When the population changes dynamically, the task of calculating a sequence of skyline sets is called continuous skyline computation. This task is known to be difficult to perform for the following reasons: (1) information of non-skyline entries must be stored since they may join the skyline in the future; (2) the appearance or disappearance of even a single entry can change the skyline drastically; (3) it is difficult to adopt a geometric acceleration algorithm for skyline computation tasks with high-dimensional datasets. Our new algorithm, called jointed rooted-tree (JR-tree), manages entries using a rooted tree structure. JR-tree delays extending the tree to deep levels to accelerate tree construction and traversal. In this study, we presented the difficulties in extracting entries tagged with a rare label in high-dimensional space and the potential of fast skyline computation in low-latency cell identification technology.
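    For concreteness, the following naive sketch shows what a skyline (Pareto optimal) set is: the entries not dominated by any other entry, where "dominates" here is assumed to mean at least as good in every attribute and strictly better in one, with larger values taken as better. It illustrates the definition only; the JR-tree structure described above exists to accelerate this quadratic-time computation and to support dynamic updates.

```python
# Hedged sketch: O(n^2) skyline extraction by pairwise dominance checks.
def dominates(a, b):
    """a dominates b: at least as good everywhere, strictly better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def skyline(points):
    result = []
    for p in points:
        if any(dominates(q, p) for q in points if q is not p):
            continue                  # p is dominated, so it is not a skyline entry
        result.append(p)
    return result

entries = [(3, 9), (5, 5), (9, 2), (4, 4), (8, 8), (1, 1)]
print(skyline(entries))               # -> [(3, 9), (9, 2), (8, 8)]
```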

  3. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  4. R for cloud computing an approach for data scientists

    CERN Document Server

    Ohri, A

    2014-01-01

    R for Cloud Computing looks at some of the tasks performed by business analysts on the desktop (PC era)  and helps the user navigate the wealth of information in R and its 4000 packages as well as transition the same analytics using the cloud.  With this information the reader can select both cloud vendors  and the sometimes confusing cloud ecosystem as well  as the R packages that can help process the analytical tasks with minimum effort and cost, and maximum usefulness and customization. The use of Graphical User Interfaces (GUI)  and Step by Step screenshot tutorials is emphasized in this book to lessen the famous learning curve in learning R and some of the needless confusion created in cloud computing that hinders its widespread adoption. This will help you kick-start analytics on the cloud including chapters on cloud computing, R, common tasks performed in analytics, scrutiny of big data analytics, and setting up and navigating cloud providers. Readers are exposed to a breadth of cloud computing ch...

  5. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  6. A Survey on PageRank Computing

    OpenAIRE

    Berkhin, Pavel

    2005-01-01

    This survey reviews the research related to PageRank computing. Components of a PageRank vector serve as authority weights for web pages independent of their textual content, based solely on the hyperlink structure of the web. PageRank is typically used as a web search ranking component. This defines the importance of the model and the data structures that underlie PageRank processing. Computing even a single PageRank is a difficult computational task. Computing many PageRanks is a much mor...
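
    As a reminder of what computing a single PageRank vector involves, the sketch below runs the standard power iteration on a toy link graph. The damping factor of 0.85, the handling of dangling nodes, and the four-page graph are illustrative assumptions, not material from the survey.

        # Power-iteration sketch of PageRank on a toy directed graph.
        # Damping factor and graph are illustrative assumptions.

        def pagerank(links, damping=0.85, iterations=50):
            nodes = list(links)
            n = len(nodes)
            rank = {v: 1.0 / n for v in nodes}
            for _ in range(iterations):
                new_rank = {v: (1.0 - damping) / n for v in nodes}
                for v, outgoing in links.items():
                    targets = outgoing or nodes          # dangling node: spread its rank uniformly
                    share = damping * rank[v] / len(targets)
                    for w in targets:
                        new_rank[w] += share
                rank = new_rank
            return rank

        toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
        print(pagerank(toy_web))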

  7. Effects of noise and task loading on a communication task

    Science.gov (United States)

    Orrell, Dean H., II

    Previous research had shown the effect of noise on a single communication task. This research has been criticized as not being representative of a real-world situation, since subjects allocated all of their attention to only one task. In the present study, the effect of adding a loading task to a standard noise-communication paradigm was investigated. Subjects performed both a communication task (Modified Rhyme Test; House et al. 1965) and a short-term memory task (Sternberg, 1969) in simulated levels of aircraft noise (95, 105 and 115 dB overall sound pressure level (OASPL)). Task loading was varied with Sternberg's task by requiring subjects to memorize one, four, or six alphanumeric characters. Simulated aircraft noise was varied between levels of 95, 105 and 115 dB OASPL using a pink noise source. Results show that the addition of Sternberg's task had little effect on the intelligibility of the communication task, while response time for the communication task increased.

  8. Who multi-tasks and why? Multi-tasking ability, perceived multi-tasking ability, impulsivity, and sensation seeking.

    Science.gov (United States)

    Sanbonmatsu, David M; Strayer, David L; Medeiros-Ward, Nathan; Watson, Jason M

    2013-01-01

    The present study examined the relationship between personality and individual differences in multi-tasking ability. Participants enrolled at the University of Utah completed measures of multi-tasking activity, perceived multi-tasking ability, impulsivity, and sensation seeking. In addition, they performed the Operation Span in order to assess their executive control and actual multi-tasking ability. The findings indicate that the persons who are most capable of multi-tasking effectively are not the persons who are most likely to engage in multiple tasks simultaneously. To the contrary, multi-tasking activity as measured by the Media Multitasking Inventory and self-reported cell phone usage while driving were negatively correlated with actual multi-tasking ability. Multi-tasking was positively correlated with participants' perceived multi-tasking ability, which was found to be significantly inflated. Participants with a strong approach orientation and a weak avoidance orientation--high levels of impulsivity and sensation seeking--reported greater multi-tasking behavior. Finally, the findings suggest that people often engage in multi-tasking because they are less able to block out distractions and focus on a singular task. Participants with less executive control--low scorers on the Operation Span task and persons high in impulsivity--tended to report higher levels of multi-tasking activity.

  9. Who multi-tasks and why? Multi-tasking ability, perceived multi-tasking ability, impulsivity, and sensation seeking.

    Directory of Open Access Journals (Sweden)

    David M Sanbonmatsu

    Full Text Available The present study examined the relationship between personality and individual differences in multi-tasking ability. Participants enrolled at the University of Utah completed measures of multi-tasking activity, perceived multi-tasking ability, impulsivity, and sensation seeking. In addition, they performed the Operation Span in order to assess their executive control and actual multi-tasking ability. The findings indicate that the persons who are most capable of multi-tasking effectively are not the persons who are most likely to engage in multiple tasks simultaneously. To the contrary, multi-tasking activity as measured by the Media Multitasking Inventory and self-reported cell phone usage while driving were negatively correlated with actual multi-tasking ability. Multi-tasking was positively correlated with participants' perceived multi-tasking ability, which was found to be significantly inflated. Participants with a strong approach orientation and a weak avoidance orientation--high levels of impulsivity and sensation seeking--reported greater multi-tasking behavior. Finally, the findings suggest that people often engage in multi-tasking because they are less able to block out distractions and focus on a singular task. Participants with less executive control--low scorers on the Operation Span task and persons high in impulsivity--tended to report higher levels of multi-tasking activity.

  10. Who Multi-Tasks and Why? Multi-Tasking Ability, Perceived Multi-Tasking Ability, Impulsivity, and Sensation Seeking

    Science.gov (United States)

    Sanbonmatsu, David M.; Strayer, David L.; Medeiros-Ward, Nathan; Watson, Jason M.

    2013-01-01

    The present study examined the relationship between personality and individual differences in multi-tasking ability. Participants enrolled at the University of Utah completed measures of multi-tasking activity, perceived multi-tasking ability, impulsivity, and sensation seeking. In addition, they performed the Operation Span in order to assess their executive control and actual multi-tasking ability. The findings indicate that the persons who are most capable of multi-tasking effectively are not the persons who are most likely to engage in multiple tasks simultaneously. To the contrary, multi-tasking activity as measured by the Media Multitasking Inventory and self-reported cell phone usage while driving were negatively correlated with actual multi-tasking ability. Multi-tasking was positively correlated with participants’ perceived multi-tasking ability, which was found to be significantly inflated. Participants with a strong approach orientation and a weak avoidance orientation – high levels of impulsivity and sensation seeking – reported greater multi-tasking behavior. Finally, the findings suggest that people often engage in multi-tasking because they are less able to block out distractions and focus on a singular task. Participants with less executive control - low scorers on the Operation Span task and persons high in impulsivity - tended to report higher levels of multi-tasking activity. PMID:23372720

  11. Task demand, task management, and teamwork

    Energy Technology Data Exchange (ETDEWEB)

    Braarud, Per Oeivind; Brendryen, Haavar

    2001-03-15

    The current approach to mental workload assessment in process control was evaluated in 3 previous HAMMLAB studies, by analysing the relationship between workload-related measures and performance. The results showed that subjective task complexity rating was related to the teams' control room performance, that mental effort (NASA-TLX) was weakly related to performance, and that overall activity level was unrelated to performance. The results support the argument that general cognitive measures, i.e., mental workload, are weakly related to performance in the process control domain. This implies that workload concepts other than general mental workload are needed for valid assessment of human reliability and of control room configurations. An assessment of task load in process control suggested that how effort is used to handle task demand is more important than the level of effort invested in solving the task. The report suggests two main workload-related concepts with potential as performance predictors in process control: task requirements, and the work style describing how effort is invested to solve the task. The task requirements are seen as composed of individual task demand and team demand. In a similar way, work style is seen as composed of individual task management and teamwork style. A framework for the development of the concepts is suggested, based on a literature review and experiences from HAMMLAB research. It is suggested that operational definitions of workload concepts should be based on observable control room behaviour, to assure a potential for developing performance-shaping factors. Finally, an explorative analysis of teamwork measures and performance in one study indicated that teamwork concepts are related to performance. This lends support to the suggested development of team demand and teamwork style as elements of a framework for the analysis of workload in process control. (Author)

  12. Computerization of the standard corsi block-tapping task affects its underlying cognitive concepts : A pilot study

    NARCIS (Netherlands)

    Claessen, Michiel H G; Van Der Ham, Ineke J M; Van Zandvoort, Martine J E

    2015-01-01

    The tablet computer initiates an important step toward computerized administration of neuropsychological tests. Because of its lack of standardization, the Corsi Block-Tapping Task could benefit from advantages inherent to computerization. This task, which requires reproduction of a sequence of
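
    The Corsi paradigm itself is straightforward to computerize: blocks are highlighted in sequences of increasing length and the participant's reproduction is scored until a span limit is reached. The sketch below is a generic illustration of that scoring logic under assumed rules (one trial per length, stop after two consecutive failures); it is not the tablet implementation evaluated in this pilot study.

        # Generic Corsi-style span scoring. Block count, sequence lengths, and the
        # stopping rule are illustrative assumptions, not the authors' protocol.
        import random

        def run_corsi(respond, n_blocks=9, start_len=2, max_len=9):
            span, failures = 0, 0
            for length in range(start_len, max_len + 1):
                target = random.sample(range(n_blocks), length)   # sequence of blocks to reproduce
                if respond(target) == target:
                    span, failures = length, 0                    # longest correctly reproduced sequence
                else:
                    failures += 1
                    if failures == 2:                             # two consecutive failures end the test
                        break
            return span

        # A perfect "participant", for demonstration only.
        print(run_corsi(lambda seq: list(seq)))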

  13. Computerization of the Standard Corsi Block-Tapping Task Affects Its Underlying Cognitive Concepts : A Pilot Study

    NARCIS (Netherlands)

    Claessen, Michiel H G; Van Der Ham, Ineke J M; Van Zandvoort, Martine J E

    2014-01-01

    The tablet computer initiates an important step toward computerized administration of neuropsychological tests. Because of its lack of standardization, the Corsi Block-Tapping Task could benefit from advantages inherent to computerization. This task, which requires reproduction of a sequence of

  14. Fundamentals of natural computing basic concepts, algorithms, and applications

    CERN Document Server

    de Castro, Leandro Nunes

    2006-01-01

    Introduction: A Small Sample of Ideas; The Philosophy of Natural Computing; The Three Branches: A Brief Overview; When to Use Natural Computing Approaches. Conceptualization: General Concepts. PART I - COMPUTING INSPIRED BY NATURE. Evolutionary Computing: Problem Solving as a Search Task; Hill Climbing and Simulated Annealing; Evolutionary Biology; Evolutionary Computing; The Other Main Evolutionary Algorithms; From Evolutionary Biology to Computing; Scope of Evolutionary Computing. Neurocomputing: The Nervous System; Artif
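
    Since the contents highlight hill climbing and simulated annealing as introductory search techniques, a compact sketch of simulated annealing on a one-dimensional toy objective is given below. The objective function, cooling schedule, and neighbourhood size are illustrative choices and are not examples taken from the book.

        # Simulated annealing on a toy 1-D objective; cooling schedule and
        # neighbourhood size are illustrative choices.
        import math
        import random

        def objective(x):
            return x * x + 10 * math.sin(x)      # toy function to minimise

        def simulated_annealing(x0, temp=10.0, cooling=0.95, steps=2000):
            x, best = x0, x0
            for _ in range(steps):
                candidate = x + random.uniform(-1.0, 1.0)          # local neighbour
                delta = objective(candidate) - objective(x)
                # Always accept improvements; accept worse moves with a probability
                # that shrinks as the temperature falls.
                if delta < 0 or random.random() < math.exp(-delta / temp):
                    x = candidate
                    if objective(x) < objective(best):
                        best = x
                temp *= cooling                                     # geometric cooling
            return best

        print(simulated_annealing(x0=5.0))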

  15. Capacity of small groups of muscles to accomplish precision grasping tasks.

    Science.gov (United States)

    Towles, Joseph D; Valero-Cuevas, Francisco J; Hentz, Vincent R

    2013-01-01

    An understanding of the capacity or ability of various muscle groups to generate endpoint forces that enable grasping tasks could provide a stronger biomechanical basis for the design of reconstructive surgery or rehabilitation for the treatment of the paralyzed or paretic hand. We quantified two-dimensional endpoint force distributions for every combination of the muscles of the index finger, in cadaveric specimens, to understand the capability of muscle groups to produce endpoint forces that accomplish three common types of grasps (tripod, tip, and lateral pinch), each characterized by a representative level of Coulomb friction. We found that muscle groups of 4 or fewer muscles were capable of generating endpoint forces that enabled performance of each of the grasping tasks examined. We also found that flexor muscles were crucial to accomplish tripod pinch; intrinsic muscles, tip pinch; and the dorsal interosseus muscle, lateral pinch. The results of this study provide a basis for decision making in the design of reconstructive surgeries and rehabilitation approaches that attempt to restore the ability to perform grasping tasks with small groups of muscles.
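
    Whether an endpoint force "accomplishes" a pinch at a given friction level reduces to checking that the force lies inside the Coulomb friction cone at the contact. The sketch below illustrates that check in two dimensions; the friction coefficient and force values are hypothetical and are not data from the cadaveric study.

        # 2-D Coulomb friction-cone check: a contact force can be sustained without
        # slip when |tangential| <= mu * normal. All values are hypothetical.

        def within_friction_cone(normal_force, tangential_force, mu):
            return normal_force > 0 and abs(tangential_force) <= mu * normal_force

        # Example: 4 N normal and 1 N (then 2 N) tangential force, with a
        # representative friction coefficient of 0.4.
        print(within_friction_cone(4.0, 1.0, mu=0.4))   # True:  1.0 <= 1.6
        print(within_friction_cone(4.0, 2.0, mu=0.4))   # False: 2.0 >  1.6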

  16. Storyboarding: A Method for Bootstrapping the Design of Computer-Based Educational Tasks

    Science.gov (United States)

    Jones, Ian

    2008-01-01

    There has been a recent call for the use of more systematic thought experiments when investigating learning. This paper presents a storyboarding method for capturing and sharing initial ideas and their evolution in the design of a mathematics learning task. The storyboards produced can be considered as "virtual data" created by thought experiments…

  17. New Computational Model Based on Finite Element Method to Quantify Damage Evolution Due to External Sulfate Attack on Self-Compacting Concretes

    KAUST Repository

    Khelifa, Mohammed Rissel

    2012-12-27

    Abstract: This work combines experimental and numerical investigations to study the mechanical degradation of self-compacting concrete under accelerated aging conditions. Four different experimental treatments are tested; among them, the constant-immersion and immersion-drying protocols allow an efficient external sulfate attack on the material. Significant damage is observed due to interfacial ettringite. A predictive analysis is then adopted to quantify the relationship between ettringite growth and mechanical damage evolution during aging. Typical 3D microstructures representing the cement paste-aggregate structures are generated using a Monte Carlo scheme. These images are converted into a finite element model to predict the mechanical performance under different criteria of damage kinetics. The effect of ettringite is then associated with the development of an interphase of lower mechanical properties. Our results show that the observed time evolution of Young's modulus is best described by a linear increase of the interphase content. Our model results also indicate that the interphase regions grow at maximum-stress regions rather than exclusively at interfaces. Finally, constant immersion predicts a rate of damage growth five times lower than that of the immersion-drying protocol. © 2012 Computer-Aided Civil and Infrastructure Engineering.
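
    As a back-of-the-envelope illustration of how a linearly growing interphase fraction degrades stiffness, the sketch below applies a simple series (Reuss-type) rule of mixtures. This is not the finite element model of the paper; the moduli, growth rate, and time points are all assumed values.

        # Rule-of-mixtures sketch (not the paper's FE model): effective Young's
        # modulus as the interphase volume fraction grows linearly in time.
        # All numbers are illustrative assumptions.

        def effective_modulus(t, e_paste=25.0, e_interphase=5.0, growth_rate=0.01):
            """Series (Reuss) estimate in GPa; interphase fraction = growth_rate * t."""
            phi = min(growth_rate * t, 1.0)                 # interphase volume fraction
            return 1.0 / ((1.0 - phi) / e_paste + phi / e_interphase)

        for month in (0, 6, 12, 24):
            print(month, round(effective_modulus(month), 2))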

  18. New Computational Model Based on Finite Element Method to Quantify Damage Evolution Due to External Sulfate Attack on Self-Compacting Concretes

    KAUST Repository

    Khelifa, Mohammed Rissel; Guessasma, Sofiane

    2012-01-01

    Abstract: This work combines experimental and numerical investigations to study the mechanical degradation of self-compacting concrete under accelerated aging conditions. Four different experimental treatments are tested; among them, the constant-immersion and immersion-drying protocols allow an efficient external sulfate attack on the material. Significant damage is observed due to interfacial ettringite. A predictive analysis is then adopted to quantify the relationship between ettringite growth and mechanical damage evolution during aging. Typical 3D microstructures representing the cement paste-aggregate structures are generated using a Monte Carlo scheme. These images are converted into a finite element model to predict the mechanical performance under different criteria of damage kinetics. The effect of ettringite is then associated with the development of an interphase of lower mechanical properties. Our results show that the observed time evolution of Young's modulus is best described by a linear increase of the interphase content. Our model results also indicate that the interphase regions grow at maximum-stress regions rather than exclusively at interfaces. Finally, constant immersion predicts a rate of damage growth five times lower than that of the immersion-drying protocol. © 2012 Computer-Aided Civil and Infrastructure Engineering.

  19. Automation of Educational Tasks for Academic Radiology.

    Science.gov (United States)

    Lamar, David L; Richardson, Michael L; Carlson, Blake

    2016-07-01

    The process of education involves a variety of repetitious tasks. We believe that appropriate computer tools can automate many of these chores, and allow both educators and their students to devote a lot more of their time to actual teaching and learning. This paper details tools that we have used to automate a broad range of academic radiology-specific tasks on Mac OS X, iOS, and Windows platforms. Some of the tools we describe here require little expertise or time to use; others require some basic knowledge of computer programming. We used TextExpander (Mac, iOS) and AutoHotKey (Win) for automated generation of text files, such as resident performance reviews and radiology interpretations. Custom statistical calculations were performed using TextExpander and the Python programming language. A workflow for automated note-taking was developed using Evernote (Mac, iOS, Win) and Hazel (Mac). Automated resident procedure logging was accomplished using Editorial (iOS) and Python. We created three variants of a teaching session logger using Drafts (iOS) and Pythonista (iOS). Editorial and Drafts were used to create flashcards for knowledge review. We developed a mobile reference management system for iOS using Editorial. We used the Workflow app (iOS) to automatically generate a text message reminder for daily conferences. Finally, we developed two separate automated workflows-one with Evernote (Mac, iOS, Win) and one with Python (Mac, Win)-that generate simple automated teaching file collections. We have beta-tested these workflows, techniques, and scripts on several of our fellow radiologists. All of them expressed enthusiasm for these tools and were able to use one or more of them to automate their own educational activities. Appropriate computer tools can automate many educational tasks, and thereby allow both educators and their students to devote a lot more of their time to actual teaching and learning. Copyright © 2016 The Association of University Radiologists
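
    The record mentions Python-based procedure logging among the automations. As a generic illustration of that kind of chore, the sketch below appends a timestamped row to a CSV procedure log; the file name and fields are hypothetical and are not taken from the authors' workflow.

        # Generic sketch of automated procedure logging: append a timestamped row
        # to a CSV file. File name and columns are hypothetical.
        import csv
        from datetime import datetime
        from pathlib import Path

        LOG_FILE = Path("procedure_log.csv")   # hypothetical log location

        def log_procedure(procedure, patient_mrn, supervisor):
            new_file = not LOG_FILE.exists()
            with LOG_FILE.open("a", newline="") as f:
                writer = csv.writer(f)
                if new_file:
                    writer.writerow(["timestamp", "procedure", "mrn", "supervisor"])
                writer.writerow([datetime.now().isoformat(timespec="seconds"),
                                 procedure, patient_mrn, supervisor])

        log_procedure("US-guided thyroid FNA", "0000000", "Dr. Example")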

  20. Task Group on Computer/Communication Protocols for Bibliographic Data Exchange. Interim Report = Groupe de Travail sur les Protocoles de Communication/Ordinateurs pour l'Exchange de Donnees Bibliographiques. Rapport d'Etape. May 1983.

    Science.gov (United States)

    Canadian Network Papers, 1983

    1983-01-01

    This preliminary report describes the work to date of the Task Group on Computer/Communication protocols for Bibliographic Data Interchange, which was formed in 1980 to develop a set of protocol standards to facilitate communication between heterogeneous library and information systems within the framework of Open Systems Interconnection (OSI). A…