WorldWideScience

Sample records for computer tasks quantified

  1. Eye blink frequency during different computer tasks quantified by electrooculography

    DEFF Research Database (Denmark)

    Skotte, J H; Nøjgaard, J K; Jørgensen, L V

    2007-01-01

    The purpose of the study was to evaluate electrooculography (EOG) as an automatic method to measure the human eye blink frequency (BF) during passive and interactive computer tasks performed at two screen heights. Ten healthy subjects (5 males and 5 females) participated in the study in a 23… degrees C temperature and 30-35% relative humidity controlled simulated office environment. Each test subject completed a 2 × 10 min active task of computer work and a 3 × 10 min passive task of watching a film on a video display unit (VDU). Both tasks included two viewing angles: standard (the monitors… counted manually from the video recordings and compared to the EOG measurements. The method showed high validity for detecting blinks during computer work: 95.4% of the blinks were retrieved by the EOG method and very few artefacts from eye movements were erroneously classified as eye blinks (2.4%). By use…
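
    The abstract reports validity figures but not the detection logic. As a rough illustration of automatic blink counting from a vertical EOG trace, here is a peak-over-threshold sketch; the 100 µV threshold, the 0.2 s refractory period and the synthetic signal are assumptions rather than the study's parameters.

    ```python
    import numpy as np

    def detect_blinks(veog, fs, threshold=100.0, min_separation_s=0.2):
        """Crude peak-over-threshold blink detector for a vertical EOG trace (microvolts)."""
        above = veog > threshold
        onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1   # upward threshold crossings
        blinks, last = [], -np.inf
        for i in onsets:
            if (i - last) / fs >= min_separation_s:            # ignore re-crossings within a blink
                blinks.append(i)
                last = i
        return np.asarray(blinks, dtype=int)

    # Synthetic 10 s trace at 250 Hz with two blink-like deflections
    fs = 250
    veog = 20 * np.random.randn(10 * fs)
    veog[500:520] += 300
    veog[1500:1520] += 300
    n_blinks = len(detect_blinks(veog, fs))                    # expected: 2
    bf = 60.0 * n_blinks / (len(veog) / fs)                    # blink frequency in blinks/min
    print(n_blinks, bf)
    ```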

  2. LHCb computing tasks

    CERN Document Server

    Binko, P

    1998-01-01

    This document describes the computing tasks of the LHCb computing system. It also describes the logistics of the dataflow between the tasks and the detailed requirements for each task, in particular the data sizes and CPU power requirements. All data sizes are calculated assuming that the LHCb experiment will take data about 10⁷ s per year at a frequency of 200 Hz, which gives 2 × 10⁹ real events per year. The raw event size should not exceed 100 kB (200 TB per year). We will have to generate about 10⁹ Monte Carlo events per year. The current Monte Carlo simulation program based on the GEANT3.21 package requires about 12 s to produce an average event (all CPU times are normalised to a 1000 MIPS processor). The size of an average Monte Carlo event will be about 200 kB (100 TB per year) of simulated data (without the hits). We will start to use the GEANT4 package in 1998. Rejection factors of 8 and 25 are required in the Level-2 and Level-3 triggers respectively, to reduce the frequency of events to 200 Hz. T…
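
    The quoted yearly volumes follow from simple arithmetic on the numbers given in the abstract. A quick back-of-envelope check (the implied Level-2 input rate is a derived figure, not stated in the abstract):

    ```python
    seconds_per_year = 1e7        # assumed LHCb live time per year (from the abstract)
    rate_hz = 200                 # event rate after the trigger
    raw_event_kb = 100            # raw event size

    events_per_year = seconds_per_year * rate_hz                     # 2e9 events
    raw_tb_per_year = events_per_year * raw_event_kb * 1e3 / 1e12    # ~200 TB (1 TB = 1e12 B)

    # Implied input rate to the Level-2 trigger, given rejection factors of 8 and 25
    level2_input_hz = rate_hz * 25 * 8                               # 40 kHz
    print(events_per_year, raw_tb_per_year, level2_input_hz)
    ```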

  3. Quantifying tasks and roles in insect societies

    African Journals Online (AJOL)

    1991-05-15

    May 15, 1991 … ergonomic selection, and was associated with the evolution of increasing behavioural … The sequence in which tasks are performed by workers, … space, providing a pictorial representation of the association between the …

  4. Computer-Related Task Performance

    DEFF Research Database (Denmark)

    Longstreet, Phil; Xiao, Xiao; Sarker, Saonee

    2016-01-01

    The existing information system (IS) literature has acknowledged computer self-efficacy (CSE) as an important factor contributing to enhancements in computer-related task performance. However, the empirical results of CSE on performance have not always been consistent, and increasing an individual's CSE is often a cumbersome process. Thus, we introduce the theoretical concept of self-prophecy (SP) and examine how this social influence strategy can be used to improve computer-related task performance. Two experiments are conducted to examine the influence of SP on task performance. Results show that SP and CSE interact to influence performance. Implications are then discussed in terms of organizations’ ability to increase performance.

  5. Computer task performance by subjects with Duchenne muscular dystrophy.

    Science.gov (United States)

    Malheiros, Silvia Regina Pinheiro; da Silva, Talita Dias; Favero, Francis Meire; de Abreu, Luiz Carlos; Fregni, Felipe; Ribeiro, Denise Cardoso; de Mello Monteiro, Carlos Bandeira

    2016-01-01

    Two specific objectives were established to quantify computer task performance among people with Duchenne muscular dystrophy (DMD). First, we compared simple computational task performance between subjects with DMD and age-matched typically developing (TD) subjects. Second, we examined correlations between the ability of subjects with DMD to learn the computational task and their motor functionality, age, and initial task performance. The study included 84 individuals (42 with DMD, mean age of 18±5.5 years, and 42 age-matched controls). They executed a computer maze task; all participants performed the acquisition (20 attempts) and retention (five attempts) phases, repeating the same maze. A different maze was used to verify transfer performance (five attempts). The Motor Function Measure Scale was applied, and the results were compared with maze task performance. In the acquisition phase, a significant decrease was found in movement time (MT) between the first and last acquisition block, but only for the DMD group. For the DMD group, MT during transfer was shorter than during the first acquisition block, indicating improvement from the first acquisition block to transfer. In addition, the TD group showed shorter MT than the DMD group across the study. DMD participants improved their performance after practicing a computational task; however, the difference in MT was present in all attempts among DMD and control subjects. Computational task improvement was positively influenced by the initial performance of individuals with DMD. In turn, the initial performance was influenced by their distal functionality but not their age or overall functionality.

  6. Model Checking Quantified Computation Tree Logic

    NARCIS (Netherlands)

    Rensink, Arend; Baier, C; Hermanns, H.

    2006-01-01

    Propositional temporal logic is not suitable for expressing properties on the evolution of dynamically allocated entities over time. In particular, it is not possible to trace such entities through computation steps, since this requires the ability to freely mix quantification and temporal

  7. Computational tasks in robotics and factory automation

    NARCIS (Netherlands)

    Biemans, Frank P.; Vissers, C.A.

    1988-01-01

    The design of Manufacturing Planning and Control Systems (MPCSs), systems that negotiate with Customers and Suppliers to exchange products in return for money in order to generate profit, is discussed. The computational tasks of MPCS components are systematically specified as a starting point for

  8. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention

    Directory of Open Access Journals (Sweden)

    Frère Annie F

    2011-08-01

    Full Text Available Abstract Background Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on the performance of people in a fun and interactive way, being appropriate for the target audience. Methods An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret the hints scattered in different scenarios. Two versions of this game were implemented. In the first, hints and information boards were painted using red and green colours. In the second version, these objects were painted using blue and yellow colours. For modelling, texturing, and animating virtual characters and objects the three-dimensional computer graphics tool Blender 3D was used. The textures were created with the GIMP editor to provide visual effects increasing the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers who were divided into two subgroups (A1 and A2) and 20 volunteers with ADHD who were divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with the hints painted in green-red colors, and subgroups A2 and B2 the second version using the same hints now painted in blue-yellow. The time spent to complete each task of the game was measured. Results Data analyzed with two-way ANOVA and post-hoc Tukey LSD showed that the use of blue/yellow instead of green/red colors decreased the game performance of all participants. However, a greater decrease in performance could be observed with ADHD participants

  9. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention.

    Science.gov (United States)

    Silva, Alessandro P; Frère, Annie F

    2011-08-19

    Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on the performance of people in a fun and interactive way, being appropriate for the target audience. An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret the hints scattered in different scenarios. Two versions of this game were implemented. In the first, hints and information boards were painted using red and green colours. In the second version, these objects were painted using blue and yellow colours. For modelling, texturing, and animating virtual characters and objects the three-dimensional computer graphics tool Blender 3D was used. The textures were created with the GIMP editor to provide visual effects increasing the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers who were divided into two subgroups (A1 and A2) and 20 volunteers with ADHD who were divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with the hints painted in green-red colors, and subgroups A2 and B2 the second version using the same hints now painted in blue-yellow. The time spent to complete each task of the game was measured. Data analyzed with two-way ANOVA and post-hoc Tukey LSD showed that the use of blue/yellow instead of green/red colors decreased the game performance of all participants. However, a greater decrease in performance could be observed with ADHD participants, where tasks that require attention were most affected

  10. The task-to-task communication between computers

    International Nuclear Information System (INIS)

    Lin Shuzi; Zhang Bingyun; Zhao Weiren

    1992-01-01

    Task-to-task communication is used at the Institute of High Energy Physics. The BES (Beijing Spectrometer) uses this communication mode to periodically fetch the BEPC (Beijing Electron Positron Collider) running parameters needed by the BES experiments. The authors describe the principle of transparent task-to-task communication and how it is used in the BES on-line data acquisition system

  11. Quantifying the Physiological Stress Response to Simulated Maritime Pilotage Tasks: The Influence of Task Complexity and Pilot Experience.

    Science.gov (United States)

    Main, Luana C; Wolkow, Alexander; Chambers, Timothy P

    2017-11-01

    The aim of this study was to quantify the stress associated with performing maritime pilotage tasks in a high-fidelity simulator. Eight trainee and 13 maritime pilots completed two simulated pilotage tasks of varying complexity. Salivary cortisol samples were collected pre- and post-simulation for both trials. Heart rate was measured continuously throughout the study. Significant changes in salivary cortisol (P = 0.000, η² = 0.139), average (P = 0.006, η² = 0.087), and peak heart rate (P = 0.013, η² = 0.077) from pre- to post-simulation were found. Varying task complexity did partially influence the stress response; average (P = 0.016, η² = 0.026) and peak heart rate (P = 0.034, η² = 0.020) were higher in the experimental condition. Trainees also recorded higher average (P = 0.000, η² = 0.054) and peak heart rates (P = 0.027, η² = 0.022). Performing simulated pilotage tasks evoked a measurable stress response in both trainee and expert maritime pilots.

  12. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were

  13. The Composite Strain Index (COSI) and Cumulative Strain Index (CUSI): methodologies for quantifying biomechanical stressors for complex tasks and job rotation using the Revised Strain Index.

    Science.gov (United States)

    Garg, Arun; Moore, J Steven; Kapellusch, Jay M

    2017-08-01

    The Composite Strain Index (COSI) quantifies biomechanical stressors for complex tasks consisting of exertions at different force levels and/or with different exertion times. The Cumulative Strain Index (CUSI) further integrates biomechanical stressors from different tasks to quantify exposure for the entire work shift. The paper provides methodologies to compute COSI and CUSI, along with examples. Complex task simulation produced 169,214 distinct tasks. Use of average force, time-weighted average (TWA) force, peak force and COSI classified 66.9%, 28.2%, 100% and 38.9% of tasks as hazardous, respectively. For job rotation the simulation produced 10,920 distinct jobs. TWA COSI, peak task COSI and CUSI classified 36.5%, 78.1% and 66.6% of jobs as hazardous, respectively. The results suggest that the TWA approach systematically underestimates biomechanical stressors and the peak approach overestimates them, both at the task and the job level. It is believed that COSI and CUSI partially address these underestimations and overestimations of biomechanical stressors. Practitioner Summary: COSI quantifies exposure when the applied hand force and/or the duration of that force changes during a task cycle. CUSI integrates physical exposures from job rotation. These should be valuable tools for designing and analysing tasks and job rotation to determine the risk of musculoskeletal injuries.
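
    As a small illustration of why time-weighted average (TWA) and peak force bracket the exposure of a complex task, the sketch below computes both for a task cycle made up of several exertions; the COSI/CUSI multipliers of the Revised Strain Index themselves are not reproduced here, and the numbers are invented.

    ```python
    def twa_and_peak_force(exertions):
        """exertions: (force in %MVC, duration in s) pairs within one task cycle.
        Returns the time-weighted average force and the peak force."""
        total_time = sum(d for _, d in exertions)
        twa = sum(f * d for f, d in exertions) / total_time
        peak = max(f for f, _ in exertions)
        return twa, peak

    # A task with a brief forceful exertion and a long light one (illustrative numbers):
    # TWA (15 %MVC) downplays the forceful part, peak (60 %MVC) ignores the light part.
    print(twa_and_peak_force([(60.0, 2.0), (10.0, 18.0)]))
    ```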

  14. Report of the Task Force on Computer Charging.

    Science.gov (United States)

    Computer Co-ordination Group, Ottawa (Ontario).

    The objectives of the Task Force on Computer Charging as approved by the Committee of Presidents of Universities of Ontario were: (1) to identify alternative methods of costing computing services; (2) to identify alternative methods of pricing computing services; (3) to develop guidelines for the pricing of computing services; (4) to identify…

  15. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    Science.gov (United States)

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  16. PARTICLE SWARM OPTIMIZATION OF TASK SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    Payal Jaglan*, Chander Diwakar

    2016-01-01

    Resource provisioning and pricing modeling in cloud computing make it an inevitable technology on both the developer and the consumer end. Easy accessibility of software and freedom of hardware configuration increase its demand in the IT industry, as does its ability to provide a user-friendly environment, software independence, quality, a pricing index and easy accessibility of infrastructure via the internet. Task scheduling plays an important role in cloud computing systems. Task scheduling in cloud computing mea…

  17. Quantitative analysis of task selection for brain-computer interfaces

    Science.gov (United States)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
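
    A minimal sketch of the task-pair selection idea: per-user, per-pair accuracies (synthetic here) are compared against the single best pair fixed across subjects; the 70% accuracy criterion for "being able to control a binary BCI" is a common convention and an assumption, not a figure taken from this paper.

    ```python
    import numpy as np

    # acc[u, p]: cross-validated accuracy of user u with imagery task-pair p (placeholder data)
    rng = np.random.default_rng(0)
    acc = rng.uniform(0.5, 0.95, size=(109, 3))

    best_pair_per_user = acc.argmax(axis=1)          # subject-dependent optimal task-pair
    fixed_best_pair = acc.mean(axis=0).argmax()      # best single pair fixed across subjects

    threshold = 0.70                                 # assumed control criterion
    n_fixed = int((acc[:, fixed_best_pair] >= threshold).sum())
    n_selected = int((acc[np.arange(len(acc)), best_pair_per_user] >= threshold).sum())
    print(f"users above threshold: fixed pair {n_fixed}, subject-selected pair {n_selected}")
    ```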

  18. Task and Interruption Management in Activity-Centric Computing

    DEFF Research Database (Denmark)

    Jeuris, Steven

    to address these not in isolation, but by fundamentally reevaluating the current computing paradigm. To this end, activity-centric computing has been brought forward as an alternative computing paradigm, addressing the increasing strain put on modern-day computing systems. Activity-centric computing follows… the scalability and intelligibility of current research prototypes. In this dissertation, I postulate that such issues arise due to a lack of support for the full set of practices which make up activity management. Most notably, although task and interruption management are an integral part of personal information management, they have thus far been neglected in prior activity-centric computing systems. Advancing the research agenda of activity-centric computing, I (1) implement and evaluate an activity-centric desktop computing system, incorporating support for interruptions and long-term task management…

  19. Coherence and computational complexity of quantifier-free dependence logic formulas

    NARCIS (Netherlands)

    Kontinen, J.; Kontinen, J.; Väänänen, J.

    2010-01-01

    We study the computational complexity of model checking for quantifier-free dependence logic (D) formulas. We point out three thresholds in the computational complexity: logarithmic space, non-deterministic logarithmic space and non-deterministic polynomial time.

  20. Mental workload during n-back task - quantified in the prefrontal cortex using fNIRS

    Directory of Open Access Journals (Sweden)

    Christian eHerff

    2014-01-01

    Full Text Available When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g. interacting with the car navigation system while driving) it is desired to not distract the users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, a robust single-trial classification is a crucial prerequisite for the purpose of dynamically adapting HCIs to the workload of its user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS can be used to robustly quantify and classify mental workload. Single-trial analysis is still a young field that suffers from a general lack of standards. To increase comparability of fNIRS methods and results, the data corpus for this study is made available online.

  1. Cloud Computing Task Scheduling Based on Cultural Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Li Jian-Wen

    2016-01-01

    Full Text Available A task scheduling strategy based on a cultural genetic algorithm (CGA) is proposed in order to improve the efficiency of task scheduling in the cloud computing platform, targeting the minimization of the total time and cost of task scheduling. The improved genetic algorithm is used to construct the main population space and the knowledge space under the cultural framework; these evolve independently and in parallel, forming a mechanism of mutual promotion to dispatch cloud tasks. At the same time, to counter the genetic algorithm's tendency to fall into local optima, a non-uniform mutation operator is introduced to improve the search performance of the algorithm. The experimental results show that CGA reduces the total time and lowers the cost of the scheduling, making it an effective algorithm for cloud task scheduling.

  2. Optimal usage of computing grid network in the fields of nuclear fusion computing task

    International Nuclear Information System (INIS)

    Tenev, D.

    2006-01-01

    Nowadays nuclear power is becoming the main source of energy. To make its usage more efficient, scientists have created complicated simulation models, which require powerful computers. Grid computing is the answer to the need for powerful and accessible computing resources. The article examines and estimates the optimal configuration of the grid environment for complicated nuclear fusion computing tasks. (author)

  3. A Computational Approach to Quantifiers as an Explanation for Some Language Impairments in Schizophrenia

    Science.gov (United States)

    Zajenkowski, Marcin; Styla, Rafal; Szymanik, Jakub

    2011-01-01

    We compared the processing of natural language quantifiers in a group of patients with schizophrenia and a healthy control group. In both groups, the difficulty of the quantifiers was consistent with computational predictions, and patients with schizophrenia took more time to solve the problems. However, they were significantly less accurate only…

  4. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    Science.gov (United States)

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  5. Quantifying the debonding of inclusions through tomography and computational homology.

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Wei-Yang; Johnson, George C. (University of California, Berkeley, Berkeley, CA); Mota, Alejandro; Foulk, James W., III; Jin, Huiqing

    2010-09-01

    This report describes a Laboratory Directed Research and Development (LDRD) project to use synchrotron-radiation computed tomography (SRCT) data to determine the conditions and mechanisms that lead to void nucleation in rolled alloys. The Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory (LBNL) has provided SRCT data for a few specimens of 7075-T7351 aluminum plate (widely used for aerospace applications) stretched to failure, loaded in directions perpendicular and parallel to the rolling direction. The resolution of the SRCT data is 900 nm, which allows elucidation of the mechanisms governing void growth and coalescence. This resolution is not fine enough, however, for nucleation. We propose the use of statistics and image-processing techniques to obtain sub-resolution-scale information from these data, and thus determine where in the specimen and when during the loading program nucleation occurs and the mechanisms that lead to it. Quantitative analysis of the tomography data, however, leads to the conclusion that the reconstruction process compromises the information obtained from the scans. Alternate, more powerful reconstruction algorithms are needed to address this problem, but those fall beyond the scope of this project.

  6. Task conflict and proactive control: A computational theory of the Stroop task.

    Science.gov (United States)

    Kalanthroff, Eyal; Davelaar, Eddy J; Henik, Avishai; Goldfarb, Liat; Usher, Marius

    2018-01-01

    The Stroop task is a central experimental paradigm used to probe cognitive control by measuring the ability of participants to selectively attend to task-relevant information and inhibit automatic task-irrelevant responses. Research has revealed variability in both experimental manipulations and individual differences. Here, we focus on a particular source of Stroop variability, the reverse-facilitation (RF; faster responses to nonword neutral stimuli than to congruent stimuli), which has recently been suggested as a signature of task conflict. We first review the literature that shows RF variability in the Stroop task, both with regard to experimental manipulations and to individual differences. We suggest that task conflict variability can be understood as resulting from the degree of proactive control that subjects recruit in advance of the Stroop stimulus. When the proactive control is high, task conflict does not arise (or is resolved very quickly), resulting in regular Stroop facilitation. When proactive control is low, task conflict emerges, leading to a slow-down in congruent and incongruent (but not in neutral) trials and thus to Stroop RF. To support this suggestion, we present a computational model of the Stroop task, which includes the resolution of task conflict and its modulation by proactive control. Results show that our model (a) accounts for the variability in Stroop-RF reported in the experimental literature, and (b) solves a challenge to previous Stroop models: their ability to account for reaction time distributional properties. Finally, we discuss theoretical implications to Stroop measures and control deficits observed in some psychopathologies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Sort-Mid tasks scheduling algorithm in grid computing

    Directory of Open Access Journals (Sweden)

    Naglaa M. Reda

    2015-11-01

    Full Text Available Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. The main aim of several researchers has been to develop variant scheduling algorithms for achieving optimality, and these have shown good performance for task scheduling with regard to resource selection. However, using the full power of the resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize the makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The base step is to get the average value by sorting the list of completion times of each task. Then, the maximum average is obtained. Finally, the task that has the maximum average is allocated to the machine that has the minimum completion time. The allocated task is removed and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms almost all other algorithms in terms of resource utilization and makespan.

  8. Sort-Mid tasks scheduling algorithm in grid computing.

    Science.gov (United States)

    Reda, Naglaa M; Tawfik, A; Marzok, Mohamed A; Khamis, Soheir M

    2015-11-01

    Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. The main aim of several researchers has been to develop variant scheduling algorithms for achieving optimality, and these have shown good performance for task scheduling with regard to resource selection. However, using the full power of the resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize the makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The base step is to get the average value by sorting the list of completion times of each task. Then, the maximum average is obtained. Finally, the task that has the maximum average is allocated to the machine that has the minimum completion time. The allocated task is removed and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms almost all other algorithms in terms of resource utilization and makespan.
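
    A sketch of one plausible reading of the Sort-Mid steps described above, where the "average value via sorting" of a task's completion-time list is taken to be its middle (median) element; this interpretation and the toy execution-time matrix are assumptions, not the authors' published formulation.

    ```python
    import numpy as np

    def sort_mid(exec_time):
        """exec_time[i, j]: execution time of task i on machine j.
        Repeatedly pick the task with the largest 'mid' completion-time value and
        place it on the machine that gives its minimum completion time."""
        n_tasks, n_machines = exec_time.shape
        ready = np.zeros(n_machines)                        # current load of each machine
        unassigned, schedule = set(range(n_tasks)), {}
        while unassigned:
            tasks = sorted(unassigned)
            ct = exec_time[tasks, :] + ready                # completion times of remaining tasks
            mid = np.sort(ct, axis=1)[:, n_machines // 2]   # "mid" value per task (assumed median)
            pick = tasks[int(np.argmax(mid))]               # task with the maximum mid value
            machine = int(np.argmin(exec_time[pick] + ready))
            schedule[pick] = machine
            ready[machine] += exec_time[pick, machine]
            unassigned.remove(pick)
        return schedule, float(ready.max())                 # assignment and makespan

    times = np.array([[3.0, 5.0, 4.0], [2.0, 4.0, 7.0], [6.0, 1.0, 3.0]])
    print(sort_mid(times))
    ```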

  9. Quantified measurement of brain blood volume: comparative evaluations between single photon emission computed tomography and positron computed tomography

    International Nuclear Information System (INIS)

    Bouvard, G.; Fernandez, Y.; Petit-Taboue, M.C.; Derlon, J.M.; Travere, J.M.; Le Poec, C.

    1991-01-01

    The quantified measurement of cerebral blood volume is of interest for studies of brain blood circulation. This measurement is often performed with positron computed tomography. It is more difficult with single photon emission computed tomography: there are physical problems due to the limited resolution of the detector, the Compton effect and photon attenuation. The objective of this study is to compare the results obtained with these two techniques. The quantified measurement of brain blood volume is possible with single photon emission computed tomography; however, there is a loss of contrast [fr]

  10. Sympathetic nervous system activity measured by skin conductance quantifies the challenge of walking adaptability tasks after stroke.

    Science.gov (United States)

    Clark, David J; Chatterjee, Sudeshna A; McGuirk, Theresa E; Porges, Eric C; Fox, Emily J; Balasubramanian, Chitralakshmi K

    2018-02-01

    Walking adaptability tasks are challenging for people with motor impairments. The construct of perceived challenge is typically measured by self-report assessments, which are susceptible to subjective measurement error. The development of an objective, physiologically based measure of challenge may help to improve the ability to assess this important aspect of mobility function. The objective of this study was to investigate the use of sympathetic nervous system (SNS) activity measured by skin conductance to gauge the physiological stress response to challenging walking adaptability tasks in people post-stroke. Thirty adults with chronic post-stroke hemiparesis performed a battery of seventeen walking adaptability tasks. SNS activity was measured by skin conductance from the palmar surface of each hand. The primary outcome variable was the percent change in skin conductance level (ΔSCL) between the baseline resting and walking phases of each task. Task difficulty was measured by performance speed and by physical therapist scoring of performance. Walking function and balance confidence were measured by preferred walking speed and the Activities-specific Balance Confidence Scale, respectively. There was a statistically significant negative association between ΔSCL and task performance speed and between ΔSCL and clinical score, indicating that tasks with greater SNS activity had slower performance speed and poorer clinical scores. ΔSCL was significantly greater for low-functioning participants versus high-functioning participants, particularly during the most challenging walking adaptability tasks. This study supports the use of SNS activity measured by skin conductance as a valuable approach for objectively quantifying the perceived challenge of walking adaptability tasks in people post-stroke. Published by Elsevier B.V.
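
    The primary outcome is a simple percent change; a minimal sketch with invented skin conductance values (in microsiemens):

    ```python
    import numpy as np

    def delta_scl(baseline_scl, walking_scl):
        """Percent change in mean skin conductance level from baseline rest to walking."""
        base = np.mean(baseline_scl)
        walk = np.mean(walking_scl)
        return 100.0 * (walk - base) / base

    # Illustrative samples for one walking adaptability task
    print(delta_scl(baseline_scl=[2.1, 2.2, 2.0], walking_scl=[2.9, 3.1, 3.0]))
    ```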

  11. The ATB Framework: Quantifying and Classifying Epistemic Strategies in Tangible Problem-Solving Tasks

    NARCIS (Netherlands)

    Esteves, A.E.; Bakker, S.; Antle, A.N. (Alissa); May, A.; Warren, J.; Oakley, I.

    2015-01-01

    In task performance, pragmatic actions refer to behaviors that make direct progress, while epistemic actions involve altering the world so that cognitive processes are faster, more reliable or less taxing. Epistemic actions are frequently presented as a beneficial consequence of interacting with

  12. APPLICATION OF COMPUTER-AIDED TOMOGRAPHY TO VISUALIZE AND QUANTIFY BIOGENIC STRUCTURES IN MARINE SEDIMENTS

    Science.gov (United States)

    We used computer-aided tomography (CT) for 3D visualization and 2D analysis of marine sediment cores from 3 stations (at 10, 75 and 118 m depths) with different environmental impact. Biogenic structures such as tubes and burrows were quantified and compared among st...

  13. Quantifying cross-linguistic influence with a computational model : A study of case-marking comprehension

    NARCIS (Netherlands)

    Matusevych, Yevgen; Alishahi, Afra; Backus, Albert

    2017-01-01

    Cross-linguistic influence (CLI) is one of the key phenomena in bilingual and second language learning. We propose a method for quantifying CLI in the use of linguistic constructions with the help of a computational model, which acquires constructions in two languages from bilingual input. We focus

  14. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention

    OpenAIRE

    Frère Annie F; Silva Alessandro P

    2011-01-01

    Abstract Background Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment that is capable of quantifying the influence of red-green versus blue-yellow colour stimuli on the performance of people in a fun and interactive way, being appropriate for the target audience. Methods An interactive c...

  15. Classifying and quantifying human error in routine tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Pederson, O.M.; Rasmussen, J.; Carnino, A.; Gagnolet, P.; Griffon, M.; Mancini, G.

    1982-01-01

    This paper results from the work of the OECD/NEA-CSNI Group of Experts on Human Error Data and Assessment. It proposes a classification system (or taxonomy) for use in reporting events involving human malfunction, especially those occurring during the execution of routine tasks. A set of data collection sheets based on this taxonomy has been designed. They include the information needed in order to ensure adequate quality and coherence of the raw data. The sources from which the various data should be obtainable are identified, as are the persons who should analyze them. Improving data collection systems is an iterative process. Therefore Group members are currently making trial applications of the taxonomy to previously analysed real incidents. Results from the initial round of trials are presented and discussed

  16. Using frequency tagging to quantify attentional deployment in a visual divided attention task.

    Science.gov (United States)

    Toffanin, Paolo; de Jong, Ritske; Johnson, Addie; Martens, Sander

    2009-06-01

    Frequency tagging is an EEG method based on the quantification of the steady-state visual evoked potential (SSVEP) elicited by stimuli which flicker with a distinctive frequency. Because the amplitude of the SSVEP is modulated by attention such that attended stimuli elicit higher SSVEP amplitudes than do ignored stimuli, the method has been used to investigate the neural mechanisms of spatial attention. However, up to now it has not been shown whether the amplitude of the SSVEP is sensitive to gradations of attention, and there has been debate about whether attention effects on the SSVEP depend on the tagging frequency used. We thus compared attention effects on the SSVEP across three attention conditions (focused, divided, and ignored) with six different tagging frequencies. Participants performed a visual detection task (respond to the digit 5 embedded in a stream of characters). Two stimulus streams, one to the left and one to the right of fixation, were displayed simultaneously, each with a background grey square whose hue was sine-modulated with one of the six tagging frequencies. At the beginning of each trial a cue indicated whether targets on the left, right, or both sides should be responded to. Accuracy was higher in the focused- than in the divided-attention condition. SSVEP amplitudes were greatest in the focused-attention condition, intermediate in the divided-attention condition, and smallest in the ignored-attention condition. The effect of attention on SSVEP amplitude did not depend on the tagging frequency used. Frequency tagging appears to be a flexible technique for studying attention.
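
    A minimal sketch of how an SSVEP amplitude at a tagging frequency can be read off a spectrum; the windowing, epoch length and synthetic 10 Hz signal are illustrative choices, not the study's analysis pipeline.

    ```python
    import numpy as np

    def ssvep_amplitude(eeg, fs, tag_freq):
        """Spectral amplitude at the tagging frequency for one EEG channel/epoch."""
        n = len(eeg)
        spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) * 2.0 / n
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        return spectrum[np.argmin(np.abs(freqs - tag_freq))]

    # Example: a 10 Hz tagged response embedded in noise, 2 s epoch sampled at 256 Hz
    fs, tag = 256, 10.0
    t = np.arange(0, 2.0, 1.0 / fs)
    eeg = 1.5 * np.sin(2 * np.pi * tag * t) + np.random.randn(t.size)
    print(ssvep_amplitude(eeg, fs, tag))
    ```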

  17. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    Science.gov (United States)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-01-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…

  18. Correlations between Motor Symptoms across Different Motor Tasks, Quantified via Random Forest Feature Classification in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Andreas Kuhner

    2017-11-01

    Full Text Available Background Objective assessments of Parkinson's disease (PD) patients' motor state using motion capture techniques are still rarely used in clinical practice, even though they may improve clinical management. One major obstacle relates to the large dimensionality of motor abnormalities in PD. We aimed to extract global motor performance measures covering different everyday motor tasks, as a function of a clinical intervention, i.e., deep brain stimulation (DBS) of the subthalamic nucleus. Methods We followed a data-driven, machine-learning approach and propose performance measures that employ Random Forests with probability distributions. We applied this method to 14 PD patients with DBS switched off or on, and 26 healthy control subjects performing the Timed Up and Go Test (TUG), the Functional Reach Test (FRT), a hand coordination task, walking 10 m straight, and a 90° curve. Results For each motor task, a Random Forest identified a specific set of metrics that optimally separated PD off DBS from healthy subjects. We noted the highest accuracy (94.6%) for standing up. This corresponded to a sensitivity of 91.5% to detect a PD patient off DBS, and a specificity of 97.2% representing the rate of correctly identified healthy subjects. We then calculated performance measures based on these sets of metrics and applied those results to characterize symptom severity in different motor tasks. Task-specific symptom severity measures correlated significantly with each other and with the Unified Parkinson's Disease Rating Scale (UPDRS, part III; correlation of r² = 0.79). Agreement rates between different measures ranged from 79.8 to 89.3%. Conclusion The close correlation of PD patients' various motor abnormalities quantified by different, task-specific severity measures suggests that these abnormalities are only facets of the underlying one-dimensional severity of motor deficits. The identification and characterization of this underlying motor deficit
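
    A hedged sketch of the core classification step (a Random Forest producing out-of-fold class probabilities, from which sensitivity and specificity are computed); the feature values are synthetic placeholders shaped like the study's 14-versus-26 design, and the pipeline details are not the authors'.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_predict

    # Placeholder per-subject movement metrics: 14 PD (DBS off) + 26 healthy controls
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(1.0, 1.0, size=(14, 8)),    # hypothetical PD-off-DBS feature rows
                   rng.normal(0.0, 1.0, size=(26, 8))])   # hypothetical healthy-control rows
    y = np.array([1] * 14 + [0] * 26)                     # 1 = PD off DBS, 0 = healthy

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    # Out-of-fold class probabilities, i.e. a Random Forest "with probability distributions"
    proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    pred = (proba >= 0.5).astype(int)

    sensitivity = (pred[y == 1] == 1).mean()   # rate of correctly detected PD-off-DBS subjects
    specificity = (pred[y == 0] == 0).mean()   # rate of correctly identified healthy subjects
    print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
    ```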

  19. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    OpenAIRE

    Devi, D. Chitra; Uthariaraj, V. Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs in order to share resources effectively. The scheduling of nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint, and hence they have to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arriving jobs consist of multiple interdependent tasks, and they may execute the independent tasks in multiple VMs or in the same VM's mul...

  20. Mental workload during n-back task-quantified in the prefrontal cortex using fNIRS.

    Science.gov (United States)

    Herff, Christian; Heger, Dominic; Fortmann, Ole; Hennrich, Johannes; Putze, Felix; Schultz, Tanja

    2013-01-01

    When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g., interacting with the car navigation system while driving) it is desired to not distract the users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, a robust single trial classification is a crucial prerequisite for the purpose of dynamically adapting HCIs to the workload of its user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two, or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS, can be used to robustly quantify and classify mental workload. Single trial analysis is still a young field that suffers from a general lack of standards. To increase comparability of fNIRS methods and results, the data corpus for this study is made available online.
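
    A minimal sketch of single-trial, three-class workload discrimination from per-trial fNIRS features (this record is the journal version of record 20 above); the features are synthetic, and the classifier choice (LDA) and cross-validation scheme are assumptions, not the paper's exact pipeline.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for per-trial fNIRS features (e.g. mean HbO/HbR per PFC channel);
    # labels 0/1/2 correspond to 1-, 2- and 3-back workload levels.
    rng = np.random.default_rng(1)
    n_trials_per_level, n_features = 60, 16
    X = np.vstack([rng.normal(level * 0.5, 1.0, size=(n_trials_per_level, n_features))
                   for level in range(3)])
    y = np.repeat([0, 1, 2], n_trials_per_level)

    clf = LinearDiscriminantAnalysis()
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"cross-validated 3-class accuracy: {acc:.2f}")
    ```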

  1. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Full Text Available Quantitative exposure data is important for evaluating toxicity risk, and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject's true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics, and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm, which calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein binding. Sensitivity analysis identified that both protein binding and pKa (for weak acids and bases) have a significant impact on determining partitioning, and that there are species-dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance of xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human

  2. TME (Task Mapping Editor): tool for executing distributed parallel computing. TME user's manual

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro; Imamura, Toshiyuki

    2000-03-01

    At the Center for Promotion of Computational Science and Engineering, a software environment PPExe has been developed to support scientific computing on a parallel computer cluster (distributed parallel scientific computing). TME (Task Mapping Editor) is one of the components of PPExe and provides a visual programming environment for distributed parallel scientific computing. Users can specify data dependence among tasks (programs) visually as a data flow diagram and map these tasks onto computers interactively through the GUI of TME. The specified tasks are processed by other components of PPExe such as the Meta-scheduler, RIM (Resource Information Monitor), and EMS (Execution Management System) according to the execution order of these tasks determined by TME. In this report, we describe the usage of TME. (author)

  3. A computational model of the fetal circulation to quantify blood redistribution in intrauterine growth restriction.

    Directory of Open Access Journals (Sweden)

    Patricia Garcia-Canadilla

    2014-06-01

    Full Text Available Intrauterine growth restriction (IUGR) due to placental insufficiency is associated with blood flow redistribution in order to maintain delivery of oxygenated blood to the brain. Given that, in the fetus, the aortic isthmus (AoI) is a key arterial connection between the cerebral and placental circulations, quantifying AoI blood flow has been proposed to assess this brain-sparing effect in clinical practice. While numerous clinical studies have studied this parameter, fundamental understanding of its determinant factors and its quantitative relation with other aspects of haemodynamic remodeling has been limited. Computational models of the cardiovascular circulation have been proposed for exactly this purpose, since they allow both for studying the contributions of isolated parameters and for estimating properties that cannot be directly assessed from clinical measurements. Therefore, a computational model of the fetal circulation was developed, including the key elements related to fetal blood redistribution and using measured cardiac outflow profiles to allow personalization. The model was first calibrated using patient-specific Doppler data from a healthy fetus. Next, in order to understand the contributions of the main parameters determining blood redistribution, AoI and middle cerebral artery (MCA) flow changes were studied by variation of cerebral and peripheral-placental resistances. Finally, to study how this affects an individual fetus, the model was fitted to three IUGR cases with different degrees of severity. In conclusion, the proposed computational model provides a good approximation to assess blood flow changes in the fetal circulation. The results support that while MCA flow is mainly determined by a fall in brain resistance, the AoI is influenced by a balance between increased peripheral-placental and decreased cerebral resistances. Personalizing the model allows for quantifying the balance between cerebral and peripheral
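
    As a toy illustration of the resistance-balance argument (not the lumped-parameter model of the paper), a simple parallel-resistance split shows how lowering cerebral resistance and raising peripheral-placental resistance shifts flow toward the brain; units and values are arbitrary.

    ```python
    def flow_split(cardiac_output, r_cerebral, r_peripheral_placental):
        """Split a combined ventricular output between two parallel resistive branches."""
        g_c = 1.0 / r_cerebral
        g_p = 1.0 / r_peripheral_placental
        q_cerebral = cardiac_output * g_c / (g_c + g_p)
        return q_cerebral, cardiac_output - q_cerebral

    print(flow_split(100.0, r_cerebral=4.0, r_peripheral_placental=1.0))   # normal-like split
    print(flow_split(100.0, r_cerebral=2.0, r_peripheral_placental=2.0))   # IUGR-like redistribution
    ```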

  4. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    Science.gov (United States)

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  5. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    Science.gov (United States)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model and its fitness function are established; the improved differential evolution algorithm then optimizes this fitness function, using a generation-dependent dynamic selection strategy and a dynamic mutation strategy to ensure both global and local search ability. A performance test was carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce the execution time of cloud computing tasks and save user cost, achieving a good implementation of the optimal scheduling of cloud computing tasks.
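
    A hedged sketch of differential evolution applied to task-to-VM assignment with makespan as the fitness; the standard DE/rand/1/bin scheme is used here, whereas the paper's dynamic selection and non-uniform mutation strategies are not reproduced.

    ```python
    import numpy as np

    def de_schedule(exec_time, pop_size=30, generations=200, F=0.5, CR=0.9, seed=0):
        """Toy DE for assigning tasks to VMs, minimizing makespan.
        Continuous vectors are decoded to VM indices by rounding."""
        rng = np.random.default_rng(seed)
        n_tasks, n_vms = exec_time.shape

        def makespan(vector):
            assign = np.clip(np.rint(vector), 0, n_vms - 1).astype(int)
            loads = np.zeros(n_vms)
            np.add.at(loads, assign, exec_time[np.arange(n_tasks), assign])
            return loads.max()

        pop = rng.uniform(0, n_vms - 1, size=(pop_size, n_tasks))
        fitness = np.array([makespan(ind) for ind in pop])
        for _ in range(generations):
            for i in range(pop_size):
                a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
                mutant = pop[a] + F * (pop[b] - pop[c])            # DE/rand/1 mutation
                cross = rng.random(n_tasks) < CR                   # binomial crossover
                cross[rng.integers(n_tasks)] = True
                trial = np.where(cross, mutant, pop[i])
                f_trial = makespan(trial)
                if f_trial <= fitness[i]:                          # greedy selection
                    pop[i], fitness[i] = trial, f_trial
        best = int(np.argmin(fitness))
        return np.clip(np.rint(pop[best]), 0, n_vms - 1).astype(int), float(fitness[best])

    # 8 tasks on 3 VMs with random execution times (seconds, illustrative)
    times = np.random.default_rng(1).uniform(1, 10, size=(8, 3))
    print(de_schedule(times))
    ```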

  6. Quantifying the visual appearance of sunscreens applied to the skin using indirect computer image colorimetry.

    Science.gov (United States)

    Richer, Vincent; Kharazmi, Pegah; Lee, Tim K; Kalia, Sunil; Lui, Harvey

    2018-03-01

    There is no accepted method to objectively assess the visual appearance of sunscreens on the skin. We present a method of sunscreen application, digital photography, and computer analysis to quantify the appearance of the skin after sunscreen application. Four sunscreen lotions were applied randomly at densities of 0.5, 1.0, 1.5, and 2.0 mg/cm² to areas of the back of 29 subjects. Each application site had a matched contralateral control area. High-resolution standardized photographs including a color card were taken after sunscreen application. After color balance correction, CIE L*a*b* color values were extracted from paired sites. Differences in skin appearance attributed to sunscreen were represented by ΔE, which in turn was calculated as the linear Euclidean distance between the paired sites within the L*a*b* color space. Sunscreen visibility as measured by median ΔE varied across different products and application densities and ranged between 1.2 and 12.1. The visibility of sunscreens varied according to product SPF, composition (organic vs inorganic), presence of tint, and baseline b* of skin (P …). Indirect computer image colorimetry represents a potential method to objectively quantify the visibility of sunscreen on the skin. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
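
    ΔE as described here is the CIE76 colour difference, i.e. the Euclidean distance between two L*a*b* triples; a minimal sketch with invented values for a sunscreen site and its contralateral control:

    ```python
    import math

    def delta_e_cie76(lab1, lab2):
        """CIE76 colour difference: Euclidean distance between two CIE L*a*b* triples."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

    # Sunscreen site vs. matched contralateral control site (illustrative L*, a*, b* values)
    print(delta_e_cie76((62.1, 10.3, 15.8), (58.4, 12.0, 19.2)))
    ```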

  7. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein binding. Sensitivity analysis of key model parameters specifically identified that both protein binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species-dependent differences based upon physiological variance between
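
    The modified Schmitt algorithm itself is not reproduced in the abstract; as a simpler, classical point of comparison, the pH-partition (Henderson-Hasselbalch) estimate of the saliva:plasma ratio below uses only pKa, the two pH values and the unbound fractions, with illustrative inputs.

    ```python
    def saliva_to_plasma_ratio(pka, ph_saliva, ph_plasma, fu_plasma, fu_saliva, weak_acid=True):
        """Classical pH-partition estimate of the saliva:plasma concentration ratio for a
        passively diffusing weak acid or base (only the unionized, unbound fraction crosses)."""
        if weak_acid:
            num = 1.0 + 10.0 ** (ph_saliva - pka)
            den = 1.0 + 10.0 ** (ph_plasma - pka)
        else:  # weak base
            num = 1.0 + 10.0 ** (pka - ph_saliva)
            den = 1.0 + 10.0 ** (pka - ph_plasma)
        return (num / den) * (fu_plasma / fu_saliva)

    # Example: a weak acid (pKa 3.5), saliva pH 6.5, plasma pH 7.4, 5% unbound in plasma,
    # assumed fully unbound in saliva (illustrative numbers only)
    print(saliva_to_plasma_ratio(3.5, 6.5, 7.4, fu_plasma=0.05, fu_saliva=1.0))
    ```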

  8. Computer-mediated communication: task performance and satisfaction.

    Science.gov (United States)

    Simon, Andrew F

    2006-06-01

    The author assessed satisfaction and performance on 3 tasks (idea generation, intellective, judgment) among 75 dyads (N = 150) working through 1 of 3 modes of communication (instant messaging, videoconferencing, face to face). The author based predictions on the Media Naturalness Theory (N. Kock, 2001, 2002) and on findings from past researchers (e.g., D. M. DeRosa, C. Smith, & D. A. Hantula, in press) of the interaction between tasks and media. The present author did not identify task performance differences, although satisfaction with the medium was lower among those dyads communicating through an instant-messaging system than among those interacting face to face or through videoconferencing. The findings support the Media Naturalness Theory. The author discussed them in relation to the participants' frequent use of instant messaging and their familiarity with new communication media.

  9. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505

  10. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
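
    Of the six heuristics compared in the two records above, Min-min is representative and simple to state: repeatedly compute, for every unscheduled task, its earliest possible completion time on each VM, then commit the task whose best completion time is smallest. The Python sketch below assumes independent tasks and a known execution-time matrix (the matrix values are illustrative).

      def min_min_schedule(exec_time):
          # Min-min heuristic. exec_time[i][j] = run time of task i on VM j.
          # Returns (assignment, makespan), where assignment[i] is the VM chosen for task i.
          n_tasks, n_vms = len(exec_time), len(exec_time[0])
          ready = [0.0] * n_vms                 # time at which each VM becomes free
          assignment = [None] * n_tasks
          unscheduled = set(range(n_tasks))
          while unscheduled:
              # Pick the (task, VM) pair with the smallest completion time overall.
              completion, task, vm = min((ready[j] + exec_time[i][j], i, j)
                                         for i in unscheduled for j in range(n_vms))
              assignment[task] = vm
              ready[vm] = completion
              unscheduled.remove(task)
          return assignment, max(ready)

      # Illustrative 4-task x 2-VM execution-time matrix.
      times = [[3, 5], [2, 4], [6, 1], [4, 4]]
      print(min_min_schedule(times))   # ([0, 0, 1, 1], 5.0)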

  11. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  12. Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment

    CERN Document Server

    Grigoryeva, Maria; The ATLAS collaboration

    2018-01-01

    Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as LHC, RHIC, and KEK. As the accelerators are modernized (with energy and luminosity increased), data volumes grow rapidly and have reached the exabyte scale, which also increases the number of analysis and data processing tasks competing continuously for computational resources. This growth in processing tasks is met by expanding the computing environment with high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors occur while executing data analysis and processing tasks, caused by software and hardware failures. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...

  13. The ability of non-computer tasks to increase biomechanical exposure variability in computer-intensive office work.

    Science.gov (United States)

    Barbieri, Dechristian França; Srinivasan, Divya; Mathiassen, Svend Erik; Nogueira, Helen Cristina; Oliveira, Ana Beatriz

    2015-01-01

    Postures and muscle activity in the upper body were recorded from 50 academic office workers during 2 hours of normal work, categorised by observation into computer work (CW) and three non-computer (NC) tasks (NC seated work, NC standing/walking work and breaks). NC tasks differed significantly in exposures from CW, with standing/walking NC tasks representing the largest contrasts for most of the exposure variables. For the majority of workers, exposure variability was larger in their present job than in CW alone, as measured by the job variance ratio (JVR), i.e. the ratio between min-min variabilities in the job and in CW. Calculations of JVRs for simulated jobs containing different proportions of CW showed that variability could, indeed, be increased by redistributing available tasks, but that substantial increases could only be achieved by introducing more vigorous tasks into the job, in this case illustrated by cleaning.

  14. A comparative study of 2 computer-assisted methods of quantifying brightfield microscopy images.

    Science.gov (United States)

    Tse, George H; Marson, Lorna P

    2013-10-01

    Immunohistochemistry continues to be a powerful tool for the detection of antigens. There are several commercially available software packages that allow image analysis; however, these can be complex, require a relatively high level of computer skills, and can be expensive. We compared 2 commonly available software packages, Adobe Photoshop CS6 and ImageJ, in their ability to quantify percentage positive area after picrosirius red (PSR) staining and 3,3'-diaminobenzidine (DAB) staining. On analysis of DAB-stained B cells in the mouse spleen, with a biotinylated primary rat anti-mouse-B220 antibody, there was no significant difference between converting brightfield microscopy images to binary images to measure black and white pixels using ImageJ and measuring a range of brown pixels with Photoshop (Student t test, P=0.243, correlation r=0.985). When analyzing mouse kidney allografts stained with PSR, Photoshop achieved a greater interquartile range while maintaining a lower 10th percentile value compared with analysis with ImageJ. A lower 10th percentile reflects that Photoshop analysis is better at analyzing tissues with low levels of positive pixels, which is particularly relevant for control tissues or negative controls; after ImageJ analysis the same images would result in spuriously high levels of positivity. Furthermore, comparing the 2 methods by Bland-Altman plot revealed that these 2 methodologies did not agree when measuring images with a higher percentage of positive staining, and correlation was poor (r=0.804). We conclude that for computer-assisted analysis of images of DAB-stained tissue there is no difference between using Photoshop or ImageJ. However, for analysis of color images where differentiation into a binary pattern is not easy, such as with PSR, Photoshop is superior at identifying higher levels of positivity while maintaining differentiation of low levels of positive staining.
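
    Both workflows compared above ultimately reduce to counting pixels classified as positive and dividing by the total pixel count. The sketch below illustrates the binary-threshold route (closer to the ImageJ workflow than to Photoshop's brown-pixel range selection) on a synthetic greyscale array; the threshold value is an assumption, not taken from the study.

      import numpy as np

      def percent_positive_area(gray_image, threshold=120):
          # Percentage of pixels at or below `threshold` in an 8-bit greyscale image,
          # treating darker (stained) pixels as positive.
          positive = gray_image <= threshold
          return 100.0 * positive.sum() / positive.size

      # Synthetic 'image': a 100x100 field with a darker stained patch.
      img = np.full((100, 100), 200, dtype=np.uint8)
      img[20:50, 30:70] = 80
      print(round(percent_positive_area(img), 1))   # 12.0 (% positive area)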

  15. Ergonomic assessment for the task of repairing computers in a manufacturing company: A case study.

    Science.gov (United States)

    Maldonado-Macías, Aidé; Realyvásquez, Arturo; Hernández, Juan Luis; García-Alcaraz, Jorge

    2015-01-01

    Manufacturing industry workers who repair computers may be exposed to ergonomic risk factors. This project analyzes the tasks involved in the computer repair process to (1) find the risk level for musculoskeletal disorders (MSDs) and (2) propose ergonomic interventions to address any ergonomic issues. Work procedures and main body postures were video recorded and analyzed using task analysis, the Rapid Entire Body Assessment (REBA) postural method, and biomechanical analysis. High risk for MSDs was found in every subtask using REBA. Although biomechanical analysis found an acceptable mass center displacement during tasks, a hazardous level of compression on the lower back while transporting computers was detected. This assessment found ergonomic risks mainly in the trunk, arm/forearm, and legs; the neck and hand/wrist were also compromised. Opportunities for ergonomic analyses and interventions in the design and execution of computer repair tasks are discussed.

  16. Quantifying multiscale porosity and fracture aperture distribution in granite cores using computed tomography

    Science.gov (United States)

    Wenning, Quinn; Madonna, Claudio; Joss, Lisa; Pini, Ronny

    2017-04-01

    Knowledge of porosity and fracture (aperture) distribution is key towards a sound description of fluid transport in low-permeability rocks. In the context of geothermal energy development, the ability to quantify the transport properties of fractures is needed to in turn quantify the rate of heat transfer, and, accordingly, to optimize the engineering design of the operation. In this context, core-flooding experiments coupled with non-invasive imaging techniques (e.g., X-Ray Computed Tomography - X-Ray CT) represent a powerful tool for making direct observations of these properties under representative geologic conditions. This study focuses on quantifying porosity and fracture aperture distribution in a fractured westerly granite core by using two recently developed experimental protocols. The latter include the use of a highly attenuating gas [Vega et al., 2014] and the application of the so-called missing CT attenuation method [Huo et al., 2016] to produce multidimensional maps of the pore space and of the fractures. Prior to the imaging experiments, the westerly granite core (diameter: 5 cm, length: 10 cm) was thermally shocked to induce micro-fractured pore space; this was followed by the application of the so-called Brazilian method to induce a macroscopic fracture along the length of the core. The sample was then mounted in a high-pressure aluminum core-holder, exposed to a confining pressure and placed inside a medical CT scanner for imaging. An initial compressive pressure cycle was performed to remove weak asperities and reduce the hysteretic behavior of the fracture with respect to effective pressure. The CT scans were acquired at room temperature and 0.5, 5, 7, and 10 MPa effective pressure under loading and unloading conditions. During scanning the pore fluid pressure was undrained and constant, and the confining pressure was regulated at the desired pressure with a high precision pump. Highly transmissible krypton and helium gases were used as

  17. SU-F-J-178: A Computer Simulation Model Observer for Task-Based Image Quality Assessment in Radiation Therapy

    International Nuclear Information System (INIS)

    Dolly, S; Mutic, S; Anastasio, M; Li, H; Yu, L

    2016-01-01

    Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation

  18. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
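
    Fitting the two distributions reported above (log-normal for vulnerability arrivals, exponential for deletions) is straightforward with SciPy's maximum-likelihood fitters; the synthetic samples below merely stand in for real arrival and deletion counts.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      arrivals = rng.lognormal(mean=2.0, sigma=0.7, size=500)    # stand-in arrival counts
      deletions = rng.exponential(scale=5.0, size=500)           # stand-in deletion counts

      # Maximum-likelihood fits; fitted parameters could later serve as Bayesian priors.
      ln_shape, ln_loc, ln_scale = stats.lognorm.fit(arrivals, floc=0)
      exp_loc, exp_scale = stats.expon.fit(deletions, floc=0)
      print("log-normal sigma, scale:", round(ln_shape, 3), round(ln_scale, 3))
      print("exponential mean:", round(exp_scale, 3))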

  19. Effects of four types of non-obtrusive feedback on computer behaviour, task performance and comfort

    NARCIS (Netherlands)

    Korte, E.M.; Huijsmans, M.A.; de Jong, A.M.; van de Ven, J.G.M.; Ruijsendaal, M.

    2012-01-01

    This study investigated the effects of non-obtrusive feedback on continuous lifted hand/finger behaviour, task performance and comfort. In an experiment with 24 participants the effects of two visual and two tactile feedback signals were compared to a no-feedback condition in a computer task.

  20. Research program in computational physics: [Progress report for Task D

    International Nuclear Information System (INIS)

    Guralnik, G.S.

    1987-01-01

    Studies are reported of several aspects of the purely gluonic sector of QCD, including methods for efficiently generating gauge configurations, properties of the standard Wilson action and improved actions, and properties of the pure glue theory itself. Simulation of quantum chromodynamics in the "quenched approximation", in which the back reaction of quarks upon gauge fields is neglected, is studied with fermions introduced on the lattice via both Wilson and staggered formulations. Efforts are also reported to compute QCD matrix elements and to simulate QCD theory beyond the quenched approximation, considering the effect of the quarks on the gauge fields. Work is in progress toward improving the algorithms used to generate the gauge field configurations and to compute the quark propagators. Implementation of lattice QCD on a hypercube is also reported.

  1. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with an adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and the non-adaptive genetic algorithm in terms of solution quality.
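
    The defining feature of AGA is the adaptive choice among several crossover and mutation operators; the record does not give the authors' update rule. The sketch below shows one common way such adaptation can be implemented (probability matching on recent operator success), with hypothetical operator names, as an illustration rather than the paper's algorithm.

      import random

      class AdaptiveOperatorSelector:
          # Pick an operator with probability proportional to its recent success rate.
          def __init__(self, operators, min_prob=0.05):
              self.operators = list(operators)
              self.scores = {op: 1.0 for op in self.operators}   # optimistic initial credit
              self.min_prob = min_prob

          def select(self):
              total = sum(self.scores.values())
              weights = [max(self.scores[op] / total, self.min_prob) for op in self.operators]
              return random.choices(self.operators, weights=weights, k=1)[0]

          def reward(self, op, improved):
              # Exponential moving average of whether the operator improved the offspring.
              self.scores[op] = 0.9 * self.scores[op] + 0.1 * (1.0 if improved else 0.0)

      # Hypothetical operator labels; in a GA loop, call select() before applying an
      # operator and reward() after comparing offspring fitness with the parents'.
      selector = AdaptiveOperatorSelector(["one_point_xover", "two_point_xover", "swap_mutation"])
      op = selector.select()
      selector.reward(op, improved=True)
      print(op, selector.scores)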

  2. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    Science.gov (United States)

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image-interpretive task (NIT) and image-interpretive task (IIT) workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording the frequency and duration of tasks performed. Tasks were categorized into separate IIT and NIT workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant (CA) responsible for NITs. Pre- and post-intervention data were compared. Following separation of IIT and NIT workflows, time spent on IITs by the primary fellow increased from 53.8% to 73.2%, while time spent on NITs decreased from 20.4% to 4.4%. Mean duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific NITs, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The CA experienced 29.4 task switching events (TSEs) per hour. Rates of specific NITs for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.
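
    The quantities reported above (percentage of observed time per task category and events per hour) follow directly from a time-and-motion log of (category, start, end) records. A small sketch with made-up observations:

      from collections import defaultdict

      # Hypothetical time-and-motion log: (task category, start minute, end minute).
      log = [("image interpretation", 0, 12), ("phone call", 12, 14),
             ("image interpretation", 14, 30), ("protocoling", 30, 33)]

      durations = defaultdict(float)
      for category, start, end in log:
          durations[category] += end - start
      total = sum(durations.values())

      for category, minutes in durations.items():
          print(f"{category}: {100 * minutes / total:.1f}% of observed time")

      # Events per hour for one non-image-interpretive category.
      phone_events = sum(1 for c, _, _ in log if c == "phone call")
      print("phone calls/hr:", round(phone_events / (total / 60), 2))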

  3. Quantifying narrative ability in autism spectrum disorder: a computational linguistic analysis of narrative coherence.

    Science.gov (United States)

    Losh, Molly; Gordon, Peter C

    2014-12-01

    Autism is a neurodevelopmental disorder characterized by serious difficulties with the social use of language, along with impaired social functioning and ritualistic/repetitive behaviors (American Psychiatric Association in Diagnostic and statistical manual of mental disorders: DSM-5, 5th edn. American Psychiatric Association, Arlington, 2013). While substantial heterogeneity exists in symptom expression, impairments in language discourse skills, including narrative (or storytelling), are universally observed in autism (Tager-Flusberg et al. in Handbook on autism and pervasive developmental disorders, 3rd edn. Wiley, New York, pp 335-364, 2005). This study applied a computational linguistic tool, Latent Semantic Analysis (LSA), to objectively characterize narrative performance in high-functioning individuals with autism and typically-developing controls, across two different narrative contexts that differ in the interpersonal and cognitive demands placed on the narrator. Results indicated that high-functioning individuals with autism produced narratives comparable in semantic content to those produced by controls when narrating from a picture book, but produced narratives diminished in semantic quality in a more demanding narrative recall task. This pattern is similar to that detected from analyses of hand-coded picture book narratives in prior research, and extends findings to an additional narrative context that proves particularly challenging for individuals with autism. Results are discussed in terms of the utility of LSA as a quantitative, objective, and efficient measure of narrative ability.
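
    Latent Semantic Analysis, as used above, projects texts into a low-dimensional semantic space and compares them by cosine similarity. The sketch below uses scikit-learn with toy narratives and an arbitrary number of components; it is not the study's preprocessing pipeline.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      narratives = [
          "the boy looked for his frog in the woods and called out for it",
          "the boy searched the forest and shouted for the missing frog",
          "the weather report predicted heavy rain for the weekend",
      ]

      tfidf = TfidfVectorizer(stop_words="english").fit_transform(narratives)
      lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

      # Semantic similarity of each narrative to the first one; the first two
      # should score closer to each other than either does to the third.
      print(cosine_similarity(lsa[:1], lsa).round(2))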

  4. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

    Full Text Available Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, given response-time guarantees from the servers, to decide which tasks should be offloaded to get the results in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the error made by the algorithm against its complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.

  5. Effect of aging on performance, muscle activation and perceived stress during mentally demanding computer tasks

    DEFF Research Database (Denmark)

    Alkjaer, Tine; Pilegaard, Marianne; Bakke, Merete

    2005-01-01

    OBJECTIVES: This study examined the effects of age on performance, muscle activation, and perceived stress during computer tasks with different levels of mental demand. METHODS: Fifteen young and thirteen elderly women performed two computer tasks [color word test and reference task] with different...... levels of mental demand but similar physical demands. The performance (clicking frequency, percentage of correct answers, and response time for correct answers) and electromyography from the forearm, shoulder, and neck muscles were recorded. Visual analogue scales were used to measure the participants......' perception of the stress and difficulty related to the tasks. RESULTS: Performance decreased significantly in both groups during the color word test in comparison with performance on the reference task. However, the performance reduction was more pronounced in the elderly group than in the young group...

  6. Pixel-Level Deep Segmentation: Artificial Intelligence Quantifies Muscle on Computed Tomography for Body Morphometric Analysis.

    Science.gov (United States)

    Lee, Hyunkwang; Troschel, Fabian M; Tajmir, Shahein; Fuchs, Georg; Mario, Julia; Fintelmann, Florian J; Do, Synho

    2017-08-01

    Pretreatment risk stratification is key for personalized medicine. While many physicians rely on an "eyeball test" to assess whether patients will tolerate major surgery or chemotherapy, "eyeballing" is inherently subjective and difficult to quantify. The concept of morphometric age derived from cross-sectional imaging has been found to correlate well with outcomes such as length of stay, morbidity, and mortality. However, the determination of the morphometric age is time intensive and requires highly trained experts. In this study, we propose a fully automated deep learning system for the segmentation of skeletal muscle cross-sectional area (CSA) on an axial computed tomography image taken at the third lumbar vertebra. We utilized a fully automated deep segmentation model derived from an extended implementation of a fully convolutional network with weight initialization of an ImageNet pre-trained model, followed by post processing to eliminate intramuscular fat for a more accurate analysis. This experiment was conducted by varying window level (WL), window width (WW), and bit resolutions in order to better understand the effects of the parameters on the model performance. Our best model, fine-tuned on 250 training images and ground truth labels, achieves 0.93 ± 0.02 Dice similarity coefficient (DSC) and 3.68 ± 2.29% difference between predicted and ground truth muscle CSA on 150 held-out test cases. Ultimately, the fully automated segmentation system can be embedded into the clinical environment to accelerate the quantification of muscle and expanded to volume analysis of 3D datasets.
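
    The Dice similarity coefficient reported above is defined as 2|A∩B|/(|A|+|B|) for the predicted and ground-truth masks. A minimal NumPy sketch:

      import numpy as np

      def dice_coefficient(pred, truth):
          # Dice similarity coefficient between two boolean segmentation masks.
          pred, truth = pred.astype(bool), truth.astype(bool)
          intersection = np.logical_and(pred, truth).sum()
          denom = pred.sum() + truth.sum()
          return 2.0 * intersection / denom if denom else 1.0

      # Illustrative 1D masks standing in for flattened muscle segmentations.
      pred = np.array([1, 1, 1, 0, 0, 1])
      truth = np.array([1, 1, 0, 0, 1, 1])
      print(round(dice_coefficient(pred, truth), 3))   # 0.75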

  7. Use of Computer-Aided Tomography (CT) Imaging for Quantifying Coarse Roots, Rhizomes, Peat, and Particle Densities in Marsh Soils

    Science.gov (United States)

    Computer-aided Tomography (CT) imaging was utilized to quantify wet mass of coarse roots, rhizomes, and peat in cores collected from organic-rich (Jamaica Bay, NY) and mineral (North Inlet, SC) Spartina alterniflora soils. Calibration rods composed of materials with standard dens...

  8. Quantifying Human Performance of a Dynamic Military Target Detection Task: An Application of the Theory of Signal Detection.

    Science.gov (United States)

    1995-06-01

    (Abstract available only in fragments.) The report applies the theory of signal detection, which has been used to analyze numerous experimental tasks (Macmillan and Creelman, 1991), to a dynamic military target detection task. The distance between each associated pair of false-alarm-rate and hit-rate z-scores gives d' for the bias level associated with the pairing, including the case of unequal variance in normal distributions (Macmillan and Creelman, 1991); the approach is described in detail by Green and Swets (1966).
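
    The core computation in this record is the sensitivity index d', the difference between the z-transformed hit rate and false-alarm rate in the equal-variance case. A short sketch with SciPy and illustrative detection counts (the correction that keeps rates away from 0 and 1 is an assumption, not taken from the thesis):

      from scipy.stats import norm

      def d_prime(hits, misses, false_alarms, correct_rejections):
          # Equal-variance signal-detection sensitivity index d' = z(H) - z(F).
          # A small correction keeps the rates away from 0 and 1.
          hit_rate = (hits + 0.5) / (hits + misses + 1.0)
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      # Illustrative target-detection counts.
      print(round(d_prime(hits=40, misses=10, false_alarms=8, correct_rejections=42), 3))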

  9. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    Science.gov (United States)

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.

  10. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks.

    Science.gov (United States)

    Devi, D Chitra; Uthariaraj, V Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irreversible constraint, and hence such tasks have to be assigned to the most appropriate VMs at the initial placement itself. In practice, arriving jobs consist of multiple interdependent tasks, and the independent tasks may execute on multiple VMs or on multiple cores of the same VM. Also, jobs arrive during the run time of the server at varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources through static or dynamic scheduling, which makes cloud computing more efficient and thus improves user satisfaction. The objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. The performance of the proposed algorithm is studied by comparison with the existing methods.

  11. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    Directory of Open Access Journals (Sweden)

    D. Chitra Devi

    2016-01-01

    Full Text Available Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irreversible constraint, and hence such tasks have to be assigned to the most appropriate VMs at the initial placement itself. In practice, arriving jobs consist of multiple interdependent tasks, and the independent tasks may execute on multiple VMs or on multiple cores of the same VM. Also, jobs arrive during the run time of the server at varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources through static or dynamic scheduling, which makes cloud computing more efficient and thus improves user satisfaction. The objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. The performance of the proposed algorithm is studied by comparison with the existing methods.
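
    A plain weighted round-robin dispatcher, the baseline that the improved algorithm above builds on, hands incoming tasks to VMs in proportion to their weights (for example, their processing capacity). The sketch below shows only that baseline under the simplifying assumption of integer weights; the paper's improvements (awareness of task length and interdependency) are not reproduced.

      from itertools import cycle

      def weighted_round_robin(vm_weights):
          # Yield VM ids in proportion to their integer weights, e.g. {"vm1": 3, "vm2": 1}.
          expanded = [vm for vm, w in vm_weights.items() for _ in range(w)]
          return cycle(expanded)

      dispatcher = weighted_round_robin({"vm1": 3, "vm2": 1})
      placement = {f"task{i}": next(dispatcher) for i in range(8)}
      print(placement)   # vm1 receives three tasks for every one sent to vm2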

  12. Beyond Behavioral Inhibition: A Computer Avatar Task Designed to Assess Behavioral Inhibition Extends to Harm Avoidance

    Directory of Open Access Journals (Sweden)

    Michael Todd Allen

    2017-09-01

    Full Text Available Personality factors such as behavioral inhibition (BI), a temperamental tendency for avoidance in the face of unfamiliar situations, have been identified as risk factors for anxiety disorders. Personality factors are generally identified through self-report inventories. However, this tendency to avoid may affect the accuracy of these self-report inventories. Previously, a computer-based task was developed in which the participant guides an on-screen “avatar” through a series of on-screen events; performance on the task could accurately predict participants’ BI, measured by a standard paper and pencil questionnaire (the Adult Measure of Behavioral Inhibition, or AMBI). Here, we sought to replicate this finding as well as compare performance on the avatar task to another measure related to BI, the harm avoidance (HA) scale of the Tridimensional Personality Questionnaire (TPQ). The TPQ includes HA scales as well as scales assessing reward dependence (RD), novelty seeking (NS), and persistence. One hundred and one undergraduates voluntarily completed the avatar task and the paper and pencil inventories in a counter-balanced order. Scores on the avatar task were strongly correlated with BI assessed via the AMBI questionnaire, which replicates prior findings. Females exhibited higher HA scores than males, but did not differ on scores on the avatar task. There was a strong positive relationship between scores on the avatar task and HA scores. One aspect of HA, fear of uncertainty, was found to moderately mediate the relationship between AMBI scores and avatar scores. NS had a strong negative relationship with scores on the avatar task, but there was no significant relationship between RD and scores on the avatar task. These findings indicate the effectiveness of the avatar task as a behavioral alternative to self-report measures to assess avoidance. In addition, the use of computer-based behavioral tasks is a viable alternative to paper and pencil self

  13. Task-and-role-based access-control model for computational grid

    Institute of Scientific and Technical Information of China (English)

    LONG Tao; HONG Fan; WU Chi; SUN Ling-li

    2007-01-01

    Access control in a grid environment is a challenging issue because the heterogeneous nature and independent administration of geographically dispersed resources in grid require access control to use fine-grained policies. We established a task-and-role-based access-control model for computational grid (CG-TRBAC model), integrating the concepts of role-based access control (RBAC) and task-based access control (TBAC). In this model, condition restrictions are defined and concepts specifically tailored to Workflow Management System are simplified or omitted so that role assignment and security administration fit computational grid better than traditional models; permissions are mutable with the task status and system variables, and can be dynamically controlled. The CG-TRBAC model is proved flexible and extendible. It can implement different control policies. It embodies the security principle of least privilege and executes active dynamic authorization. A task attribute can be extended to satisfy different requirements in a real grid system.

  14. Optimizing the Number of Cooperating Terminals for Energy Aware Task Computing in Wireless Networks

    DEFF Research Database (Denmark)

    Olsen, Anders Brødløs; Fitzek, Frank H. P.; Koch, Peter

    2005-01-01

    It is generally accepted that energy consumption is a significant design constraint for mobile handheld systems; methods that optimize energy consumption and make better use of the restricted battery resources are therefore clearly motivated. A novel concept of distributed task computing was previously proposed (D2VS), based on the overall idea of selective distribution of tasks among terminals. In this paper the optimal number of terminals for cooperative task computing in a wireless network is investigated. The paper presents an energy model for the proposed scheme. The energy consumption of the terminals with respect to their workload and the overhead of distributing tasks among terminals are taken into account. The paper shows that the number of cooperating terminals is in general limited to a few, though varying with the various system parameters.

  15. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task

    Science.gov (United States)

    Revechkis, Boris; Aflalo, Tyson NS; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A.

    2014-12-01

    Objective. To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. Approach. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like ‘Face in a Crowd’ task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the ‘Crowd’) using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a ‘Crowd Off’ condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Main results. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Significance. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet

  16. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task.

    Science.gov (United States)

    Revechkis, Boris; Aflalo, Tyson N S; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A

    2014-12-01

    To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like 'Face in a Crowd' task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the 'Crowd') using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a 'Crowd Off' condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet computers.

  17. The importance of task appropriateness in computer-supported collaborative learning

    Directory of Open Access Journals (Sweden)

    Kathy Buckner

    1999-12-01

    Full Text Available The study of learning in collaborative electronic environments is becoming established as Computer Supported Collaborative Learning (CSCL) - an emergent sub-discipline of the more established Computer Supported Co-operative Work (CSCW) discipline (Webb, 1995). Using computers for the development of shared understanding through collaboration has been explored by Crook, who suggests that success may depend partly on having a clearly specified purpose or goal (Crook, 1994). It is our view that the appropriateness of the task given to the student is central to the success or otherwise of the learning experience. However, the tasks that are given to facilitate collaborative learning in face-to-face situations are not always suitable for direct transfer to the electronic medium. It may be necessary to consider redesigning these tasks in relation to the medium in which they are to be undertaken and the functionality of the electronic conferencing software used.

  18. Is Neural Activity Detected by ERP-Based Brain-Computer Interfaces Task Specific?

    Directory of Open Access Journals (Sweden)

    Markus A Wenzel

    Full Text Available Brain-computer interfaces (BCIs) that are based on event-related potentials (ERPs) can estimate to which stimulus a user pays particular attention. In typical BCIs, the user silently counts the selected stimulus (which is repeatedly presented among other stimuli) in order to focus the attention. The stimulus of interest is then inferred from the electroencephalogram (EEG). Detecting attention allocation implicitly could also be beneficial for human-computer interaction (HCI), because it would allow software to adapt to the user's interest. However, a counting task would be inappropriate for the envisaged implicit application in HCI. Therefore, the question was addressed of whether the detectable neural activity is specific to silent counting, or whether it can also be evoked by other tasks that direct the attention to certain stimuli. Thirteen people performed a silent counting, an arithmetic and a memory task. The tasks required the subjects to pay particular attention to target stimuli of a random color. The stimulus presentation was the same in all three tasks, which allowed a direct comparison of the experimental conditions. Classifiers that were trained to detect the targets in one task, according to patterns present in the EEG signal, could detect targets in all other tasks (irrespective of some task-related differences in the EEG). The neural activity detected by the classifiers is not strictly task specific but can be generalized over tasks and is presumably a result of the attention allocation or of the augmented workload. The results may hold promise for the transfer of classification algorithms from BCI research to implicit relevance detection in HCI.

  19. Enhanced computational methods for quantifying the effect of geographic and environmental isolation on genetic differentiation

    NARCIS (Netherlands)

    Botta, Filippo; Eriksen, Casper; Fontaine, Michael Christophe; Guillot, Gilles

    2015-01-01

    In a recent paper, Bradburd et al. (2013) proposed a model to quantify the relative effect of geographic and environmental distance on genetic differentiation. Here, we enhance this method in several ways. 1. We modify the covariance model so as to fit better with mainstream geostatistical models and
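
    The covariance model referred to above lets genetic covariance decay with both geographic and environmental distance; the enhanced formulation is not given in this truncated record. The sketch below is only a generic covariance of that flavour, K = a0*exp(-(aD*Dgeo + aE*Denv)) plus a diagonal nugget, with made-up parameter values.

      import numpy as np

      def genetic_covariance(d_geo, d_env, a0=0.2, a_d=0.5, a_e=1.5, nugget=0.01):
          # Covariance decaying with weighted geographic and environmental distances.
          return a0 * np.exp(-(a_d * d_geo + a_e * d_env)) + nugget * np.eye(d_geo.shape[0])

      # Pairwise distance matrices for three hypothetical sampling sites.
      d_geo = np.array([[0.0, 1.0, 2.0], [1.0, 0.0, 1.5], [2.0, 1.5, 0.0]])
      d_env = np.array([[0.0, 0.2, 0.8], [0.2, 0.0, 0.6], [0.8, 0.6, 0.0]])
      print(genetic_covariance(d_geo, d_env).round(3))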

  20. The employment of a spoken language computer applied to an air traffic control task.

    Science.gov (United States)

    Laveson, J. I.; Silver, C. A.

    1972-01-01

    Assessment of the merits of a limited spoken language (56 words) computer in a simulated air traffic control (ATC) task. An airport zone approximately 60 miles in diameter with a traffic flow simulation ranging from single-engine to commercial jet aircraft provided the workload for the controllers. This research determined that, under the circumstances of the experiments carried out, the use of a spoken-language computer would not improve the controller performance.

  1. Visualizing stressful aspects of repetitive motion tasks and opportunities for ergonomic improvements using computer vision.

    Science.gov (United States)

    Greene, Runyu L; Azari, David P; Hu, Yu Hen; Radwin, Robert G

    2017-11-01

    Patterns of physical stress exposure are often difficult to measure, and the metrics of variation and techniques for identifying them are underdeveloped in the practice of occupational ergonomics. Computer vision has previously been used for evaluating repetitive motion tasks for hand activity level (HAL) utilizing conventional 2D videos. The approach was made practical by relaxing the need for high precision, and by adopting a semi-automatic approach for measuring spatiotemporal characteristics of the repetitive task. In this paper, a new method for visualizing task factors, using this computer vision approach, is demonstrated. After videos are made, the analyst selects a region of interest on the hand to track, and the hand location and its associated kinematics are measured for every frame. The visualization method spatially deconstructs and displays the frequency, speed and duty cycle components of tasks that are part of the threshold limit value for hand activity, for the purpose of identifying patterns of exposure associated with specific job factors, as well as for suggesting task improvements. The localized variables are plotted as a heat map superimposed over the video, and displayed in the context of the task being performed. Based on the intensity of the specific variables used to calculate HAL, we can determine which task factors contribute most to HAL, and readily identify those work elements in the task that contribute most to increased risk of injury. Work simulations and actual industrial examples are described. This method should help practitioners more readily measure and interpret temporal exposure patterns and identify potential task improvements. Copyright © 2017. Published by Elsevier Ltd.
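
    Two of the localized components mentioned above, hand speed and duty cycle, follow directly from the tracked hand positions: speed is the frame-to-frame displacement divided by the frame interval, and duty cycle is the fraction of frames in which the hand moves faster than a chosen exertion threshold. The sketch below uses a synthetic position trace and an illustrative threshold; it is not the authors' HAL computation.

      import numpy as np

      def speed_and_duty_cycle(positions, fps=30.0, speed_threshold=50.0):
          # positions: (n_frames, 2) array of tracked hand pixel coordinates.
          # Returns mean speed (pixels/s) and the fraction of frames above the threshold.
          displacements = np.linalg.norm(np.diff(positions, axis=0), axis=1)
          speeds = displacements * fps
          return speeds.mean(), float(np.mean(speeds > speed_threshold))

      # Synthetic random-walk trace standing in for tracked hand coordinates.
      rng = np.random.default_rng(1)
      trace = np.cumsum(rng.normal(0.0, [3.0, 0.2], size=(300, 2)), axis=0)
      mean_speed, duty_cycle = speed_and_duty_cycle(trace)
      print(round(mean_speed, 1), round(duty_cycle, 2))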

  2. Nonoccurrence of Negotiation of Meaning in Task-Based Synchronous Computer-Mediated Communication

    Science.gov (United States)

    Van Der Zwaard, Rose; Bannink, Anne

    2016-01-01

    This empirical study investigated the occurrence of meaning negotiation in an interactive synchronous computer-mediated second language (L2) environment. Sixteen dyads (N = 32) consisting of nonnative speakers (NNSs) and native speakers (NSs) of English performed 2 different tasks using videoconferencing and written chat. The data were coded and…

  3. Autonomic Modulation in Duchenne Muscular Dystrophy during a Computer Task: A Prospective Control Trial.

    Directory of Open Access Journals (Sweden)

    Mayra Priscila Boscolo Alvarez

    Full Text Available Duchenne Muscular Dystrophy (DMD) is characterized by progressive muscle weakness that can lead to disability. Owing to functional difficulties faced by individuals with DMD, the use of assistive technology is essential to provide or facilitate functional abilities. In DMD, cardiac autonomic dysfunction has been reported in addition to musculoskeletal impairment. Consequently, the objective was to investigate acute cardiac autonomic responses, by Heart Rate Variability (HRV), during computer tasks in subjects with DMD. HRV was assessed by linear and nonlinear methods, using the Polar RS800CX heart rate monitor, a chest-strap electrocardiographic measuring device. Then, 45 subjects were included in the group with DMD and 45 in the healthy Typical Development (TD) control group. They were assessed for twenty minutes seated at rest and for five minutes while undergoing a task on the computer. Individuals with DMD had a statistically significantly lower parasympathetic cardiac modulation at rest when compared to the control group, which further declined when undergoing the tasks on the computer. DMD patients presented decreased HRV and exhibited greater intensity of cardiac autonomic responses during computer tasks, characterized by vagal withdrawal, when compared to the healthy TD control subjects.
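
    Linear time-domain HRV indices such as SDNN and RMSSD (not necessarily the exact indices used in the study) are simple functions of the RR-interval series exported by devices like the one above. The sketch below uses illustrative intervals and is not the study's processing pipeline.

      import numpy as np

      def sdnn(rr_ms):
          # Standard deviation of normal-to-normal RR intervals, in ms.
          return float(np.std(rr_ms, ddof=1))

      def rmssd(rr_ms):
          # Root mean square of successive RR-interval differences, in ms.
          return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

      # Illustrative RR intervals in milliseconds.
      rr = np.array([812, 798, 825, 840, 810, 795, 805, 830], dtype=float)
      print(round(sdnn(rr), 1), round(rmssd(rr), 1))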

  4. From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task

    Science.gov (United States)

    Öman, Anne

    2017-01-01

    Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…

  5. The effect of dynamic workstations on the performance of various computer and office-based tasks

    NARCIS (Netherlands)

    Burford, E.M.; Botter, J.; Commissaris, D.; Könemann, R.; Hiemstra-Van Mastrigt, S.; Ellegast, R.P.

    2013-01-01

    The effect of different workstations, conventional and dynamic, on different types of performance measures for several different office and computer-based tasks was investigated in this research paper. The two dynamic workstations assessed were the Lifespan Treadmill Desk and the RightAngle

  6. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    Science.gov (United States)

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  7. Student Computer Use in Selected Undergraduate Agriculture Courses: An Examination of Required Tasks.

    Science.gov (United States)

    Johnson, Donald M.; Ferguson, James A.; Vokins, Nancy W.; Lester, Melissa L.

    2000-01-01

    Over 50% of faculty teaching undergraduate agriculture courses (n=58) required use of word processing, Internet, and electronic mail; less than 50% required spreadsheets, databases, graphics, or specialized software. They planned to maintain or increase required computer tasks in their courses. (SK)

  8. Optimization of Task Scheduling Algorithm through QoS Parameters for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Monika

    2016-01-01

    Full Text Available Cloud computing is an emerging technology that is spreading widely among researchers. It provides users with infrastructure, platform and software as services that are easily accessible via the web. A cloud is a kind of parallel and distributed system consisting of a collection of virtualized computers that are used to execute various tasks so as to achieve good execution time, meet deadlines and utilize its resources. The scheduling problem can be seen as finding an optimal assignment of tasks over the available set of resources so that the desired goals for the tasks can be achieved. This paper presents an optimal algorithm for scheduling tasks that takes their waiting time as a QoS parameter. The algorithm is simulated using the CloudSim simulator, and experiments are carried out to help clients identify the bottleneck of utilizing a number of virtual machines in parallel.

  9. Performance analysis of cloud computing services for many-tasks scientific computing

    NARCIS (Netherlands)

    Iosup, A.; Ostermann, S.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.

    2011-01-01

    Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike. Through the use of virtualization and resource time sharing, clouds serve with a single set of physical resources a

  10. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic feature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.

  11. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Directory of Open Access Journals (Sweden)

    Mohammed Abdullahi

    Full Text Available Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic feature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
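
    Two quantities entering the fitness function above, makespan and degree of imbalance, can be computed directly from a task-to-VM assignment; degree of imbalance is commonly taken as (Tmax - Tmin)/Tavg over per-VM completion times. The sketch below uses that common definition and an illustrative assignment; the exact weighting used in SASOS is not reproduced here.

      def makespan_and_imbalance(task_lengths, assignment, n_vms):
          # task_lengths[i] = length of task i on its assigned VM; assignment[i] = VM index.
          loads = [0.0] * n_vms
          for task, vm in enumerate(assignment):
              loads[vm] += task_lengths[task]
          avg = sum(loads) / n_vms
          degree_of_imbalance = (max(loads) - min(loads)) / avg if avg else 0.0
          return max(loads), degree_of_imbalance

      # Illustrative assignment of five tasks to three VMs.
      print(makespan_and_imbalance([4, 2, 6, 3, 5], [0, 1, 2, 0, 1], n_vms=3))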

  12. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation of a single dust storm event may take several hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with other geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor that may impact the feasibility of parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically, 1) In order to get optimized solutions, a
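
    A common baseline for the allocation problem sketched above is longest-processing-time (LPT) greedy assignment: sort subdomains by estimated computing cost and always give the next one to the currently least-loaded node. The sketch below shows that baseline only, with illustrative costs; it ignores the communication cost between adjacent subdomains that the presented algorithms also take into account.

      import heapq

      def lpt_allocate(subdomain_costs, n_nodes):
          # Greedy longest-processing-time allocation of subdomain costs to compute nodes.
          heap = [(0.0, node, []) for node in range(n_nodes)]   # (load, node id, subdomains)
          heapq.heapify(heap)
          for sub, cost in sorted(subdomain_costs.items(), key=lambda kv: -kv[1]):
              load, node, subs = heapq.heappop(heap)            # currently least-loaded node
              heapq.heappush(heap, (load + cost, node, subs + [sub]))
          return sorted(heap)

      # Illustrative per-subdomain cost estimates (e.g. proportional to grid cells).
      costs = {"d1": 8.0, "d2": 5.0, "d3": 5.0, "d4": 3.0, "d5": 2.0}
      for load, node, subs in lpt_allocate(costs, n_nodes=2):
          print(f"node {node}: load={load}, subdomains={subs}")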

  13. Task-induced frequency modulation features for brain-computer interfacing

    Science.gov (United States)

    Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz

    2017-10-01

    Objective. Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects’ intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects’ intents with an accuracy comparable to task-induced amplitude modulation. Approach. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. Main results. The frequency features alone do not significantly out-perform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing in healthy subjects the joint space significantly out-performs either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Significance. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.
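
    As an illustration of the two feature families discussed above, the sketch below computes a log band-power (amplitude-modulation) feature and a within-band spectral centroid (frequency-modulation) feature per channel for one EEG epoch, and stacks them into a joint feature vector. This is not the authors' pipeline; the band, sampling rate, and function names are assumptions.

```python
import numpy as np

def band_features(epoch, fs, band=(8.0, 13.0)):
    """Per-channel amplitude and frequency features for one EEG epoch.

    epoch : array (n_channels, n_samples)
    fs    : sampling rate in Hz
    band  : frequency band of interest (e.g., the mu/alpha band)
    """
    n = epoch.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2 / n
    mask = (freqs >= band[0]) & (freqs <= band[1])

    band_power = np.log(psd[:, mask].sum(axis=1))                  # amplitude-modulation feature
    # Spectral centroid inside the band as a simple frequency-modulation feature.
    centroid = (psd[:, mask] * freqs[mask]).sum(axis=1) / psd[:, mask].sum(axis=1)
    return band_power, centroid

# Joint feature space: concatenate amplitude and frequency features per trial.
rng = np.random.default_rng(0)
trial = rng.standard_normal((32, 500))          # 32 channels, 2 s at 250 Hz
amp, freq = band_features(trial, fs=250.0)
joint = np.concatenate([amp, freq])
print(joint.shape)                              # (64,)
```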

  14. Task-induced frequency modulation features for brain-computer interfacing.

    Science.gov (United States)

    Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz

    2017-10-01

    Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects' intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects' intents with an accuracy comparable to task-induced amplitude modulation. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. The frequency features alone do not significantly out-perform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing in healthy subjects the joint space significantly out-performs either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.

  15. A Hybrid Scheduler for Many Task Computing in Big Data Systems

    Directory of Open Access Journals (Sweden)

    Vasiliu Laura

    2017-06-01

    Full Text Available With the rapid evolution of the distributed computing world in the last few years, the amount of data created and processed has rapidly increased to the petabyte or even exabyte scale. Such huge data sets need data-intensive computing applications and impose performance requirements on the infrastructures that support them, such as high scalability, storage and fault tolerance, but also efficient scheduling algorithms. This paper focuses on providing a hybrid scheduling algorithm for many-task computing that addresses big data environments with few penalties, taking into consideration the deadlines and satisfying a data-dependent task model. The hybrid solution consists of several heuristics and algorithms (min-min, min-max and earliest deadline first) combined in order to provide a scheduling algorithm that matches our problem. The experiments are conducted by simulation and prove that the proposed hybrid algorithm behaves very well in terms of meeting deadlines.
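
    Of the heuristics combined in the hybrid scheduler, min-min is the most widely documented; a minimal sketch over an expected-time-to-compute (ETC) matrix is shown below. The deadline handling and the data-dependent task model of the paper are not reproduced here.

```python
import numpy as np

def min_min(etc):
    """Min-min heuristic on an expected-time-to-compute matrix.

    etc[i, j] = expected execution time of task i on machine j.
    Returns the task->machine assignment and the resulting makespan.
    """
    etc = np.asarray(etc, dtype=float)
    n_tasks, n_machines = etc.shape
    ready = np.zeros(n_machines)                 # time at which each machine becomes free
    unscheduled = set(range(n_tasks))
    assignment = np.empty(n_tasks, dtype=int)

    while unscheduled:
        best = None                              # (completion_time, task, machine)
        for t in unscheduled:
            completion = ready + etc[t]          # completion time of task t on every machine
            m = int(completion.argmin())
            if best is None or completion[m] < best[0]:
                best = (completion[m], t, m)
        _, t, m = best
        assignment[t] = m
        ready[m] += etc[t, m]
        unscheduled.remove(t)

    return assignment, ready.max()

etc = np.array([[14, 16, 9], [13, 19, 18], [11, 13, 19], [7, 8, 17], [20, 5, 6.0]])
print(min_min(etc))
```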

  16. Enhanced computational methods for quantifying the effect of geographic and environmental isolation on genetic differentiation

    DEFF Research Database (Denmark)

    Botta, Filippo; Eriksen, Casper; Fontaine, Michaël C.

    2015-01-01

    1. In a recent paper, Bradburd et al. (Evolution, 67, 2013, 3258) proposed a model to quantify the relative effect of geographic and environmental distance on genetic differentiation. Here, we enhance this method in several ways. 2. We modify the covariance model so as to fit better with mainstre...... available as an R package called sunder. It takes as input georeferenced allele counts at the individual or population level for co-dominant markers. Program homepage: http://www2.imm.dtu.dk/~gigu/Sunder/....

  17. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2010-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  18. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2009-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  19. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    Science.gov (United States)

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.; Blemker, Silvia S.

    2015-01-01

    Purpose: This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method: We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy…

  20. Impact of number of repeated scans on model observer performance for a low-contrast detection task in computed tomography.

    Science.gov (United States)

    Ma, Chi; Yu, Lifeng; Chen, Baiyu; Favazza, Christopher; Leng, Shuai; McCollough, Cynthia

    2016-04-01

    Channelized Hotelling observer (CHO) models have been shown to correlate well with human observers for several phantom-based detection/classification tasks in clinical computed tomography (CT). A large number of repeated scans were used to achieve an accurate estimate of the model's template. The purpose of this study is to investigate how the experimental and CHO model parameters affect the minimum required number of repeated scans. A phantom containing 21 low-contrast objects was scanned on a 128-slice CT scanner at three dose levels. Each scan was repeated 100 times. For each experimental configuration, the low-contrast detectability, quantified as the area under the receiver operating characteristic curve (AUC), was calculated using a previously validated CHO with randomly selected subsets of scans, ranging from 10 to 100. Using the AUC from the 100 scans as the reference, the accuracy from a smaller number of scans was determined. Our results demonstrated that the minimum number of repeated scans increased when the radiation dose level decreased, object size and contrast level decreased, and the number of channels increased. As a general trend, it increased as the low-contrast detectability decreased. This study provides a basis for the experimental design of task-based image quality assessment in clinical CT using CHO.
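
    A minimal sketch of a channelized Hotelling observer is given below to make the template-estimation step concrete: channel outputs are computed for signal-present and signal-absent ROIs, the Hotelling template is built from their mean difference and pooled channel covariance, and the AUC is estimated from the decision variables. The channel matrix is left as an input (e.g., Gabor or Laguerre-Gauss channels would be typical choices), and the resubstitution bias that motivates using many repeated scans is not addressed; all names are assumptions.

```python
import numpy as np

def cho_auc(signal_imgs, noise_imgs, channels):
    """Channelized Hotelling observer AUC from repeated-scan ROIs.

    signal_imgs, noise_imgs : arrays (n_images, n_pixels) of flattened ROIs
    channels                : channel matrix (n_pixels, n_channels)
    """
    vs = signal_imgs @ channels                  # channel outputs, signal present
    vn = noise_imgs @ channels                   # channel outputs, signal absent
    s = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
    w = np.linalg.solve(s, vs.mean(axis=0) - vn.mean(axis=0))   # Hotelling template
    ts, tn = vs @ w, vn @ w                      # decision variables
    # AUC via the Mann-Whitney U statistic (ties counted as 0.5).
    auc = (ts[:, None] > tn[None, :]).mean() + 0.5 * (ts[:, None] == tn[None, :]).mean()
    return auc

# Synthetic example: 8x8 ROIs, a uniform signal shift, placeholder random channels.
rng = np.random.default_rng(5)
channels = rng.standard_normal((64, 10))
noise = rng.standard_normal((100, 64))
signal = rng.standard_normal((100, 64)) + 0.3
print(cho_auc(signal, noise, channels))
```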

  1. 29 CFR 778.313 - Computing overtime pay under the Act for employees compensated on task basis.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Computing overtime pay under the Act for employees compensated on task basis. 778.313 Section 778.313 Labor Regulations Relating to Labor (Continued) WAGE AND... TO REGULATIONS OVERTIME COMPENSATION Special Problems “task” Basis of Payment § 778.313 Computing...

  2. Cloud computing task scheduling strategy based on differential evolution and ant colony optimization

    Science.gov (United States)

    Ge, Junwei; Cai, Yu; Fang, Yiqiu

    2018-05-01

    This paper proposes a task scheduling strategy, DEACO, based on the combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). Aiming at the problem that cloud computing task scheduling is usually driven by a single optimization objective, the strategy jointly considers the shortest task completion time, cost, and load balancing. DEACO uses the solution of DE to initialize the pheromone of ACO, reducing the time spent accumulating pheromone in the early stage of ACO, and improves the pheromone updating rule through a load factor. The proposed algorithm is simulated on CloudSim and compared with the min-min and ACO algorithms. The experimental results show that DEACO is superior in terms of time, cost, and load.
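
    The paper's exact pheromone initialization and update rules are not given in the abstract; the sketch below only illustrates the general idea of seeding the ACO pheromone matrix from a DE solution and of a simple load factor. All names and constants are assumptions.

```python
import numpy as np

def seed_pheromone(de_assignment, n_tasks, n_vms, base=0.1, boost=1.0):
    """Initialize the task-to-VM pheromone matrix from a DE solution.

    de_assignment[i] = VM chosen for task i by the differential-evolution stage.
    Entries on the DE solution start with extra pheromone, so early ants are biased
    toward it instead of spending iterations accumulating pheromone from scratch.
    """
    tau = np.full((n_tasks, n_vms), base)
    tau[np.arange(n_tasks), de_assignment] += boost
    return tau

def load_factor(vm_times):
    """Simple load factor: closer to 1 for more balanced VM completion times (illustrative)."""
    vm_times = np.asarray(vm_times, dtype=float)
    return vm_times.mean() / (vm_times.max() + 1e-12)

print(seed_pheromone(np.array([0, 2, 1, 2]), n_tasks=4, n_vms=3))
print(load_factor([12.0, 10.0, 11.5]))
```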

  3. Computer Aided Measurement Laser (CAML): technique to quantify post-mastectomy lymphoedema

    International Nuclear Information System (INIS)

    Trombetta, Chiara; Abundo, Paolo; Felici, Antonella; Ljoka, Concetta; Foti, Calogero; Cori, Sandro Di; Rosato, Nicola

    2012-01-01

    Lymphoedema can be a side effect of cancer treatment. Even though several methods for assessing lymphoedema are used in clinical practice, an objective quantification of lymphoedema has been problematic. The aim of the study was to determine the objectivity, reliability and repeatability of the computer aided measurement laser (CAML) technique. The CAML technique is based on computer aided design (CAD) methods and requires an infrared laser scanner. The limb is scanned and the information describing its size and shape allows the model to be designed using the CAD software. Objectivity and repeatability were first established using a phantom. Subsequently, a group of subjects presenting post-breast cancer lymphoedema was evaluated, using the contralateral limb as a control. Results confirmed that in clinical settings the CAML technique is easy to perform, rapid, and provides meaningful data for assessing lymphoedema. Future research will include a comparison of the upper limb CAML technique between healthy subjects and patients with known lymphoedema.

  4. Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System

    Science.gov (United States)

    2017-08-01

    related to the numerical structuring of a problem, such as cell size, domain extent, and system orientation. Depth of penetration of a threat into a... system in the simulation codes is tied to the domain structure, with coordinate axes aligned with cell edges. However, the position of the coordinate... physical systems are generally described by sets of equations involving continuous variables, such as time and position. Computational simulations

  5. A study on optimal task decomposition of networked parallel computing using PVM

    International Nuclear Information System (INIS)

    Seong, Kwan Jae; Kim, Han Gyoo

    1998-01-01

    A numerical study is performed to investigate the effect of task decomposition on networked parallel processes using Parallel Virtual Machine (PVM). In our study, a PVM program distributed over a network of workstations is used in solving a finite difference version of a one-dimensional heat equation, where the natural choice of PVM programming structure is the master-slave paradigm, with the aim of finding an optimal configuration resulting in the least computing time, including the communication overhead among machines. Given a set of PVM tasks comprised of one master and five slave programs, it is found that there exists a pseudo-optimal number of machines, which does not necessarily coincide with the number of tasks, that yields the best performance when the network is under light usage. Increasing the number of machines beyond this optimum does not improve computing performance, since the increase in communication overhead among the excess machines offsets the decrease in CPU time obtained by distributing the PVM tasks among them. However, when the network traffic is heavy, the results exhibit a more random characteristic that is explained by the random nature of the data transfer time.

  6. Software designs of image processing tasks with incremental refinement of computation.

    Science.gov (United States)

    Anastasia, Davide; Andreopoulos, Yiannis

    2010-08-01

    Software realizations of computationally demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycle budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent nonincremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest-based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
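
    A minimal sketch of bitplane-based incremental computation is shown below for a 1-D convolution: bitplanes of the input are processed from most to least significant, and each increment refines the running result, so early termination yields a reduced-precision output rather than no output at all. This only illustrates the principle; the proposed packing framework and 2-D kernels are not reproduced.

```python
import numpy as np

def incremental_convolution(signal, kernel, bits=8):
    """Yield progressively refined convolution results, most significant bitplane first.

    The exact result equals sum_b 2**b * conv(bitplane_b, kernel), so stopping early
    returns the output computed up to the precision of the bitplanes processed so far.
    """
    signal = np.asarray(signal, dtype=np.int64)
    partial = np.zeros(len(signal) + len(kernel) - 1, dtype=float)
    for b in range(bits - 1, -1, -1):            # MSB -> LSB
        plane = (signal >> b) & 1                # current bitplane of the input
        partial += (1 << b) * np.convolve(plane, kernel)
        yield b, partial.copy()                  # result refined after this increment

x = np.array([200, 15, 130, 7, 255])
k = np.array([0.25, 0.5, 0.25])
for bit, approx in incremental_convolution(x, k):
    pass                                         # each iteration improves the approximation
print(np.allclose(approx, np.convolve(x, k)))    # True: full precision after all bitplanes
```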

  7. Phylodynamics with Migration: A Computational Framework to Quantify Population Structure from Genomic Data.

    Science.gov (United States)

    Kühnert, Denise; Stadler, Tanja; Vaughan, Timothy G; Drummond, Alexei J

    2016-08-01

    When viruses spread, outbreaks can be spawned in previously unaffected regions. Depending on the time and mode of introduction, each regional outbreak can have its own epidemic dynamics. The migration and phylodynamic processes are often intertwined and need to be taken into account when analyzing temporally and spatially structured virus data. In this article, we present a fully probabilistic approach for the joint reconstruction of phylodynamic history in structured populations (such as geographic structure) based on a multitype birth-death process. This approach can be used to quantify the spread of a pathogen in a structured population. Changes in epidemic dynamics through time within subpopulations are incorporated through piecewise constant changes in transmission parameters. We analyze a global human influenza H3N2 virus data set from a geographically structured host population to demonstrate how seasonal dynamics can be inferred simultaneously with the phylogeny and migration process. Our results suggest that the main migration path among the northern, tropical, and southern regions represented in the sample analyzed here is the one leading from the tropics to the northern region. Furthermore, the time-dependent transmission dynamics between and within two HIV risk groups, heterosexuals and injecting drug users, in the Latvian HIV epidemic are investigated. Our analyses confirm that the Latvian HIV epidemic peaking around 2001 was mainly driven by the injecting drug user risk group. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  8. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in both quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA requires a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA database, HRA analyzers will be able to share and spread their analysis experience freely and smoothly, and thereby the quality of the HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  9. Thinking processes used by high-performing students in a computer programming task

    Directory of Open Access Journals (Sweden)

    Marietjie Havenga

    2011-07-01

    Full Text Available Computer programmers must be able to understand programming source code and write programs that execute complex tasks to solve real-world problems. This article is a transdisciplinary study at the intersection of computer programming, education and psychology. It outlines the role of mental processes in programming and indicates how successful thinking processes can support computer science students in writing correct and well-defined programs. A mixed methods approach was used to better understand the thinking activities and programming processes of participating students. Data collection involved both computer programs and students’ reflective thinking processes recorded in their journals. This enabled analysis of psychological dimensions of participants’ thinking processes and their problem-solving activities as they considered a programming problem. Findings indicate that the cognitive, reflective and psychological processes used by high-performing programmers contributed to their success in solving a complex programming problem. Based on the thinking processes of high performers, we propose a model of integrated thinking processes, which can support computer programming students. Keywords: Computer programming, education, mixed methods research, thinking processes. Disciplines: Computer programming, education, psychology

  10. An Adaptive Procedure for Task Scheduling Optimization in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Pham Phuoc Hung

    2015-01-01

    Full Text Available Nowadays, mobile cloud computing (MCC) has emerged as a new paradigm which enables offloading computation-intensive, resource-consuming tasks up to a powerful computing platform in cloud, leaving only simple jobs to the capacity-limited thin client devices such as smartphones, tablets, Apple’s iWatch, and Google Glass. However, it still faces many challenges due to inherent problems of thin clients, especially the slow processing and low network connectivity. So far, a number of research studies have been carried out, trying to eliminate these problems, yet few have been found efficient. In this paper, we present an enhanced architecture, taking advantage of collaboration of thin clients and conventional desktop or laptop computers, known as thick clients, particularly aiming at improving cloud access. Additionally, we introduce an innovative genetic approach for task scheduling such that the processing time is minimized, while considering network contention and cloud cost. Our simulation shows that the proposed approach is more cost-effective and achieves better performance compared with others.

  11. Exploring methodological frameworks for a mental task-based near-infrared spectroscopy brain-computer interface.

    Science.gov (United States)

    Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom

    2015-10-30

    Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.
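
    A minimal sketch of the pair-wise accuracy ranking (PWAR) idea is shown below, assuming pre-extracted NIRS feature vectors and using scikit-learn's LDA with cross-validation; the actual classifiers, feature extraction, and the WS-score formula used in the study are not reproduced, and all names are assumptions.

```python
import itertools
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def rank_task_pairs(features_by_task, cv=5):
    """Rank mental-task pairs by cross-validated pairwise classification accuracy.

    features_by_task : dict mapping task name -> array (n_trials, n_features)
    Returns pairs sorted from most to least separable.
    """
    scores = {}
    for a, b in itertools.combinations(features_by_task, 2):
        X = np.vstack([features_by_task[a], features_by_task[b]])
        y = np.r_[np.zeros(len(features_by_task[a])), np.ones(len(features_by_task[b]))]
        scores[(a, b)] = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=cv).mean()
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical NIRS features for three candidate mental tasks.
rng = np.random.default_rng(1)
tasks = {name: rng.standard_normal((40, 10)) + shift
         for name, shift in [("mental math", 0.5), ("word generation", 0.0), ("rest", -0.5)]}
print(rank_task_pairs(tasks)[0])                 # best pair and its mean accuracy
```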

  12. Computer-assisted diagnostic tool to quantify the pulmonary veins in sickle cell associated pulmonary hypertension

    Science.gov (United States)

    Jajamovich, Guido H.; Pamulapati, Vivek; Alam, Shoaib; Mehari, Alem; Kato, Gregory J.; Wood, Bradford J.; Linguraru, Marius George

    2012-03-01

    Pulmonary hypertension is a common cause of death among patients with sickle cell disease. This study investigates the use of pulmonary vein analysis to assist the diagnosis of pulmonary hypertension non-invasively with CT-Angiography images. The characterization of the pulmonary veins from CT presents two main challenges. Firstly, the number of pulmonary veins is unknown a priori, and secondly, the contrast material is degraded by the time it reaches the pulmonary veins, making the edges of these vessels appear faint. Each image is first denoised and a fast marching approach is used to segment the left atrium and pulmonary veins. Afterward, a geodesic active contour is employed to isolate the left atrium. A thinning technique is then used to extract the skeleton of the atrium and the veins. The locations of the pulmonary vein ostia are determined by the intersection of the skeleton and the contour of the atrium. The diameters of the pulmonary veins are measured in each vein at fixed distances from the corresponding ostium, and for each distance, the sum of the diameters of all the veins is computed. These indicators are shown to be significantly larger in sickle-cell patients with pulmonary hypertension as compared to controls (p-values < 0.01).

  13. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations

    Directory of Open Access Journals (Sweden)

    Andrea Stocco

    2018-04-01

    Full Text Available This article describes the data analyzed in the paper “Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model” (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  14. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    Science.gov (United States)

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  15. Service task partition and distribution in star topology computer grid subject to data security constraints

    Energy Technology Data Exchange (ETDEWEB)

    Xiang Yanping [Collaborative Autonomic Computing Laboratory, School of Computer Science, University of Electronic Science and Technology of China (China); Levitin, Gregory, E-mail: levitin@iec.co.il [Collaborative Autonomic Computing Laboratory, School of Computer Science, University of Electronic Science and Technology of China (China); Israel electric corporation, P. O. Box 10, Haifa 31000 (Israel)

    2011-11-15

    The paper considers grid computing systems in which the resource management systems (RMS) can divide service tasks into execution blocks (EBs) and send these blocks to different resources. In order to provide a desired level of service reliability the RMS can assign the same blocks to several independent resources for parallel execution. The data security is a crucial issue in distributed computing that affects the execution policy. By the optimal service task partition into the EBs and their distribution among resources, one can achieve the greatest possible service reliability and/or expected performance subject to data security constraints. The paper suggests an algorithm for solving this optimization problem. The algorithm is based on the universal generating function technique and on the evolutionary optimization approach. Illustrative examples are presented. - Highlights: > Grid service with star topology is considered. > An algorithm for evaluating service reliability and data security is presented. > A tradeoff between the service reliability and data security is analyzed. > A procedure for optimal service task partition and distribution is suggested.
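
    As a simple illustration of why replicating an execution block on several independent resources raises service reliability, the sketch below computes the probability that every block is completed by at least one of its assigned resources; the universal generating function technique and the optimization procedure of the paper are not reproduced, and the per-resource reliabilities are assumed independent.

```python
import numpy as np

def block_reliability(resource_reliabilities):
    """Probability that at least one of the independent resources completes the block."""
    r = np.asarray(resource_reliabilities, dtype=float)
    return 1.0 - np.prod(1.0 - r)

def service_reliability(assignment):
    """Service succeeds only if every execution block is completed by some resource.

    assignment : list of lists; assignment[k] holds the reliabilities of the
                 resources that were given the k-th execution block.
    """
    return float(np.prod([block_reliability(rs) for rs in assignment]))

# Two blocks: the first replicated on three resources, the second on two.
print(service_reliability([[0.9, 0.8, 0.7], [0.95, 0.9]]))
```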

  16. Service task partition and distribution in star topology computer grid subject to data security constraints

    International Nuclear Information System (INIS)

    Xiang Yanping; Levitin, Gregory

    2011-01-01

    The paper considers grid computing systems in which the resource management systems (RMS) can divide service tasks into execution blocks (EBs) and send these blocks to different resources. In order to provide a desired level of service reliability the RMS can assign the same blocks to several independent resources for parallel execution. The data security is a crucial issue in distributed computing that affects the execution policy. By the optimal service task partition into the EBs and their distribution among resources, one can achieve the greatest possible service reliability and/or expected performance subject to data security constraints. The paper suggests an algorithm for solving this optimization problem. The algorithm is based on the universal generating function technique and on the evolutionary optimization approach. Illustrative examples are presented. - Highlights: → Grid service with star topology is considered. → An algorithm for evaluating service reliability and data security is presented. → A tradeoff between the service reliability and data security is analyzed. → A procedure for optimal service task partition and distribution is suggested.

  17. A computer simulation approach to quantify the true area and true area compressibility modulus of biological membranes

    International Nuclear Information System (INIS)

    Chacón, Enrique; Tarazona, Pedro; Bresme, Fernando

    2015-01-01

    We present a new computational approach to quantify the area per lipid and the area compressibility modulus of biological membranes. Our method relies on the analysis of the membrane fluctuations using our recently introduced coupled undulatory (CU) mode [Tarazona et al., J. Chem. Phys. 139, 094902 (2013)], which provides excellent estimates of the bending modulus of model membranes. Unlike the projected area, widely used in computer simulations of membranes, the CU area is thermodynamically consistent. This new area definition makes it possible to accurately estimate the area of the undulating bilayer, and the area per lipid, by excluding any contributions related to the phospholipid protrusions. We find that the area per phospholipid and the area compressibility modulus feature a negligible dependence on system size, making it possible to compute them using truly small bilayers involving a few hundred lipids. The area compressibility modulus obtained from the analysis of the CU area fluctuations is fully consistent with the Hooke’s law route. Unlike existing methods, our approach relies on a single simulation, and no a priori knowledge of the bending modulus is required. We illustrate our method by analyzing 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine bilayers using the coarse grained MARTINI force field. The area per lipid and area compressibility modulus obtained with our method and the MARTINI force field are consistent with previous studies of these bilayers.
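
    For reference, the standard fluctuation relation K_A = k_B T ⟨A⟩ / Var(A) can be applied to an (undulation-corrected) area time series as sketched below; the units, synthetic data, and function name are assumptions, and this is not necessarily the authors' exact estimator.

```python
import numpy as np

K_B = 1.380649e-23          # Boltzmann constant, J/K

def area_compressibility(area_series_nm2, temperature=300.0):
    """Estimate K_A from fluctuations of the membrane area.

    Uses the fluctuation relation K_A = k_B * T * <A> / Var(A); the area time series
    is assumed to be in nm^2, so the result is returned in N/m (= J/m^2).
    """
    a = np.asarray(area_series_nm2) * 1e-18      # nm^2 -> m^2
    return K_B * temperature * a.mean() / a.var(ddof=1)

# Synthetic example: mean area 40 nm^2 with small thermal fluctuations.
rng = np.random.default_rng(2)
areas = 40.0 + 0.8 * rng.standard_normal(20000)
print(f"K_A ~ {area_compressibility(areas):.3f} N/m")
```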

  18. Microsyntactic Annotation of Corpora and its Use in Computational Linguistics Tasks

    Directory of Open Access Journals (Sweden)

    Iomdin Leonid

    2017-12-01

    Full Text Available Microsyntax is a linguistic discipline dealing with idiomatic elements whose important properties are strongly related to syntax. In a way, these elements may be viewed as transitional entities between the lexicon and the grammar, which explains why they are often underrepresented in both of these resource types: the lexicographer fails to see such elements as full-fledged lexical units, while the grammarian finds them too specific to justify the creation of individual well-developed rules. As a result, such elements are poorly covered by linguistic models used in advanced modern computational linguistic tasks like high-quality machine translation or deep semantic analysis. A possible way to mend the situation and improve the coverage and adequate treatment of microsyntactic units in linguistic resources is to develop corpora with microsyntactic annotation, closely linked to specially designed lexicons. The paper shows how this task is solved in the deeply annotated corpus of Russian, SynTagRus.

  19. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    Science.gov (United States)

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer assisted video and digital image analysis. In previous studies severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired and to the extent that survival in the field would be jeopardized, as fish would be swept downstream, or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures to determine time to effect during early lifestage development, and to understand the significance of these responses relative to survival of these vulnerable early lifestage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.

  20. Quantifying kinematics of purposeful movements to real, imagined, or absent functional objects: implications for modelling trajectories for robot-assisted ADL tasks.

    Science.gov (United States)

    Wisneski, Kimberly J; Johnson, Michelle J

    2007-03-23

    Robotic therapy is at the forefront of stroke rehabilitation. The Activities of Daily Living Exercise Robot (ADLER) was developed to improve carryover of gains after training by combining the benefits of Activities of Daily Living (ADL) training (motivation and functional task practice with real objects), with the benefits of robot mediated therapy (repeatability and reliability). In combining these two therapy techniques, we seek to develop a new model for trajectory generation that will support functional movements to real objects during robot training. We studied natural movements to real objects and report on how initial reaching movements are affected by real objects and how these movements deviate from the straight line paths predicted by the minimum jerk model, typically used to generate trajectories in robot training environments. We highlight key issues that need to be considered in modelling natural trajectories. Movement data was collected as eight normal subjects completed ADLs such as drinking and eating. Three conditions were considered: object absent, imagined, and present. This data was compared to predicted trajectories generated from implementing the minimum jerk model. The deviations in both the plane of the table (XY) and the sagittal plane of the torso (XZ) were examined for both reaches to a cup and to a spoon. Velocity profiles and curvature were also quantified for all trajectories. We hypothesized that movements performed with functional task constraints and objects would deviate from the minimum jerk trajectory model more than those performed under imaginary or object absent conditions. Trajectory deviations from the predicted minimum jerk model for these reaches were shown to depend on three variables: object presence, object orientation, and plane of movement. When subjects completed the cup reach their movements were more curved than for the spoon reach. The object present condition for the cup reach showed more curvature than in the object
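
    The minimum jerk model used as the baseline for comparison can be sketched as follows: position interpolates between start and goal along a straight line with the classic fifth-order polynomial time profile, giving a single-peaked, bell-shaped speed profile. This is a generic sketch, not the ADLER implementation; the example endpoints and duration are assumptions.

```python
import numpy as np

def minimum_jerk(start, goal, duration, n_points=100):
    """Straight-line minimum jerk trajectory between two 3-D points.

    Position follows x(t) = x0 + (xf - x0) * (10 s^3 - 15 s^4 + 6 s^5), with s = t / T,
    which yields the smooth bell-shaped speed profile typically used to drive
    robot-mediated reaching movements.
    """
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    s = np.linspace(0.0, 1.0, n_points)
    shape = 10 * s**3 - 15 * s**4 + 6 * s**5
    positions = start + np.outer(shape, goal - start)
    times = s * duration
    return times, positions

t, xyz = minimum_jerk(start=[0.0, 0.0, 0.0], goal=[0.25, 0.10, 0.05], duration=1.2)
speed = np.linalg.norm(np.diff(xyz, axis=0), axis=1) / np.diff(t)
print(xyz[0], xyz[-1], speed.max())              # endpoints reached; single-peaked speed
```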

  1. Quantifying kinematics of purposeful movements to real, imagined, or absent functional objects: Implications for modelling trajectories for robot-assisted ADL tasks**

    Directory of Open Access Journals (Sweden)

    Wisneski Kimberly J

    2007-03-01

    Full Text Available Abstract Background Robotic therapy is at the forefront of stroke rehabilitation. The Activities of Daily Living Exercise Robot (ADLER) was developed to improve carryover of gains after training by combining the benefits of Activities of Daily Living (ADL) training (motivation and functional task practice with real objects), with the benefits of robot mediated therapy (repeatability and reliability). In combining these two therapy techniques, we seek to develop a new model for trajectory generation that will support functional movements to real objects during robot training. We studied natural movements to real objects and report on how initial reaching movements are affected by real objects and how these movements deviate from the straight line paths predicted by the minimum jerk model, typically used to generate trajectories in robot training environments. We highlight key issues that need to be considered in modelling natural trajectories. Methods Movement data was collected as eight normal subjects completed ADLs such as drinking and eating. Three conditions were considered: object absent, imagined, and present. This data was compared to predicted trajectories generated from implementing the minimum jerk model. The deviations in both the plane of the table (XY) and the sagittal plane of the torso (XZ) were examined for both reaches to a cup and to a spoon. Velocity profiles and curvature were also quantified for all trajectories. Results We hypothesized that movements performed with functional task constraints and objects would deviate from the minimum jerk trajectory model more than those performed under imaginary or object absent conditions. Trajectory deviations from the predicted minimum jerk model for these reaches were shown to depend on three variables: object presence, object orientation, and plane of movement. When subjects completed the cup reach their movements were more curved than for the spoon reach. The object present condition for the cup

  2. Quantifying the distribution of paste-void spacing of hardened cement paste using X-ray computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Tae Sup, E-mail: taesup@yonsei.ac.kr [School of Civil and Environmental Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul, 120-749 (Korea, Republic of); Kim, Kwang Yeom, E-mail: kimky@kict.re.kr [Korea Institute of Construction Technology, 283 Goyangdae-ro, Ilsanseo-gu, Goyang, 411-712 (Korea, Republic of); Choo, Jinhyun, E-mail: jinhyun@stanford.edu [Department of Civil and Environmental Engineering, Stanford University, Stanford, CA 94305 (United States); Kang, Dong Hun, E-mail: timeriver@naver.com [School of Civil and Environmental Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul, 120-749 (Korea, Republic of)

    2012-11-15

    The distribution of paste-void spacing in cement-based materials is an important feature related to the freeze-thaw durability of these materials, but its reliable estimation remains an unresolved problem. Herein, we evaluate the capability of X-ray computed tomography (CT) for reliable quantification of the distribution of paste-void spacing. Using X-ray CT images of three mortar specimens having different air-entrainment characteristics, we calculate the distributions of paste-void spacing of the specimens by applying previously suggested methods for deriving the exact spacing of air-void systems. This methodology is assessed by comparing the 95th percentile of the cumulative distribution function of the paste-void spacing with spacing factors computed by applying the linear-traverse method to 3D air-void system and reconstructing equivalent air-void distribution in 3D. Results show that the distributions of equivalent void diameter and paste-void spacing follow lognormal and normal distributions, respectively, and the ratios between the 95th percentile paste-void spacing value and the spacing factors reside within the ranges reported by previous numerical studies. This experimental finding indicates that the distribution of paste-void spacing quantified using X-ray CT has the potential to be the basis for a statistical assessment of the freeze-thaw durability of cement-based materials. - Highlights: ► The paste-void spacing in 3D can be quantified by X-ray CT. ► The distribution of the paste-void spacing follows normal distribution. ► The spacing factor and 95th percentile of CDF of paste-void spacing are correlated.

  3. Relating UMLS semantic types and task-based ontology to computer-interpretable clinical practice guidelines.

    Science.gov (United States)

    Kumar, Anand; Ciccarese, Paolo; Quaglini, Silvana; Stefanelli, Mario; Caffi, Ezio; Boiocchi, Lorenzo

    2003-01-01

    Medical knowledge in clinical practice guideline (GL) texts is the source of task-based computer-interpretable clinical guideline models (CIGMs). We have used Unified Medical Language System (UMLS) semantic types (STs) to understand the percentage of GL text which belongs to a particular ST. We also use the UMLS semantic network together with the CIGM-specific ontology to derive a semantic meaning behind the GL text. In order to achieve this objective, we took nine GL texts from the National Guideline Clearinghouse (NGC) and marked up the text dealing with a particular ST. The STs we took into consideration were restricted taking into account the requirements of a task-based CIGM. We used DARPA Agent Markup Language and Ontology Inference Layer (DAML + OIL) to create the UMLS and CIGM specific semantic network. For the latter, as a bench test, we used the 1999 WHO-International Society of Hypertension Guidelines for the Management of Hypertension. We took into consideration the UMLS STs closest to the clinical tasks. The percentages of the GL text dealing with the ST "Health Care Activity" and its subtypes "Laboratory Procedure", "Diagnostic Procedure" and "Therapeutic or Preventive Procedure" were measured. The parts of text belonging to other STs or comments were separated. A mapping of terms belonging to other STs was done to the STs under "HCA" for representation in DAML + OIL. As a result, we found that the three STs under "HCA" were the predominant STs present in the GL text. In cases where the terms of related STs existed, they were mapped into one of the three STs. The DAML + OIL representation was able to describe the hierarchy in task-based CIGMs. To conclude, we understood that the three STs could be used to represent the semantic network of the task-based CIGMs. We identified some mapping operators which could be used for the mapping of other STs into these.

  4. A Computational Framework for Quantifying and Optimizing the Performance of Observational Networks in 4D-Var Data Assimilation

    Science.gov (United States)

    Cioaca, Alexandru

    A deep scientific understanding of complex physical systems, such as the atmosphere, can be achieved neither by direct measurements nor by numerical simulations alone. Data assimilation is a rigorous procedure to fuse information from a priori knowledge of the system state, the physical laws governing the evolution of the system, and real measurements, all with associated error statistics. Data assimilation produces best (a posteriori) estimates of model states and parameter values, and results in considerably improved computer simulations. The acquisition and use of observations in data assimilation raises several important scientific questions related to optimal sensor network design, quantification of data impact, pruning redundant data, and identifying the most beneficial additional observations. These questions originate in operational data assimilation practice, and have started to attract considerable interest in the recent past. This dissertation advances the state of knowledge in four dimensional variational (4D-Var) data assimilation by developing, implementing, and validating a novel computational framework for estimating observation impact and for optimizing sensor networks. The framework builds on the powerful methodologies of second-order adjoint modeling and the 4D-Var sensitivity equations. Efficient computational approaches for quantifying the observation impact include matrix free linear algebra algorithms and low-rank approximations of the sensitivities to observations. The sensor network configuration problem is formulated as a meta-optimization problem. Best values for parameters such as sensor location are obtained by optimizing a performance criterion, subject to the constraint posed by the 4D-Var optimization. Tractable computational solutions to this "optimization-constrained" optimization problem are provided. The results of this work can be directly applied to the deployment of intelligent sensors and adaptive observations, as well as

  5. Metaheuristic Based Scheduling Meta-Tasks in Distributed Heterogeneous Computing Systems

    Directory of Open Access Journals (Sweden)

    Hesam Izakian

    2009-07-01

    Full Text Available Scheduling is a key problem in distributed heterogeneous computing systems in order to benefit from the large computing capacity of such systems, and it is an NP-complete problem. In this paper, we present a metaheuristic technique, namely the Particle Swarm Optimization (PSO) algorithm, for this problem. PSO is a population-based search algorithm based on the simulation of the social behavior of bird flocking and fish schooling. Particles fly in the problem search space to find optimal or near-optimal solutions. The scheduler aims at minimizing the makespan, which is the time at which the latest task finishes. Experimental studies show that the proposed method is more efficient and surpasses previously reported PSO and GA approaches for this problem.
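
    A minimal PSO sketch for makespan minimization over an expected-time-to-compute (ETC) matrix is given below; continuous particle positions are decoded to machine indices by rounding, which is one common (assumed) discretization and not necessarily the encoding used in the paper. The inertia and acceleration constants are illustrative.

```python
import numpy as np

def pso_schedule(etc, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize makespan with a simple PSO; positions are decoded by rounding to machine ids.

    etc[i, j] = expected execution time of task i on machine j.
    """
    rng = np.random.default_rng(seed)
    n_tasks, n_machines = etc.shape

    def makespan(x):
        assign = np.clip(np.rint(x), 0, n_machines - 1).astype(int)
        loads = np.zeros(n_machines)
        np.add.at(loads, assign, etc[np.arange(n_tasks), assign])
        return loads.max()

    pos = rng.uniform(0, n_machines - 1, size=(n_particles, n_tasks))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([makespan(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, n_machines - 1)
        vals = np.array([makespan(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()

    return np.rint(gbest).astype(int), pbest_val.min()

etc = np.random.default_rng(3).uniform(5, 50, size=(20, 4))
print(pso_schedule(etc))
```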

  6. One Task, Divergent Solutions: High- versus Low-Status Sources and Social Comparison Guide Adaptation in a Computer-Supported Socio-Cognitive Conflict Task

    Science.gov (United States)

    Baumeister, Antonia E.; Engelmann, Tanja; Hesse, Friedrich W.

    2017-01-01

    This experimental study extends conflict elaboration theory (1) by revealing social influence dynamics for a knowledge-rich computer-supported socio-cognitive conflict task not investigated in the context of this theory before and (2) by showing the impact of individual differences in social comparison orientation. Students in two conditions…

  7. A Multilevel Modeling Approach to Examining Individual Differences in Skill Acquisition for a Computer-Based Task

    OpenAIRE

    Nair, Sankaran N.; Czaja, Sara J.; Sharit, Joseph

    2007-01-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50–80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performan...

  8. A novel task-oriented optimal design for P300-based brain-computer interfaces.

    Science.gov (United States)

    Zhou, Zongtan; Yin, Erwei; Liu, Yang; Jiang, Jun; Hu, Dewen

    2014-10-01

    Objective. The number of items of a P300-based brain-computer interface (BCI) should be adjustable in accordance with the requirements of the specific tasks. To address this issue, we propose a novel task-oriented optimal approach aimed at increasing the performance of general P300 BCIs with different numbers of items. Approach. First, we proposed a stimulus presentation with variable dimensions (VD) paradigm as a generalization of the conventional single-character (SC) and row-column (RC) stimulus paradigms. Furthermore, an embedding design approach was employed for any given number of items. Finally, based on the score-P model of each subject, the VD flash pattern was selected by a linear interpolation approach for a certain task. Main results. The results indicate that the optimal BCI design consistently outperforms the conventional approaches, i.e., the SC and RC paradigms. Specifically, there is significant improvement in the practical information transfer rate for a large number of items. Significance. The results suggest that the proposed optimal approach would provide useful guidance in the practical design of general P300-based BCIs.

  9. A method to quantify mechanobiologic forces during zebrafish cardiac development using 4-D light sheet imaging and computational modeling.

    Directory of Open Access Journals (Sweden)

    Vijay Vedula

    2017-10-01

    Full Text Available Blood flow and mechanical forces in the ventricle are implicated in cardiac development and trabeculation. However, the mechanisms of mechanotransduction remain elusive. This is due in part to the challenges associated with accurately quantifying mechanical forces in the developing heart. We present a novel computational framework to simulate cardiac hemodynamics in developing zebrafish embryos by coupling 4-D light sheet imaging with a stabilized finite element flow solver, and extract time-dependent mechanical stimuli data. We employ deformable image registration methods to segment the motion of the ventricle from high resolution 4-D light sheet image data. This results in a robust and efficient workflow, as segmentation need only be performed at one cardiac phase, while wall position in the other cardiac phases is found by image registration. Ventricular hemodynamics are then quantified by numerically solving the Navier-Stokes equations in the moving wall domain with our validated flow solver. We demonstrate the applicability of the workflow in wild type zebrafish and three treated fish types that disrupt trabeculation: (a) chemical treatment using AG1478, an ErbB2 signaling inhibitor that inhibits proliferation and differentiation of cardiac trabeculation; (b) injection of gata1a morpholino oligomer (gata1aMO) suppressing hematopoiesis and resulting in attenuated trabeculation; (c) the weak-atriumm58 mutant (wea) with inhibited atrial contraction leading to a highly undeveloped ventricle and poor cardiac function. Our simulations reveal elevated wall shear stress (WSS) in wild type and AG1478 compared to gata1aMO and wea. High oscillatory shear index (OSI) in the grooves between trabeculae, compared to lower values on the ridges, in the wild type suggests oscillatory forces as a possible regulatory mechanism of cardiac trabeculation development. The framework has broad applicability for future cardiac developmental studies focused on quantitatively
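
    To make the two wall-shear metrics concrete, the sketch below computes the time-averaged wall shear stress and the oscillatory shear index, OSI = 0.5 (1 − |∫τ dt| / ∫|τ| dt), for one wall point from a time series of shear vectors over a cardiac cycle; it only illustrates the standard definitions and is not part of the authors' finite element solver.

```python
import numpy as np

def wss_metrics(tau, dt):
    """Time-averaged wall shear stress and oscillatory shear index for one wall point.

    tau : array (n_timesteps, 3) of instantaneous wall shear stress vectors over a cycle
    dt  : time step size
    OSI is 0 for unidirectional shear and approaches 0.5 for purely oscillatory shear.
    """
    mean_vec = tau.sum(axis=0) * dt                         # vector integral over the cycle
    mag_int = np.linalg.norm(tau, axis=1).sum() * dt        # integral of the magnitude
    period = tau.shape[0] * dt
    tawss = mag_int / period
    osi = 0.5 * (1.0 - np.linalg.norm(mean_vec) / mag_int)
    return tawss, osi

# Example: shear that reverses direction during part of the cycle (elevated OSI).
t = np.linspace(0, 1, 200, endpoint=False)
tau = np.stack([np.sin(2 * np.pi * t), 0.2 * np.cos(2 * np.pi * t), np.zeros_like(t)], axis=1)
print(wss_metrics(tau, dt=t[1] - t[0]))
```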

  10. IMPORTANCE OF COMPUTER TECHNOLOGY IN REALIZATION OF CULTURAL AND EDUCATIONAL TASKS OF PRESCHOOL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Zvezdan Arsić

    2016-06-01

    Full Text Available The rapid scientific and technological development imposes numerous changes in all spheres of life and work. In such circumstances, a computer has become a part of all aspects of life: economy, education, free time, family. Since children in contemporary society increasingly acquire knowledge before the school age, the question is how to prepare them for the world in which we live, bearing in mind how significantly different it is from the world in which the previous generations grew up. The research was aimed at examining the attitudes of preschool teachers about the importance of computers in the realization of educational activities in preschool institutions. The study included 54 teachers from Kosovo and Metohija: Kosovska Mitrovica, Donja Gušterica and Ropotovo. The research results indicate that digital technology is a very important and a useful didactic tool in the realization of educational activities in preschool institutions and that preschool teachers have the required competence to implement the technology. However, they are not satisfied with the quality of their ICT education and training during their studies; they also feel that their institutions do not provide adequate working conditions for the use of computers in the realization of educational tasks.

  11. A novel computational approach of image analysis to quantify behavioural response to heat shock in Chironomus Ramosus larvae (Diptera: Chironomidae

    Directory of Open Access Journals (Sweden)

    Bimalendu B. Nath

    2015-07-01

    Full Text Available All living cells respond to temperature stress through coordinated cellular, biochemical and molecular events known as “heat shock response” and its genetic basis has been found to be evolutionarily conserved. Despite marked advances in stress research, this ubiquitous heat shock response has never been analysed quantitatively at the whole organismal level using behavioural correlates. We have investigated behavioural response to heat shock in a tropical midge Chironomus ramosus Chaudhuri, Das and Sublette. The filter-feeding aquatic Chironomus larvae exhibit characteristic undulatory movement. This innate pattern of movement was taken as a behavioural parameter in the present study. We have developed a novel computer-aided image analysis tool “Chiro” for the quantification of behavioural responses to heat shock. Behavioural responses were quantified by recording the number of undulations performed by each larva per unit time at a given ambient temperature. Quantitative analysis of undulation frequency was carried out and this innate behavioural pattern was found to be modulated as a function of ambient temperature. Midge larvae are known to be bioindicators of aquatic environments. Therefore, the “Chiro” technique can be tested using other potential biomonitoring organisms obtained from natural aquatic habitats using undulatory motion as a behavioural parameter.

  12. Motivation and engagement in computer-based learning tasks: investigating key contributing factors

    Directory of Open Access Journals (Sweden)

    Michela Ott, Mauro Tavella

    2010-04-01

    Full Text Available This paper, drawing on a research project concerning the educational use of digital mind games with primary school students, aims to contribute to the understanding of the main factors influencing student motivation during computer-based learning activities. It puts forward some ideas and experience-based reflections, starting from digital games, which are widely recognized as among the most promising ICT tools for enhancing student motivation. The project results suggest that genuine student engagement in learning activities is mainly related to the actual possession of the skills and cognitive capacities needed to perform the task. In this perspective, cognitive overload should be regarded as one of the main factors hindering student motivation and, consequently, should be avoided. Other elements, such as game attractiveness and experimental setting constraints, turned out to have a lesser effect on student motivation.

  13. Human factors with nonhumans - Factors that affect computer-task performance

    Science.gov (United States)

    Washburn, David A.

    1992-01-01

    There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.

  14. Independent tasks scheduling in cloud computing via improved estimation of distribution algorithm

    Science.gov (United States)

    Sun, Haisheng; Xu, Rui; Chen, Huaping

    2018-04-01

    To minimize makespan for scheduling independent tasks in cloud computing, an improved estimation of distribution algorithm (IEDA) is proposed in this paper. Because the problem is a multi-dimensional discrete one, an improved population-based incremental learning (PBIL) algorithm is applied, in which the parameter for each component is independent of the other components. To improve the performance of PBIL, on the one hand, an integer encoding scheme is used and the probability calculation of PBIL is improved by using the task average processing time; on the other hand, an effective adaptive learning rate function related to the number of iterations is constructed to trade off the exploration and exploitation of IEDA. In addition, enhanced Max-Min and Min-Min algorithms are introduced to form two initial individuals. In the proposed IEDA, an improved genetic algorithm (IGA) is applied to generate part of the initial population by evolving these two individuals, while the remaining initial individuals are generated at random. Finally, the sampling process is divided into two parts: sampling by the probabilistic model and by the IGA, respectively. The experimental results show that the proposed IEDA not only obtains better solutions but also converges faster.
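    A minimal sketch of the PBIL core described above with an iteration-dependent learning rate. The encoding (one categorical distribution over machines per task), the learning-rate form and the load model are simplifying assumptions for illustration, not the authors' exact formulation, and the Max-Min/Min-Min seeding and IGA sampling are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def pbil_schedule(proc_time, n_iter=200, pop=50, lr0=0.1):
    """proc_time[i, j]: processing time of task i on machine j."""
    n_tasks, n_machines = proc_time.shape
    # One independent probability vector per task (component independence in PBIL)
    prob = np.full((n_tasks, n_machines), 1.0 / n_machines)
    best, best_makespan = None, np.inf
    for it in range(n_iter):
        lr = lr0 * (1.0 + it / n_iter)  # assumed adaptive learning rate, grows with iterations
        # Sample a population of task-to-machine assignments from the probabilistic model
        samples = np.array([[rng.choice(n_machines, p=prob[i]) for i in range(n_tasks)]
                            for _ in range(pop)])
        makespans = np.array([
            max(proc_time[np.arange(n_tasks), s][s == m].sum() for m in range(n_machines))
            for s in samples])
        elite = samples[makespans.argmin()]
        if makespans.min() < best_makespan:
            best, best_makespan = elite.copy(), makespans.min()
        # Shift each task's distribution toward the elite individual's machine choice
        for i in range(n_tasks):
            prob[i] = (1 - lr) * prob[i] + lr * np.eye(n_machines)[elite[i]]
            prob[i] /= prob[i].sum()
    return best, best_makespan
```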

  15. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    Full Text Available The article discusses SWOT analysis, a widely used classical method for analysis, forecasting and decision-making in various economic problems. As is known, it is a qualitative, multicriteria comparison of the degree of Strength, Weakness, Opportunity and Threat for different kinds of risks, for forecasting market developments, and for assessing the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of different project management tasks - investment, innovation, marketing, development, design and bringing products to market, and so on. However, in practical competitive market and economic conditions there are various uncertainties, ambiguities and kinds of vagueness, which make the use of SWOT analysis in its classical form insufficiently justified and ineffective. In this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc for processing and operating on fuzzy input data are also given. Finally, considerations for the interpretation of the results are presented.
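    As a rough illustration of the fuzzy-set idea (not the Fuzicalc implementation), SWOT factor ratings can be expressed as triangular fuzzy numbers and aggregated with fuzzy arithmetic; the factor names, ratings and weights below are hypothetical:

```python
# A triangular fuzzy number is represented as (low, mode, high)
def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def tfn_scale(a, w):
    return (w * a[0], w * a[1], w * a[2])

def defuzzify(a):
    # Centroid of a triangular fuzzy number
    return sum(a) / 3.0

# Hypothetical expert ratings of "Strength" factors under uncertainty (scale 0-10), with weights
strength_factors = {"brand": ((6, 7, 9), 0.5), "distribution": ((3, 5, 6), 0.5)}

total = (0.0, 0.0, 0.0)
for rating, weight in strength_factors.values():
    total = tfn_add(total, tfn_scale(rating, weight))

print("Aggregated 'Strength' score:", total, "->", round(defuzzify(total), 2))
```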

  16. Positron computed tomography studies of cerebral metabolic responses to complex motor tasks

    International Nuclear Information System (INIS)

    Phelps, M.E.; Mazziotta, J.C.

    1984-01-01

    Human motor system organization was explored in 8 right-handed male subjects using ¹⁸F-fluorodeoxyglucose and positron computed tomography to measure cerebral glucose metabolism. Five subjects had triple studies (eyes closed) including: control (hold pen in right hand without moving), normal size writing (subject repeatedly writes name) and large (10-15× normal) name writing. In these studies normal and large size writing had a similar distribution of metabolic responses when compared to control studies. Activations (percent change from control) were in the range of 12-20% and occurred in the striatum bilaterally > contralateral Rolandic cortex > contralateral thalamus. No significant activations were observed in the ipsilateral thalamus, Rolandic cortex or cerebellum (supplementary motor cortex was not examined). The magnitude of the metabolic response in the striatum was greater with the large versus normal sized writing. This differential response may be due to an increased number and topographic distribution of neurons responding with the same average activity between tasks or an increase in the functional activity of the same neuronal population between the two tasks (present spatial resolution inadequate to differentiate). When subjects (N=3) performed novel sequential finger movements, the maximal metabolic response was in the contralateral Rolandic cortex > striatum. Such studies provide a means of exploring human motor system organization, motor learning and provide a basis for examining patients with motor system disorders

  17. Brain-computer interface analysis of a dynamic visuo-motor task.

    Science.gov (United States)

    Logar, Vito; Belič, Aleš

    2011-01-01

    The area of brain-computer interfaces (BCIs) represents one of the more interesting fields in neurophysiological research, since it investigates the development of the machines that perform different transformations of the brain's "thoughts" to certain pre-defined actions. Experimental studies have reported some successful implementations of BCIs; however, much of the field still remains unexplored. According to some recent reports the phase coding of informational content is an important mechanism in the brain's function and cognition, and has the potential to explain various mechanisms of the brain's data transfer, but it has yet to be scrutinized in the context of brain-computer interface. Therefore, if the mechanism of phase coding is plausible, one should be able to extract the phase-coded content, carried by brain signals, using appropriate signal-processing methods. In our previous studies we have shown that by using a phase-demodulation-based signal-processing approach it is possible to decode some relevant information on the current motor action in the brain from electroencephalographic (EEG) data. In this paper the authors would like to present a continuation of their previous work on the brain-information-decoding analysis of visuo-motor (VM) tasks. The present study shows that EEG data measured during more complex, dynamic visuo-motor (dVM) tasks carries enough information about the currently performed motor action to be successfully extracted by using the appropriate signal-processing and identification methods. The aim of this paper is therefore to present a mathematical model, which by means of the EEG measurements as its inputs predicts the course of the wrist movements as applied by each subject during the task in simulated or real time (BCI analysis). However, several modifications to the existing methodology are needed to achieve optimal decoding results and a real-time, data-processing ability. The information extracted from the EEG could
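    Phase demodulation of a narrow-band EEG component is typically done via the analytic signal. The sketch below is a generic illustration, not the authors' identification model: it band-passes one channel (an assumed mu/alpha band) and extracts its instantaneous phase with the Hilbert transform:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def instantaneous_phase(eeg, fs, band=(8.0, 12.0)):
    """Band-pass one EEG channel and return its unwrapped instantaneous phase (radians).

    eeg  : 1-D array, single-channel EEG
    fs   : sampling rate in Hz
    band : pass band in Hz (assumed mu/alpha band for illustration)
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    narrow = filtfilt(b, a, eeg)
    analytic = hilbert(narrow)             # analytic signal x + i*H{x}
    return np.unwrap(np.angle(analytic))   # the phase carries the "demodulated" content
```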

  18. On the development of a computer-based handwriting assessment tool to objectively quantify handwriting proficiency in children.

    Science.gov (United States)

    Falk, Tiago H; Tam, Cynthia; Schellnus, Heidi; Chau, Tom

    2011-12-01

    Standardized writing assessments such as the Minnesota Handwriting Assessment (MHA) can inform interventions for handwriting difficulties, which are prevalent among school-aged children. However, these tests usually involve the laborious task of subjectively rating the legibility of the written product, precluding their practical use in some clinical and educational settings. This study describes a portable computer-based handwriting assessment tool to objectively measure MHA quality scores and to detect handwriting difficulties in children. Several measures are proposed based on spatial, temporal, and grip force measurements obtained from a custom-built handwriting instrument. Thirty-five first and second grade students participated in the study, nine of whom exhibited handwriting difficulties. Students performed the MHA test and were subjectively scored based on speed and handwriting quality using five primitives: legibility, form, alignment, size, and space. Several spatial parameters are shown to correlate significantly with handwriting legibility and speed, respectively. Using only size and space parameters, promising discrimination between proficient and non-proficient handwriting can be achieved. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. X-ray computed tomography uncovers root-root interactions: quantifying spatial relationships between interacting root systems in three dimensions.

    Science.gov (United States)

    Paya, Alexander M; Silverberg, Jesse L; Padgett, Jennifer; Bauerle, Taryn L

    2015-01-01

    Research in the field of plant biology has recently demonstrated that inter- and intra-specific interactions belowground can dramatically alter root growth. Our aim was to answer questions related to the effect of inter- vs. intra-specific interactions on the growth and utilization of undisturbed space by fine roots within three dimensions (3D) using micro X-ray computed tomography. To achieve this, Populus tremuloides (quaking aspen) and Picea mariana (black spruce) seedlings were planted into containers as either solitary individuals, or inter-/intra-specific pairs, allowed to grow for 2 months, and 3D metrics developed in order to quantify their use of belowground space. In both aspen and spruce, inter-specific root interactions produced a shift in the vertical distribution of the root system volume, and deepened the average position of root tips when compared to intra-specifically growing seedlings. Inter-specific interactions also increased the minimum distance between root tips belonging to the same root system. There was no effect of belowground interactions on the radial distribution of roots, or the directionality of lateral root growth for either species. In conclusion, we found that significant differences were observed more often when comparing controls (solitary individuals) and paired seedlings (inter- or intra-specific), than when comparing inter- and intra-specifically growing seedlings. This would indicate that competition between neighboring seedlings was more responsible for shifting fine root growth in both species than was neighbor identity. However, significant inter- vs. intra-specific differences were observed, which further emphasizes the importance of biological interactions in competition studies.

  20. Quantifying differences between computational results and measurements in the case of a large-scale well-confined fire scenario

    International Nuclear Information System (INIS)

    Audouin, L.; Chandra, L.; Consalvi, J.-L.; Gay, L.; Gorza, E.; Hohm, V.; Hostikka, S.; Ito, T.; Klein-Hessling, W.; Lallemand, C.; Magnusson, T.; Noterman, N.; Park, J.S.; Peco, J.; Rigollet, L.; Suard, S.; Van-Hees, P.

    2011-01-01

    Research Highlights: → We performed a numerical benchmark in the framework of an OECD experimental program of a pool fire in a well-confined compartment. → The benchmark involves 17 participants using 8 fire models, 3 CFD and 5 zone models. → We investigated the capabilities of validation metrics for a real large-scale fire. → Six quantities were compared during the whole fire duration. → It is important to consider more than one metric for the validation process. - Abstract: The objective of this work was to quantify comparisons between several computational results and measurements performed during a pool fire scenario in a well-confined compartment. This collaborative work was initiated under the framework of the OECD fire research program and involves the most frequently used fire models in the fire community, including field and zone models. The experimental scenario was conducted at the French Institut de Radioprotection et de Surete Nucleaire (IRSN) and deals with a full-scale liquid pool fire in a confined and mechanically ventilated compartment representative for nuclear plants. The practical use of different metric operators and their ability to report the capabilities of fire models are presented. The quantitative comparisons between measurements and numerical results obtained from 'open' calculations concern six important quantities from a safety viewpoint: gas temperature, oxygen concentration, wall temperature, total heat flux, compartment pressure and ventilation flow rate during the whole fire duration. The results indicate that it is important to use more than one metric for the validation process in order to get information on the uncertainties associated with different aspects of fire safety.
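    One simple operator of the kind compared in such benchmarks is a normalized Euclidean distance between the computed and measured time series. The sketch below is a generic example of such a metric, not one of the specific operators used in the benchmark:

```python
import numpy as np

def normalized_euclidean_metric(simulated, measured):
    """Relative difference between a simulated and a measured time series,
    both sampled at the same instants; 0 means perfect agreement."""
    simulated, measured = np.asarray(simulated, float), np.asarray(measured, float)
    return np.linalg.norm(simulated - measured) / np.linalg.norm(measured)

# Example: gas temperature traces (degrees C) at matching times
print(normalized_euclidean_metric([20, 180, 240, 230], [20, 170, 250, 240]))
```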

  1. Quantifying Hyporheic Exchanges in a Large Scale River Reach Using Coupled 3-D Surface and Subsurface Computational Fluid Dynamics Simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn Edward; Bao, J; Huang, M; Hou, Z; Perkins, W; Harding, S; Titzler, S; Ren, H; Thorne, P; Suffield, S; Murray, C; Zachara, J

    2017-03-01

    Hyporheic exchange is a critical mechanism shaping hydrological and biogeochemical processes along a river corridor. Recent studies on quantifying the hyporheic exchange were mostly limited to local scales due to field inaccessibility, computational demand, and complexity of geomorphology and subsurface geology. Surface flow conditions and subsurface physical properties are well known factors on modulating the hyporheic exchange, but quantitative understanding of their impacts on the strength and direction of hyporheic exchanges at reach scales is absent. In this study, a high resolution computational fluid dynamics (CFD) model that couples surface and subsurface flow and transport is employed to simulate hyporheic exchanges in a 7-km long reach along the main-stem of the Columbia River. Assuming that the hyporheic exchange does not affect surface water flow conditions due to its negligible magnitude compared to the volume and velocity of river water, we developed a one-way coupled surface and subsurface water flow model using the commercial CFD software STAR-CCM+. The model integrates the Reynolds-averaged Navier-Stokes (RANS) equation solver with a realizable κ-ε two-layer turbulence model, a two-layer all y+ wall treatment, and the volume of fluid (VOF) method, and is used to simulate hyporheic exchanges by tracking the free water-air interface as well as flow in the river and the subsurface porous media. The model is validated against measurements from acoustic Doppler current profiler (ADCP) in the stream water and hyporheic fluxes derived from a set of temperature profilers installed across the riverbed. The validated model is then employed to systematically investigate how hyporheic exchanges are influenced by surface water fluid dynamics strongly regulated by upstream dam operations, as well as subsurface structures (e.g. thickness of riverbed and subsurface formation layers) and hydrogeological properties (e.g. permeability). The results

  2. Simultaneous Budget and Buffer Size Computation for Throughput-Constrained Task Graphs

    NARCIS (Netherlands)

    Wiggers, M.H.; Bekooij, Marco Jan Gerrit; Geilen, Marc C.W.; Basten, Twan

    Modern embedded multimedia systems process multiple concurrent streams of data processing jobs. Streams often have throughput requirements. These jobs are implemented on a multiprocessor system as a task graph. Tasks communicate data over buffers, where tasks wait on sufficient space in output

  3. Computation of Buffer Capacities for Throughput Constrained and Data Dependent Inter-Task Communication

    NARCIS (Netherlands)

    Wiggers, M.H.; Bekooij, Marco Jan Gerrit; Bekooij, Marco J.G.; Smit, Gerardus Johannes Maria

    2008-01-01

    Streaming applications are often implemented as task graphs. Currently, techniques exist to derive buffer capacities that guarantee satisfaction of a throughput constraint for task graphs in which the inter-task communication is data-independent, i.e. the amount of data produced and consumed is

  4. Interfractional Position Variation of Pancreatic Tumors Quantified Using Intratumoral Fiducial Markers and Daily Cone Beam Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Horst, Astrid van der, E-mail: a.vanderhorst@amc.uva.nl [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Wognum, Silvia; Dávila Fajardo, Raquel; Jong, Rianne de [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Hooft, Jeanin E. van; Fockens, Paul [Department of Gastroenterology and Hepatology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Tienhoven, Geertjan van; Bel, Arjan [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands)

    2013-09-01

    Purpose: The aim of this study was to quantify interfractional pancreatic position variation using fiducial markers visible on daily cone beam computed tomography (CBCT) scans. In addition, we analyzed possible migration of the markers to investigate their suitability for tumor localization. Methods and Materials: For 13 pancreatic cancer patients with implanted Visicoil markers, CBCT scans were obtained before 17 to 25 fractions (300 CBCTs in total). Image registration with the reference CT was used to determine the displacement of the 2 to 3 markers relative to bony anatomy and to each other. We analyzed the distance between marker pairs as a function of time to identify marker registration error (SD of linear fit residuals) and possible marker migration. For each patient, we determined the mean displacement of markers relative to the reference CT (systematic position error) and the spread in displacements (random position error). From this, we calculated the group systematic error, Σ, and group random error, σ. Results: Marker pair distances showed slight trends with time (range, −0.14 to 0.14 mm/day), possibly due to tissue deformation, but no shifts that would indicate marker migration. The mean SD of the fit residuals was 0.8 mm. We found large interfractional position variations, with a 3-dimensional vector displacement of >10 mm for 116 of 300 (39%) fractions. The spread in displacement varied significantly (P<.01) between patients, from a vector range of 9.1 mm to one of 24.6 mm. For the patient group, Σ was 3.8, 6.6, and 3.5 mm; and σ was 3.6, 4.7 and 2.5 mm, in left–right, superior–inferior, and anterior–posterior directions, respectively. Conclusions: We found large systematic displacements of the fiducial markers relative to bony anatomy, in addition to wide distributions of displacement. These results for interfractional position variation confirm the potential benefit of using fiducial markers rather than bony anatomy for daily online
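    The group statistics quoted above follow the usual convention in radiotherapy setup-error analysis: each patient's mean displacement over fractions is the systematic error and the SD over fractions is the random error; Σ is the SD of the patient means and σ the root mean square of the patient SDs. A minimal sketch under that assumption, for one axis:

```python
import numpy as np

def group_errors(displacements_per_patient):
    """displacements_per_patient: list of arrays, one per patient, each holding
    the per-fraction marker displacement (mm) along one axis."""
    patient_means = np.array([d.mean() for d in displacements_per_patient])       # systematic errors
    patient_sds = np.array([d.std(ddof=1) for d in displacements_per_patient])    # random errors
    group_systematic = patient_means.std(ddof=1)        # Sigma: SD of per-patient means
    group_random = np.sqrt(np.mean(patient_sds ** 2))   # sigma: RMS of per-patient SDs
    return group_systematic, group_random
```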

  5. Hybrid EEG-fNIRS Asynchronous Brain-Computer Interface for Multiple Motor Tasks.

    Directory of Open Access Journals (Sweden)

    Alessio Paolo Buccino

    Full Text Available Non-invasive Brain-Computer Interfaces (BCIs) have demonstrated great promise for neuroprosthetics and assistive devices. Here we aim to investigate methods to combine electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) in an asynchronous sensorimotor rhythm (SMR)-based BCI. We attempted to classify four different executed movements, namely right-arm, left-arm, right-hand and left-hand tasks. Previous studies have demonstrated the benefit of the EEG-fNIRS combination. However, since the fNIRS hemodynamic response normally shows a long delay, we investigated new features, involving slope indicators, in order to detect changes in the signals immediately. Moreover, Common Spatial Patterns (CSPs) have been applied to both EEG and fNIRS signals. Fifteen healthy subjects took part in the experiments and, since 25 trials per class were available, the CSPs were regularized with information from the entire population of participants and optimized using genetic algorithms. The different features were compared in terms of performance, and the dynamic accuracy over trials shows that the introduced methods diminish the fNIRS delay in the detection of changes.
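    Common Spatial Patterns, as referred to above, can be computed from class-wise covariance matrices via a generalized eigendecomposition. The sketch below shows a plain (optionally shrinkage-regularized) CSP for two classes; it is not the genetic-algorithm-optimized, population-informed variant used in the study:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, shrinkage=0.0, n_filters=6):
    """trials_*: arrays of shape (n_trials, n_channels, n_samples) for two classes.
    Returns spatial filters as columns of an (n_channels, n_filters) matrix."""
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)

    n_ch = trials_a.shape[1]
    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    if shrinkage > 0:  # simple shrinkage toward the identity (assumed regularization form)
        ca = (1 - shrinkage) * ca + shrinkage * np.trace(ca) / n_ch * np.eye(n_ch)
        cb = (1 - shrinkage) * cb + shrinkage * np.trace(cb) / n_ch * np.eye(n_ch)
    # Generalized eigenproblem: Ca w = lambda (Ca + Cb) w
    eigvals, eigvecs = eigh(ca, ca + cb)
    order = np.argsort(eigvals)
    # Keep the filters that maximally discriminate the two classes (both ends of the spectrum)
    picks = np.concatenate([order[: n_filters // 2], order[-(n_filters // 2):]])
    return eigvecs[:, picks]
```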

  6. Resistance to change and resurgence in humans engaging in a computer task.

    Science.gov (United States)

    Kuroda, Toshikazu; Cançado, Carlos R X; Podlesnik, Christopher A

    2016-04-01

    The relation between persistence, as measured by resistance to change, and resurgence has been examined with nonhuman animals but not systematically with humans. The present study examined persistence and resurgence with undergraduate students engaging in a computer task for points exchangeable for money. In Phase 1, a target response was maintained on a multiple variable-interval (VI) 15-s (Rich) VI 60-s (Lean) schedule of reinforcement. In Phase 2, the target response was extinguished while an alternative response was reinforced at equal rates in both schedule components. In Phase 3, the target and the alternative responses were extinguished. In an additional test of persistence (Phase 4), target responding was reestablished as in Phase 1 and then disrupted by access to videos in both schedule components. In Phases 2 and 4, target responding was more persistent in the Rich than in the Lean component. Also, resurgence generally was greater in the Rich than in the Lean component in Phase 3. The present findings with humans extend the generality of those obtained with nonhuman animals showing that higher reinforcement rates produce both greater persistence and resurgence, and suggest that common processes underlie response persistence and relapse. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Quality assurance for computed-tomography simulators and the computed-tomography-simulation process: Report of the AAPM Radiation Therapy Committee Task Group No. 66

    International Nuclear Information System (INIS)

    Mutic, Sasa; Palta, Jatinder R.; Butker, Elizabeth K.; Das, Indra J.; Huq, M. Saiful; Loo, Leh-Nien Dick; Salter, Bill J.; McCollough, Cynthia H.; Van Dyk, Jacob

    2003-01-01

    This document presents recommendations of the American Association of Physicists in Medicine (AAPM) for quality assurance of computed-tomography (CT) simulators and the CT-simulation process. This report was prepared by Task Group No. 66 of the AAPM Radiation Therapy Committee. It was approved by the Radiation Therapy Committee and by the AAPM Science Council

  8. Teaching Sustainable Process Design Using 12 Systematic Computer-Aided Tasks

    DEFF Research Database (Denmark)

    Babi, Deenesh K.

    2015-01-01

    In this paper a task-based approach for teaching (sustainable) process design to students pursuing a degree in chemical and biochemical engineering is presented. In tasks 1-3 the student makes design decisions for product and process selection, followed by simple and rigorous model simulations (tasks 4-7) and then sizing, costing and economic analysis of the designed process (tasks 8-9). This produces a base case design. In tasks 10-12, the student explores opportunities for heat and/or mass integration, followed by a sustainability analysis, in order to evaluate the base case design and set targets for further improvement. Finally, a process optimization problem is formulated and solved to obtain the more sustainable process design. The 12 tasks are explained in terms of the input and output of each task, and examples of the application of this approach in an MSc-level course are reported.

  9. The Effect of Motor Difficulty on the Acquisition of a Computer Task: A Comparison between Young and Older Adults

    Science.gov (United States)

    Fezzani, K.; Albinet, C.; Thon, B.; Marquie, J. -C.

    2010-01-01

    The present study investigated the extent to which the impact of motor difficulty on the acquisition of a computer task varies as a function of age. Fourteen young and 14 older participants performed 352 sequences of 10 serial pointing movements with a wireless pen on a digitiser tablet. A conditional probabilistic structure governed the…

  10. Bridges to Swaziland: Using Task-Based Learning and Computer-Mediated Instruction to Improve English Language Teaching and Learning

    Science.gov (United States)

    Pierson, Susan Jacques

    2015-01-01

    One way to provide high quality instruction for underserved English Language Learners around the world is to combine Task-Based English Language Learning with Computer-Assisted Instruction. As part of an ongoing project, "Bridges to Swaziland," these approaches have been implemented in a determined effort to improve the ESL program for…

  11. Learning Style and Task Performance in Synchronous Computer-Mediated Communication: A Case Study of Iranian EFL Learners

    Science.gov (United States)

    Hedayati, Mohsen; Foomani, Elham Mohammadi

    2015-01-01

    The study reported here explores whether English as a foreign Language (EFL) learners' preferred ways of learning (i.e., learning styles) affect their task performance in computer-mediated communication (CMC). As Ellis (2010) points out, while the increasing use of different sorts of technology is witnessed in language learning contexts, it is…

  12. Reasoning Abilities in Primary School: A Pilot Study on Poor Achievers vs. Normal Achievers in Computer Game Tasks

    Science.gov (United States)

    Dagnino, Francesca Maria; Ballauri, Margherita; Benigno, Vincenza; Caponetto, Ilaria; Pesenti, Elia

    2013-01-01

    This paper presents the results of preliminary research on the assessment of reasoning abilities in primary school poor achievers vs. normal achievers using computer game tasks. Subjects were evaluated by means of cognitive assessment on logical abilities and academic skills. The aim of this study is to better understand the relationship between…

  13. Waiting is the hardest part: comparison of two computational strategies for performing a compelled-response task

    Directory of Open Access Journals (Sweden)

    Emilio Salinas

    2010-12-01

    Full Text Available The neural basis of choice behavior is commonly investigated with tasks in which a subject analyzes a stimulus and reports his or her perceptual experience with an appropriate motor action. We recently developed a novel task, the compelled-saccade task, with which the influence of the sensory information on the subject's choice can be tracked through time with millisecond resolution, thus providing a new tool for correlating neuronal activity and behavior. This paradigm has a crucial feature: the signal that instructs the subject to make an eye movement is given before the cue that indicates which of two possible choices is the correct one. Previously, we found that psychophysical performance in this task could be accurately replicated by a model in which two developing oculomotor plans race to a threshold and the incoming perceptual information differentially accelerates their trajectories toward it. However, the task design suggests an alternative mechanism: instead of modifying an ongoing oculomotor plan on the fly as the sensory information becomes available, the subject could try to wait, withholding the oculomotor response until the sensory cue is revealed. Here, we use computer simulations to explore and compare the performance of these two types of models. We find that both reproduce the main features of the psychophysical data in the compelled-saccade task, but they give rise to distinct behavioral and neurophysiological predictions. Although, superficially, the waiting model is intuitively appealing, it is ultimately inconsistent with experimental results from this and other tasks.
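    A toy version of the race model described here, in which two motor plans build toward a threshold and the arrival of the cue accelerates the correct plan while decelerating the other, can be simulated in a few lines. The rates, noise level and threshold below are arbitrary illustration choices, not fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def compelled_saccade_trial(gap_ms=100, threshold=1.0, dt=1.0,
                            build_rate=0.004, accel=0.01, noise=0.02):
    """Simulate one trial; returns (reaction_time_ms, correct)."""
    x = np.zeros(2)      # activation of the two competing oculomotor plans
    t = 0.0
    target = 0           # index of the correct choice
    while x.max() < threshold:
        rate = np.full(2, build_rate)
        if t >= gap_ms:                  # cue has arrived: accelerate the correct plan,
            rate[target] += accel        # decelerate the other one
            rate[1 - target] -= accel
        x += rate * dt + noise * np.sqrt(dt) * rng.standard_normal(2)
        x = np.maximum(x, 0.0)
        t += dt
    return t, int(x.argmax() == target)

rts, hits = zip(*(compelled_saccade_trial() for _ in range(1000)))
print("mean RT %.0f ms, accuracy %.2f" % (np.mean(rts), np.mean(hits)))
```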

  14. Task-Driven Optimization of Fluence Field and Regularization for Model-Based Iterative Reconstruction in Computed Tomography.

    Science.gov (United States)

    Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster

    2017-12-01

    This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective design of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index (d′) throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributes fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all strategies, the task-driven FFM strategy not only improved the minimum d′ by at least 17.8%, but also yielded higher d′ over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on the computed d′. The improvements in detectability show the potential of the task-driven imaging framework to improve imaging performance at a fixed dose, or, equivalently, to provide a similar level of performance at reduced dose.

  15. A new algorithm for histopathological diagnosis of periprosthetic infection using CD15 focus score and computer program CD15 Quantifier

    Directory of Open Access Journals (Sweden)

    V. Krenn

    2015-01-01

    Full Text Available Introduction. A simple microscopic diagnostic quantification system for neutrophil granulocytes (NG) was developed, evaluating a single focal point (CD15 focus score), which enables the detection of bacterial infection in SLIM (synovial-like interface membrane). Additionally, a diagnostic algorithm is proposed for how to use the CD15 focus score and the quantification software (CD15 Quantifier). Methods. 91 SLIM specimens removed during revision surgery for histopathological diagnosis (hip: n=59; knee: n=32) underwent histopathological classification according to the SLIM consensus classification. NG were identified immunohistochemically by means of a CD15-specific monoclonal antibody exhibiting an intense granular cytoplasmic staining pattern. This pattern is different from CD15 expression in macrophages, which show a pale and homogeneous expression in mononuclear cells. The quantitative evaluation of CD15-positive neutrophil granulocytes (CD15NG) used the principle of maximum focal infiltration (focus) together with an assessment of a single focal point (approximately 0.3 mm2). These immunohistochemical data made it possible to develop the CD15 Quantifier software, which automatically quantifies CD15NG. Results. SLIM cases with a positive microbiological diagnosis (n=47) have significantly (p<0.001, Mann-Whitney U test) more CD15NG per focal point than cases with a negative microbiological diagnosis (n=44). 50 CD15NG per focal point were identified as the optimum threshold for diagnosing infection of periprosthetic joints using the CD15 focus score. If the microbiological findings are used as the ‘gold standard’, the diagnostic sensitivity is 0.83 and the specificity is 0.864 (PPV: 0.87; NPV: 0.83; accuracy: 0.846; AUC: 0.878). The evaluation findings for the preparations using the CD15 Quantifier (n=31) deviated on average by 12 cells from the histopathological evaluation findings (CD15 focus score). For cell counts greater than 62, the CD15 Quantifier needs on average 32 seconds less than the
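    The diagnostic figures reported above follow from a 2×2 confusion table at the 50-cell cut-off, with the microbiological result as the reference standard. A generic sketch of that computation (the direction of the cut-off, counts at or above the threshold classified as infected, is an assumption):

```python
def diagnostic_metrics(cd15_counts, infected, threshold=50):
    """cd15_counts: CD15-positive neutrophils per focal point for each case.
    infected: booleans from microbiology (the reference standard)."""
    tp = sum(c >= threshold and i for c, i in zip(cd15_counts, infected))
    fp = sum(c >= threshold and not i for c, i in zip(cd15_counts, infected))
    fn = sum(c < threshold and i for c, i in zip(cd15_counts, infected))
    tn = sum(c < threshold and not i for c, i in zip(cd15_counts, infected))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```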

  16. A computational study of whole-brain connectivity in resting state and task fMRI

    Science.gov (United States)

    Goparaju, Balaji; Rana, Kunjan D.; Calabro, Finnegan J.; Vaina, Lucia Maria

    2014-01-01

    Background We compared the functional brain connectivity produced during resting-state in which subjects were not actively engaged in a task with that produced while they actively performed a visual motion task (task-state). Material/Methods In this paper we employed graph-theoretical measures and network statistics in novel ways to compare, in the same group of human subjects, functional brain connectivity during resting-state fMRI with brain connectivity during performance of a high level visual task. We performed a whole-brain connectivity analysis to compare network statistics in resting and task states among anatomically defined Brodmann areas to investigate how brain networks spanning the cortex changed when subjects were engaged in task performance. Results In the resting state, we found strong connectivity among the posterior cingulate cortex (PCC), precuneus, medial prefrontal cortex (MPFC), lateral parietal cortex, and hippocampal formation, consistent with previous reports of the default mode network (DMN). The connections among these areas were strengthened while subjects actively performed an event-related visual motion task, indicating a continued and strong engagement of the DMN during task processing. Regional measures such as degree (number of connections) and betweenness centrality (number of shortest paths), showed that task performance induces stronger inter-regional connections, leading to a denser processing network, but that this does not imply a more efficient system as shown by the integration measures such as path length and global efficiency, and from global measures such as small-worldness. Conclusions In spite of the maintenance of connectivity and the “hub-like” behavior of areas, our results suggest that the network paths may be rerouted when performing the task condition. PMID:24947491
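    The regional and global graph measures named above (degree, betweenness centrality, characteristic path length, global efficiency) can be computed once a thresholded connectivity matrix is turned into a graph. A generic sketch with NetworkX, not the authors' pipeline; the threshold value is an arbitrary illustration choice:

```python
import numpy as np
import networkx as nx

def graph_metrics(connectivity, threshold=0.3):
    """connectivity: symmetric matrix of functional connectivity between regions."""
    adj = (np.abs(connectivity) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    g = nx.from_numpy_array(adj)
    return {
        "degree": dict(g.degree()),
        "betweenness": nx.betweenness_centrality(g),
        "char_path_length": nx.average_shortest_path_length(g),  # requires a connected graph
        "global_efficiency": nx.global_efficiency(g),
    }
```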

  17. Evaluation of a modified Fitts law brain-computer interface target acquisition task in able and motor disabled individuals

    Science.gov (United States)

    Felton, E. A.; Radwin, R. G.; Wilson, J. A.; Williams, J. C.

    2009-10-01

    A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals.
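    The Fitts' law quantities used in such an analysis take a simple form. The sketch below uses the Shannon formulation of the index of difficulty, which may differ from the exact variant adopted in the study:

```python
import numpy as np

def index_of_difficulty(distance, width):
    """Shannon formulation: ID = log2(D / W + 1), in bits."""
    return np.log2(distance / width + 1.0)

def information_transfer_rate(distance, width, movement_time_s):
    """Throughput in bits per second for one target acquisition."""
    return index_of_difficulty(distance, width) / movement_time_s

print(information_transfer_rate(distance=8.0, width=2.0, movement_time_s=1.5))  # ~1.55 bits/s
```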

  18. Motivation and performance within a collaborative computer-based modeling task: Relations between students' achievement goal orientation, self-efficacy, cognitive processing and achievement.

    NARCIS (Netherlands)

    Sins, Patrick H.M.; van Joolingen, Wouter; Savelsbergh, Elwin R.; van Hout-Wolters, Bernadette

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In

  19. Motivation and performance within a collaborative computer-based modeling task: Relations between students' achievement goal orientation, self-efficacy, cognitive processing and achievement

    NARCIS (Netherlands)

    Sins, P.H.M.; van Joolingen, W.R.; Savelsbergh, E.R.; van Hout-Wolters, B.H.A.M.

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In

  20. Quantifying vertical stress transmission and compaction-induced soil structure using sensor mat and X-ray computed tomography

    DEFF Research Database (Denmark)

    Naveed, Muhammad; Schjønning, Per; Keller, Thomas

    2016-01-01

    Accurate estimation of stress transmission in soil and quantification of compaction-induced soil pore structure is important for efficient soil use and management. Continuum mechanics have so far mostly been applied for agricultural soils, even if topsoil structure is aggregated due to regular tillage. In this study, partially confined uniaxial compression tests were carried out on intact topsoil columns placed on subsoil columns. Two methods were employed for estimation of stress transmission in soil: (i) soil deformation patterns were quantified using X-ray CT and converted to stress distributions, and (ii) a tactile sensor mat was employed for measuring stresses at the interface of the topsoil and subsoil columns. The resulting soil pore structure under applied stresses was quantified using X-ray CT and by air-permeability measurements. In topsoil discrete stress transmission patterns were

  1. Developing and testing a computer vision method to quantify 3D movements of bottom-set gillnets on the seabed

    DEFF Research Database (Denmark)

    Savina, Esther; Krag, Ludvig Ahm; Madsen, Niels

    2018-01-01

    Gillnets are one of the most widely used fishing gears, but there is limited knowledge about their habitat effects, partly due to the lack of methodology to quantify such effects. A stereo imaging method was identified and adapted to quantify the dynamic behaviour of gillnets in situ. Using two cameras, gillnets deployed in sandy habitats in the Danish coastal plaice fishery were assessed. The direct physical disruption of the seabed was minimal, as the leadline was not penetrating into the seabed. Direct damage to the benthos could, however, originate from the sweeping movements of the nets. Although the general perception is that heavy gears are more destructive to the habitat, light nets were moving significantly more than heavy ones. The established methodology could be further applied to assess the in situ dynamic behaviour of other static gears.

  2. Primary or secondary tasks? Dual-task interference between cyclist hazard perception and cadence control using cross-modal sensory aids with rider assistance bike computers.

    Science.gov (United States)

    Yang, Chao-Yang; Wu, Cheng-Tse

    2017-03-01

    This research investigated the risks involved in bicycle riding while using various sensory modalities to deliver training information. To understand the risks associated with using bike computers, this study evaluated hazard perception performance through lab-based simulations of authentic riding conditions. Analysing hazard sensitivity (d') of signal detection theory, the rider's response time, and eye glances provided insights into the risks of using bike computers. In this study, 30 participants were tested with eight hazard perception tasks while they maintained a cadence of 60 ± 5 RPM and used bike computers with different sensory displays, namely visual, auditory, and tactile feedback signals. The results indicated that synchronously using different sense organs to receive cadence feedback significantly affects hazard perception performance; direct visual information leads to the worst rider distraction, with a mean sensitivity to hazards (d') of -1.03. For systems with multiple interacting sensory aids, auditory aids were found to result in the greatest reduction in sensitivity to hazards (d' mean = -0.57), whereas tactile sensory aids reduced the degree of rider distraction (d' mean = -0.23). Our work complements existing work in this domain by advancing the understanding of how to design devices that deliver information subtly, thereby preventing disruption of a rider's perception of road hazards. Copyright © 2016 Elsevier Ltd. All rights reserved.
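    Hazard sensitivity d' in signal detection theory is computed from hit and false-alarm rates. A minimal sketch, using a standard log-linear correction for rates of 0 or 1 (which may differ from the correction used in the study):

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction avoids infinite z-scores when a rate is exactly 0 or 1
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

print(d_prime(hits=6, misses=2, false_alarms=3, correct_rejections=5))
```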

  3. Computer-mediated communication and time pressure induce higher cardiovascular responses in the preparatory and execution phases of cooperative tasks.

    Science.gov (United States)

    Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia

    2010-11-01

    The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and with different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.

  4. A multilevel modeling approach to examining individual differences in skill acquisition for a computer-based task.

    Science.gov (United States)

    Nair, Sankaran N; Czaja, Sara J; Sharit, Joseph

    2007-06-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies.
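    Multilevel (mixed-effects) models of this kind, with repeated practice days nested within participants, can be fitted with standard libraries. The sketch below uses synthetic data and hypothetical variable names, not the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_participants, n_days = 20, 4

# Hypothetical long-format data: one row per participant per practice day
rows = []
for pid in range(n_participants):
    age = int(rng.integers(50, 81))
    baseline = 3.0 - 0.02 * (age - 50) + rng.normal(0, 0.3)
    for day in range(1, n_days + 1):
        rows.append({"participant": pid, "day": day, "age": age,
                     "quality": baseline + 0.25 * day + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# Random intercept per participant; practice day, age and their interaction as fixed effects
model = smf.mixedlm("quality ~ day * age", df, groups=df["participant"])
print(model.fit().summary())
```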

  5. The Nicest way to migrate your Windows computer (The Windows 2000 Migration Task Force)

    CERN Document Server

    2001-01-01

    With Windows 2000, CERN users will discover a more stable and reliable working environment and will have access to all the latest applications. The Windows 2000 Migration Task Force comprises a representative from each division.

  6. Computational Modeling of Human Multiple-Task Performance and Mental Workload

    National Research Council Canada - National Science Library

    Meyer, David

    2004-01-01

    ... (Executive-Process/Interactive Control) was developed, applied to several types of tasks to accurately represent human performance, and inspired the collection of new data that cast new light on the scientific analysis of key phenomena...

  7. Towards Better Computational Models of the Balance Scale Task: A Reply to Shultz and Takane

    Science.gov (United States)

    van der Maas, Han L. J.; Quinlan, Philip T.; Jansen, Brenda R. J.

    2007-01-01

    In contrast to Shultz and Takane [Shultz, T.R., & Takane, Y. (2007). Rule following and rule use in the balance-scale task. "Cognition", in press, doi:10.1016/j.cognition.2006.12.004.] we do not accept that the traditional Rule Assessment Method (RAM) of scoring responses on the balance scale task has advantages over latent class analysis (LCA):…

  8. Analysis of the priority of anatomic structures according to the diagnostic task in cone-beam computed tomographic images

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jin Woo [Dept. of Oral and Maxillofacial Radiology, Dankook University College of Dentistry, Chunan (Korea, Republic of)

    2016-12-15

    This study was designed to evaluate differences in the required visibility of anatomic structures according to the diagnostic tasks of implant planning and periapical diagnosis. Images of a real skull phantom were acquired under 24 combinations of different exposure conditions in a cone-beam computed tomography scanner (60, 70, 80, 90, 100, and 110 kV and 4, 6, 8, and 10 mA). Five radiologists evaluated the visibility of anatomic structures and the image quality for diagnostic tasks using a 6-point scale. The visibility of the periodontal ligament space showed the closest association with the ability to use an image for periapical diagnosis in both jaws. The visibility of the sinus floor and canal wall showed the closest association with the ability to use an image for implant planning. Variations in tube voltage were associated with significant differences in image quality for all diagnostic tasks. However, tube current did not show significant associations with the ability to use an image for implant planning. The required visibility of anatomic structures varied depending on the diagnostic task. Tube voltage was a more important exposure parameter for image quality than tube current. Different settings should be used for optimization and image quality evaluation depending on the diagnostic task.

  9. Quantifying Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Quantifying Matter explains how scientists learned to measure matter and quantify some of its most fascinating and useful properties. It presents many of the most important intellectual achievements and technical developments that led to the scientific interpretation of substance. Complete with full-color photographs, this exciting new volume describes the basic characteristics and properties of matter. Chapters include: Exploring the Nature of Matter; The Origin of Matter; The Search for Substance; Quantifying Matter During the Scientific Revolution; Understanding Matter's Electromagnet

  10. Storyboarding: A Method for Bootstrapping the Design of Computer-Based Educational Tasks

    Science.gov (United States)

    Jones, Ian

    2008-01-01

    There has been a recent call for the use of more systematic thought experiments when investigating learning. This paper presents a storyboarding method for capturing and sharing initial ideas and their evolution in the design of a mathematics learning task. The storyboards produced can be considered as "virtual data" created by thought experiments…

  11. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
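    Task network analysis of the kind described here chains task elements with stochastic durations and error probabilities and estimates outcomes by Monte Carlo simulation. A toy sketch in which the task names, durations and error rates are invented for illustration (the study itself used commercial discrete-event simulation software):

```python
import random

# Hypothetical task network for a manual fuel-handling procedure:
# (name, mean duration in s, SD of duration, probability of human error in this step)
TASKS = [
    ("walk to canal", 40, 5, 0.000),
    ("attach handling tool", 60, 15, 0.002),
    ("lift and move element", 120, 30, 0.005),
    ("inspect element", 90, 20, 0.003),
    ("return element to rack", 80, 20, 0.004),
]

def simulate_procedure():
    total_time, error = 0.0, False
    for _, mean, sd, p_err in TASKS:
        total_time += max(0.0, random.gauss(mean, sd))
        error = error or (random.random() < p_err)
    return total_time, error

random.seed(0)
runs = [simulate_procedure() for _ in range(100_000)]
times, errors = zip(*runs)
print("mean completion time: %.0f s" % (sum(times) / len(times)))
print("probability of at least one error: %.4f" % (sum(errors) / len(errors)))
```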

  12. The impact of goal-oriented task design on neurofeedback learning for brain-computer interface control.

    Science.gov (United States)

    McWhinney, S R; Tremblay, A; Boe, S G; Bardouille, T

    2018-02-01

    Neurofeedback training teaches individuals to modulate brain activity by providing real-time feedback and can be used for brain-computer interface control. The present study aimed to optimize training by maximizing engagement through goal-oriented task design. Participants were shown either a visual display or a robot, where each was manipulated using motor imagery (MI)-related electroencephalography signals. Those with the robot were instructed to quickly navigate grid spaces, as the potential for goal-oriented design to strengthen learning was central to our investigation. Both groups were hypothesized to show increased magnitude of these signals across 10 sessions, with the greatest gains being seen in those navigating the robot due to increased engagement. Participants demonstrated the predicted increase in magnitude, with no differentiation between hemispheres. Participants navigating the robot showed stronger left-hand MI increases than those with the computer display. This is likely due to success being reliant on maintaining strong MI-related signals. While older participants showed stronger signals in early sessions, this trend later reversed, suggesting greater natural proficiency but reduced flexibility. These results demonstrate capacity for modulating neurofeedback using MI over a series of training sessions, using tasks of varied design. Importantly, the more goal-oriented robot control task resulted in greater improvements.

  13. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L. [Oak Ridge National Lab., TN (United States); Sartori, E. [OCDE/OECD NEA Data Bank, Issy-les-Moulineaux (France); Viedma, L.G. de [Consejo de Seguridad Nuclear, Madrid (Spain)

    1997-06-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management.

  14. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    International Nuclear Information System (INIS)

    Kirk, B.L.; Sartori, E.; Viedma, L.G. de

    1997-01-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management

  15. Single-cell mechanics--An experimental-computational method for quantifying the membrane-cytoskeleton elasticity of cells.

    Science.gov (United States)

    Tartibi, M; Liu, Y X; Liu, G-Y; Komvopoulos, K

    2015-11-01

    The membrane-cytoskeleton system plays a major role in cell adhesion, growth, migration, and differentiation. F-actin filaments, cross-linkers, binding proteins that bundle F-actin filaments to form the actin cytoskeleton, and integrins that connect the actin cytoskeleton network to the cell plasma membrane and extracellular matrix are major cytoskeleton constituents. Thus, the cell cytoskeleton is a complex composite that can assume different shapes. Atomic force microscopy (AFM)-based techniques have been used to measure cytoskeleton material properties without much attention to cell shape. A recently developed surface chemical patterning method for long-term single-cell culture was used to seed individual cells on circular patterns. A continuum-based cell model, which uses as input the force-displacement response obtained with a modified AFM setup and relates the membrane-cytoskeleton elastic behavior to the cell geometry, while treating all other subcellular components suspended in the cytoplasmic liquid (gel) as an incompressible fluid, is presented and validated by experimental results. The developed analytical-experimental methodology establishes a framework for quantifying the membrane-cytoskeleton elasticity of live cells. This capability may have immense implications in cell biology, particularly in studies seeking to establish correlations between membrane-cytoskeleton elasticity and cell disease, mortality, differentiation, and migration, and provide insight into cell infiltration through nonwoven fibrous scaffolds. The present method can be further extended to analyze membrane-cytoskeleton viscoelasticity, examine the role of other subcellular components (e.g., nucleus envelope) in cell elasticity, and elucidate the effects of mechanical stimuli on cell differentiation and motility. This is the first study to decouple the membrane-cytoskeleton elasticity from cell stiffness and introduce an effective approach for measuring the elastic modulus. The

  16. Dependence of Computational Models on Input Dimension: Tractability of Approximation and Optimization Tasks

    Czech Academy of Sciences Publication Activity Database

    Kainen, P.C.; Kůrková, Věra; Sanguineti, M.

    2012-01-01

    Roč. 58, č. 2 (2012), s. 1203-1214 ISSN 0018-9448 R&D Projects: GA MŠk(CZ) ME10023; GA ČR GA201/08/1744; GA ČR GAP202/11/1368 Grant - others:CNR-AV ČR(CZ-IT) Project 2010–2012 Complexity of Neural -Network and Kernel Computational Models Institutional research plan: CEZ:AV0Z10300504 Keywords : dictionary-based computational models * high-dimensional approximation and optimization * model complexity * polynomial upper bounds Subject RIV: IN - Informatics, Computer Science Impact factor: 2.621, year: 2012

  17. Sensor Mania! The Internet of Things, Wearable Computing, Objective Metrics, and the Quantified Self 2.0

    Directory of Open Access Journals (Sweden)

    Melanie Swan

    2012-11-01

    Full Text Available The number of devices on the Internet exceeded the number of people on the Internet in 2008, and is estimated to reach 50 billion in 2020. A wide-ranging Internet of Things (IOT) ecosystem is emerging to support the process of connecting real-world objects like buildings, roads, household appliances, and human bodies to the Internet via sensors and microprocessor chips that record and transmit data such as sound waves, temperature, movement, and other variables. The explosion in Internet-connected sensors means that new classes of technical capability and application are being created. More granular 24/7 quantified monitoring is leading to a deeper understanding of the internal and external worlds encountered by humans. New data literacy behaviors such as correlation assessment, anomaly detection, and high-frequency data processing are developing as humans adapt to the different kinds of data flows enabled by the IOT. The IOT ecosystem has four critical functional steps: data creation, information generation, meaning-making, and action-taking. This paper provides a comprehensive review of the current and rapidly emerging ecosystem of the Internet of Things (IOT).

  18. Task-Oriented Training with Computer Games for People with Rheumatoid Arthritis or Hand Osteoarthritis: A Feasibility Randomized Controlled Trial.

    Science.gov (United States)

    Srikesavan, Cynthia Swarnalatha; Shay, Barbara; Szturm, Tony

    2016-09-13

    To examine the feasibility of a clinical trial comparing a novel, home-based, task-oriented training programme with conventional hand exercises in people with rheumatoid arthritis or hand osteoarthritis. To explore the experiences of participants who completed their respective home exercise programmes. Thirty volunteer participants aged between 30 and 60 years and diagnosed with rheumatoid arthritis or hand osteoarthritis were proposed for a single-center, assessor-blinded, randomized controlled trial (ClinicalTrials.gov: NCT01635582). Participants received task-oriented training with interactive computer games and objects of daily life or finger mobility and strengthening exercises. Both programmes were home based and were done in four sessions per week, 20 minutes per session, for 6 weeks. Major feasibility outcomes were number of volunteers screened, randomized, and retained; completion of blinded assessments, exercise training, and home exercise sessions; equipment and data management; and clinical outcomes of hand function. Reaching the recruitment target in 18 months and achieving exercise compliance >80% were set as success criteria. Concurrent with the trial, focus group interviews explored experiences of those participants who completed their respective programmes. After trial initiation, revisions in inclusion criteria were required to promote recruitment. A total of 17 participants were randomized and 15 were retained. Completion of assessments, exercise training, and home exercise sessions; equipment and data collection and management demonstrated excellent feasibility. Both groups improved in hand function outcomes and exercise compliance was above 85%. Participants perceived both programmes as appropriate and acceptable. Participants who completed task-oriented training also agreed that playing different computer games was enjoyable, engaging, and motivating. Findings demonstrate initial evidence on recruitment, feasibility of trial procedures, and acceptability of

  19. Task-oriented comparison of power spectral density estimation methods for quantifying acoustic attenuation in diagnostic ultrasound using a reference phantom method.

    Science.gov (United States)

    Rosado-Mendez, Ivan M; Nam, Kibo; Hall, Timothy J; Zagzebski, James A

    2013-07-01

    Reported here is a phantom-based comparison of methods for determining the power spectral density (PSD) of ultrasound backscattered signals. Those power spectral density values are then used to estimate parameters describing α(f), the frequency dependence of the acoustic attenuation coefficient. Phantoms were scanned with a clinical system equipped with a research interface to obtain radiofrequency echo data. Attenuation, modeled as a power law α(f) = α0·f^β, was estimated using a reference phantom method. The power spectral density was estimated using the short-time Fourier transform (STFT), Welch's periodogram, and Thomson's multitaper technique, and performance was analyzed when limiting the size of the parameter-estimation region. Errors were quantified by the bias and standard deviation of the α0 and β estimates, and by the overall power-law fit error (FE). For parameter estimation regions larger than ~34 pulse lengths (~1 cm for this experiment), an overall power-law FE of 4% was achieved with all spectral estimation methods. With smaller parameter estimation regions as in parametric image formation, the bias and standard deviation of the α0 and β estimates depended on the size of the parameter estimation region. Here, the multitaper method reduced the standard deviation of the α0 and β estimates compared with those using the other techniques. The results provide guidance for choosing methods for estimating the power spectral density in quantitative ultrasound methods.
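
    A minimal sketch of the kind of computation described here, assuming simulated RF segments: the PSD is estimated with Welch's method (one of the three estimators compared) and a power law α(f) = α0·f^β is fitted to the resulting attenuation estimates. The reference-phantom calculation is reduced to a single-depth log-spectral ratio with the reference attenuation neglected, so the function is an illustrative stand-in rather than the authors' full method; all names and window settings are assumptions.

```python
# Sketch: Welch PSD of RF echo segments and a power-law fit alpha(f) = a0 * f**beta.
# Hypothetical inputs: rf_sample and rf_reference are 1-D RF segments from the same depth
# in the sample and reference phantoms, fs is the sampling rate (Hz), depth_cm the depth (cm).
import numpy as np
from scipy.signal import welch

def attenuation_power_law(rf_sample, rf_reference, fs, depth_cm, fband=(2e6, 8e6)):
    f, psd_s = welch(rf_sample, fs=fs, nperseg=256)
    _, psd_r = welch(rf_reference, fs=fs, nperseg=256)
    band = (f >= fband[0]) & (f <= fband[1])
    f_mhz = f[band] / 1e6
    # Simplified reference-phantom estimate: attenuation from the log-spectral ratio,
    # converted to dB/cm (factor 2 for the two-way propagation path).
    alpha_db_cm = -10.0 * np.log10(psd_s[band] / psd_r[band]) / (2.0 * depth_cm)
    alpha_db_cm = np.clip(alpha_db_cm, 1e-6, None)   # keep values positive for the log-log fit
    beta, log_a0 = np.polyfit(np.log(f_mhz), np.log(alpha_db_cm), 1)
    return np.exp(log_a0), beta                      # a0 in dB/cm at 1 MHz, exponent beta
```

    Swapping `welch` for a periodogram or a multitaper estimator (available in external packages) changes only the PSD line, which is exactly the comparison the study makes.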

  20. Quantifying morphological parameters of the terminal branching units in a mouse lung by phase contrast synchrotron radiation computed tomography.

    Directory of Open Access Journals (Sweden)

    Jeongeun Hwang

    Full Text Available An effective technique of phase contrast synchrotron radiation computed tomography was established for the quantitative analysis of the microstructures in the respiratory zone of a mouse lung. Heitzman's method was adopted for the whole-lung sample preparation, and Canny's edge detector was used for locating the air-tissue boundaries. This technique revealed detailed morphology of the respiratory zone components, including terminal bronchioles and alveolar sacs, with a sufficiently high resolution of 1.74 µm isotropic voxel size. The technique enabled visual inspection of the respiratory zone components and comprehension of their relative positions in three dimensions. To check the method's feasibility for quantitative imaging, morphological parameters such as diameter, surface area and volume were measured and analyzed for sixteen randomly selected terminal branching units, each consisting of a terminal bronchiole and a pair of succeeding alveolar sacs. Four types of asymmetry ratios, concerning alveolar sac mouth diameter, alveolar sac surface area, and alveolar sac volume, were measured. This is the first ever finding of the asymmetry ratio for the terminal bronchioles and alveolar sacs, and it is noteworthy that an appreciable degree of branching asymmetry was observed among the alveolar sacs at the terminal end of the airway tree, although the number of samples was still small. The series of efficient techniques developed and confirmed in this study, from sample preparation to quantification, is expected to contribute to a wider and more exact application of phase contrast synchrotron radiation computed tomography to a variety of studies.
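
    Where the abstract mentions Canny's edge detector for locating air-tissue boundaries, a brief sketch of that step using scikit-image may help; the file name, smoothing sigma and the Otsu-based air-fraction line are illustrative assumptions, not the authors' parameters.

```python
# Sketch: locating air-tissue boundaries in one reconstructed slice with a Canny edge detector.
import numpy as np
from skimage import io, filters, feature

slice_img = io.imread("lung_slice.tif", as_gray=True)   # hypothetical reconstructed CT slice
smoothed = filters.gaussian(slice_img, sigma=1.0)        # suppress noise before edge detection
edges = feature.canny(smoothed, sigma=2.0)               # boolean map of air-tissue boundaries
air_fraction = np.mean(slice_img < filters.threshold_otsu(slice_img))  # rough airspace fraction
print(f"edge pixels: {edges.sum()}, approximate air fraction: {air_fraction:.2f}")
```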

  1. Design and preliminary evaluation of the FINGER rehabilitation robot: controlling challenge and quantifying finger individuation during musical computer game play.

    Science.gov (United States)

    Taheri, Hossein; Rowe, Justin B; Gardner, David; Chan, Vicki; Gray, Kyle; Bower, Curtis; Reinkensmeyer, David J; Wolbrecht, Eric T

    2014-02-04

    This paper describes the design and preliminary testing of FINGER (Finger Individuating Grasp Exercise Robot), a device for assisting in finger rehabilitation after neurologic injury. We developed FINGER to assist stroke patients in moving their fingers individually in a naturalistic curling motion while playing a game similar to Guitar Hero. The goal was to make FINGER capable of assisting with motions where precise timing is important. FINGER consists of a pair of stacked single degree-of-freedom 8-bar mechanisms, one for the index and one for the middle finger. Each 8-bar mechanism was designed to control the angle and position of the proximal phalanx and the position of the middle phalanx. Target positions for the mechanism optimization were determined from trajectory data collected from 7 healthy subjects using color-based motion capture. The resulting robotic device was built to accommodate multiple finger sizes and finger-to-finger widths. For initial evaluation, we asked individuals with a stroke (n = 16) and without impairment (n = 4) to play a game similar to Guitar Hero while connected to FINGER. Precision design, low friction bearings, and separate high speed linear actuators allowed FINGER to individually actuate the fingers with a high bandwidth of control (-3 dB at approximately 8 Hz). During the tests, we were able to modulate the subject's success rate at the game by automatically adjusting the controller gains of FINGER. We also used FINGER to measure subjects' effort and finger individuation while playing the game. Test results demonstrate the ability of FINGER to motivate subjects with an engaging game environment that challenges individuated control of the fingers, automatically control assistance levels, and quantify finger individuation after stroke.

  2. Lessons Learned From the Development and Parameterization of a Computer Simulation Model to Evaluate Task Modification for Health Care Providers.

    Science.gov (United States)

    Kasaie, Parastu; David Kelton, W; Ancona, Rachel M; Ward, Michael J; Froehle, Craig M; Lyons, Michael S

    2018-02-01

    Computer simulation is a highly advantageous method for understanding and improving health care operations with a wide variety of possible applications. Most computer simulation studies in emergency medicine have sought to improve allocation of resources to meet demand or to assess the impact of hospital and other system policies on emergency department (ED) throughput. These models have enabled essential discoveries that can be used to improve the general structure and functioning of EDs. Theoretically, computer simulation could also be used to examine the impact of adding or modifying specific provider tasks. Doing so involves a number of unique considerations, particularly in the complex environment of acute care settings. In this paper, we describe conceptual advances and lessons learned during the design, parameterization, and validation of a computer simulation model constructed to evaluate changes in ED provider activity. We illustrate these concepts using examples from a study focused on the operational effects of HIV screening implementation in the ED. Presentation of our experience should emphasize the potential for application of computer simulation to study changes in health care provider activity and facilitate the progress of future investigators in this field. © 2017 by the Society for Academic Emergency Medicine.
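
    As a toy illustration of the general approach (not the authors' model), a discrete-event sketch with the SimPy library shows how an extra per-visit provider task, such as offering an HIV screen, can be added to a simple ED flow and its effect on waiting measured; every parameter here is invented.

```python
# Toy sketch: one ED provider serving Poisson arrivals, with an extra task added per visit.
import random
import simpy

ARRIVAL_MEAN, SERVICE_MEAN, EXTRA_TASK = 10.0, 8.0, 1.5   # minutes (all assumed)
waits = []

def patient(env, provider):
    arrive = env.now
    with provider.request() as req:
        yield req
        waits.append(env.now - arrive)                    # waiting time before being seen
        yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN) + EXTRA_TASK)

def arrivals(env, provider):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(patient(env, provider))

random.seed(1)
env = simpy.Environment()
provider = simpy.Resource(env, capacity=1)
env.process(arrivals(env, provider))
env.run(until=8 * 60)                                     # one 8-hour shift
print(f"patients seen: {len(waits)}, mean wait: {sum(waits) / len(waits):.1f} min")
```

    Comparing runs with `EXTRA_TASK` set to 0 and to a positive value is the simplest version of the task-modification question the paper studies at much higher fidelity.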

  3. Description of the tasks of control room operators in German nuclear power plants and support possibilities by advanced computer systems

    International Nuclear Information System (INIS)

    Buettner, W.E.

    1984-01-01

    In the course of the development of nuclear power plants, the instrumentation and control systems and the amount of information in the control room have increased substantially. Against this background, it is described which operator tasks might be supported by advanced computer aid systems, with the main emphasis on safety-related information and diagnosis facilities. Nevertheless, some of these systems under development may be helpful for normal operation modes too. As far as possible, recommendations for the realization and testing of such systems are made. (orig.) [de

  4. Does quantifying epicardial and intrathoracic fat with noncontrast computed tomography improve risk stratification beyond calcium scoring alone?

    Science.gov (United States)

    Forouzandeh, Farshad; Chang, Su Min; Muhyieddeen, Kamil; Zaid, Rashid R; Trevino, Alejandro R; Xu, Jiaqiong; Nabi, Faisal; Mahmarian, John J

    2013-01-01

    Noncontrast cardiac computed tomography allows calculation of coronary artery calcium score (CACS) and measurement of epicardial adipose tissue (EATv) and intrathoracic fat (ITFv) volumes. It is unclear whether fat volume information contributes to risk stratification. Cardiac computed tomography was performed in 760 consecutive patients with acute chest pain admitted through the emergency department. None had prior coronary artery disease. CACS was calculated using the Agatston method. EATv and ITFv were semiautomatically calculated. Median patient follow-up was 3.3 years. Mean patient age was 54.4±13.7 years and Framingham risk score 8.2±8.2. The 45 patients (5.9%) with major acute cardiac events (MACE) were older (64.8±13.9 versus 53.7±13.4 years), more frequently male (60% versus 40%), and had a higher median Framingham risk score (16 versus 4) and CACS (268 versus 0) versus those without events (all P values significant). They also had greater EATv (154 versus 116 mL) and ITFv (330 versus 223 mL), and a higher prevalence of EATv >125 mL (67% versus 44%) and ITFv >250 mL (64% versus 42%) (all P values significant). CACS, EATv, and ITFv were all independently associated with MACE, and CACS remained associated with MACE after adjustment for fat volumes; EATv and ITFv improved the risk model only in patients with CACS >400. CACS and fat volumes are independently associated with MACE in acute chest pain patients, beyond that provided by clinical information alone. Although fat volumes may add prognostic value in patients with CACS >400, CACS is most strongly correlated with outcome.

  5. A Medical Research and Evaluation Facility (MREF) and Studies Supporting the Medical Chemical Defense Program: Task 95-39: Methods Development and Validation of Two Mouse Bioassays for Use in Quantifying Botulinum Toxins (A, B, C, D and E) and Toxin Antibody Titers

    National Research Council Canada - National Science Library

    Olson, Carl

    1997-01-01

    This task was conducted for the U.S. Army Medical Materiel Development Activity (USAMMDA) to validate two mouse bioassays for quantifying botulinum toxin potency and neutralizing antibodies to botulinum toxins...

  6. Quantifying error of lidar and sodar Doppler beam swinging measurements of wind turbine wakes using computational fluid dynamics

    Science.gov (United States)

    Lundquist, J. K.; Churchfield, M. J.; Lee, S.; Clifton, A.

    2015-02-01

    Wind-profiling lidars are now regularly used in boundary-layer meteorology and in applications such as wind energy and air quality. Lidar wind profilers exploit the Doppler shift of laser light backscattered from particulates carried by the wind to measure a line-of-sight (LOS) velocity. The Doppler beam swinging (DBS) technique, used by many commercial systems, considers measurements of this LOS velocity in multiple radial directions in order to estimate horizontal and vertical winds. The method relies on the assumption of homogeneous flow across the region sampled by the beams. Using such a system in inhomogeneous flow, such as wind turbine wakes or complex terrain, will result in errors. To quantify the errors expected from such violation of the assumption of horizontal homogeneity, we simulate inhomogeneous flow in the atmospheric boundary layer, notably stably stratified flow past a wind turbine, with a mean wind speed of 6.5 m s-1 at the turbine hub-height of 80 m. This slightly stable case results in 15° of wind direction change across the turbine rotor disk. The resulting flow field is sampled in the same fashion that a lidar samples the atmosphere with the DBS approach, including the lidar range weighting function, enabling quantification of the error in the DBS observations. The observations from the instruments located upwind have small errors, which are ameliorated with time averaging. However, the downwind observations, particularly within the first two rotor diameters downwind from the wind turbine, suffer from errors due to the heterogeneity of the wind turbine wake. Errors in the stream-wise component of the flow approach 30% of the hub-height inflow wind speed close to the rotor disk. Errors in the cross-stream and vertical velocity components are also significant: cross-stream component errors are on the order of 15% of the hub-height inflow wind speed (1.0 m s-1) and errors in the vertical velocity measurement exceed the actual vertical velocity
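
    The error mechanism discussed here follows directly from the standard DBS retrieval, which assumes the same (u, v, w) across all beams. A minimal sketch of that retrieval for a four-beam scan is given below; the beam geometry and variable names are generic assumptions rather than a specific instrument's processing chain.

```python
# Sketch: Doppler beam swinging (DBS) wind retrieval from four line-of-sight velocities,
# assuming horizontally homogeneous flow across the scanning cone (the assumption that
# breaks down inside a wind turbine wake).
import numpy as np

def dbs_winds(vr_north, vr_east, vr_south, vr_west, elevation_deg=62.0):
    """Return (u, v, w) from LOS velocities (positive away from the instrument)."""
    phi = np.deg2rad(elevation_deg)                  # beam elevation above the horizon
    u = (vr_east - vr_west) / (2.0 * np.cos(phi))    # west-to-east component
    v = (vr_north - vr_south) / (2.0 * np.cos(phi))  # south-to-north component
    w = (vr_north + vr_south + vr_east + vr_west) / (4.0 * np.sin(phi))
    return u, v, w
```

    Feeding this retrieval with LOS velocities sampled from a simulated wake, as the study does, quantifies how far the homogeneity assumption is violated.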

  7. Motivation and performance within a collaborative computer-based modeling task: Relations between students' achievement goal orientation, self-efficacy, cognitive processing and achievement

    OpenAIRE

    Sins, P.H.M.; van Joolingen, W.R.; Savelsbergh, E.R.; van Hout-Wolters, B.H.A.M.

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of mastery-approach goal orientation, performance-avoidance goal orientation, self-efficacy, and achievement were employed. Students’ cognitive processing was a...

  8. Multimodal Learning Analytics and Education Data Mining: Using Computational Technologies to Measure Complex Learning Tasks

    Science.gov (United States)

    Blikstein, Paulo; Worsley, Marcelo

    2016-01-01

    New high-frequency multimodal data collection technologies and machine learning analysis techniques could offer new insights into learning, especially when students have the opportunity to generate unique, personalized artifacts, such as computer programs, robots, and solutions to engineering challenges. To date most of the work on learning analytics…

  9. Computer codes for tasks in the fields of isotope and radiation research

    International Nuclear Information System (INIS)

    Friedrich, K.; Gebhardt, O.

    1978-11-01

    Concise descriptions of computer codes developed for solving problems in the fields of isotope and radiation research at the Zentralinstitut fuer Isotopen- und Strahlenforschung (ZfI) are compiled. In part two the structure of the ZfI program library MABIF is outlined and a complete list of all codes available is given.

  10. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    OpenAIRE

    Sergey Nikolaevich Kyazhin; Andrey Vladimirovich Moiseev

    2013-01-01

    The current state of cloud computing (CC) information security is analysed, and logical problems of storage and data transmission security in CC are identified. Cryptographic methods of data security in CC, in particular, lightweight cryptography and the cryptography based on bilinear pairings, are described.

  11. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    Directory of Open Access Journals (Sweden)

    Sergey Nikolaevich Kyazhin

    2013-09-01

    Full Text Available The current state of cloud computing (CC) information security is analysed, and logical problems of storage and data transmission security in CC are identified. Cryptographic methods of data security in CC, in particular, lightweight cryptography and the cryptography based on bilinear pairings, are described.

  12. Individual versus Interactive Task-Based Performance through Voice-Based Computer-Mediated Communication

    Science.gov (United States)

    Granena, Gisela

    2016-01-01

    Interaction is a necessary condition for second language (L2) learning (Long, 1980, 1996). Research in computer-mediated communication has shown that interaction opportunities make learners pay attention to form in a variety of ways that promote L2 learning. This research has mostly investigated text-based rather than voice-based interaction. The…

  13. A computational model of focused attention meditation and its transfer to a sustained attention task

    NARCIS (Netherlands)

    Moye, Amir Sep; van Vugt, Marieke; van Vugt, Marieke K; Banks, Adrian P; Kennedy, William G

    2017-01-01

    Although meditation and mindfulness practices are widely discussed and studied more and more in the scientific literature, there is little theory about the cognitive mechanisms that comprise it. Here we begin to develop such a theory by creating a computational cognitive model of a particular type

  14. A computation ANN model for quantifying the global solar radiation: A case study of Al-Aqabah-Jordan

    International Nuclear Information System (INIS)

    Abolgasem, I M; Alghoul, M A; Ruslan, M H; Chan, H Y; Khrit, N G; Sopian, K

    2015-01-01

    In this paper, a computation model is developed to predict the global solar radiation (GSR) in Aqaba city based on recorded data using Artificial Neural Networks (ANNs). The data used in this work are global solar radiation (GSR), sunshine duration, maximum and minimum air temperature, and relative humidity. These data are available from a Jordanian meteorological station over a period of two years. The quality of GSR forecasting is compared using different learning algorithms. The decision to change the ANN architecture is essentially based on the predicted results, in order to obtain the best ANN model for monthly and seasonal GSR. Different configuration patterns were tested using the available observed data. It was found that the model using mainly sunshine duration and air temperature as inputs gives accurate results. The ANN model efficiency and the mean square error values show that the prediction model is accurate. It is found that the effect of the three learning algorithms on the accuracy of the prediction model at the training and testing stages for each time scale is mostly within the same accuracy range. (paper)
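
    A minimal sketch of the kind of ANN regression described, using sunshine duration and air temperature as inputs as the abstract suggests; the column names, data file and network size are assumptions for illustration only.

```python
# Sketch: small feed-forward ANN predicting monthly global solar radiation (GSR).
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

df = pd.read_csv("aqaba_weather.csv")                 # hypothetical station records
X = df[["sunshine_hours", "t_max", "t_min"]].values   # assumed input columns
y = df["gsr"].values                                  # assumed target column

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
print("test MSE:", mean_squared_error(y_te, model.predict(X_te)))
```

    Repeating the fit with different `solver` settings ('adam', 'lbfgs', 'sgd') mirrors the comparison of learning algorithms reported in the abstract.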

  15. Increased epicardial fat volume quantified by 64-multidetector computed tomography is associated with coronary atherosclerosis and totally occlusive lesions

    International Nuclear Information System (INIS)

    Ueno, Koji; Anzai, Toshihisa; Jinzaki, Masahiro

    2009-01-01

    The relationship between the epicardial fat volume measured by 64-slice multidetector computed tomography (MDCT) and the extension and severity of coronary atherosclerosis was investigated. Both MDCT and conventional coronary angiography (CAG) were performed in 71 consecutive patients who presented with effort angina. The volume of epicardial adipose tissue (EAT) was measured by MDCT. The severity of coronary atherosclerosis was assessed by evaluating the extension of coronary plaques in 790 segments using MDCT data, and the percentage diameter stenosis in 995 segments using CAG data. The estimated volume of EAT indexed by body surface area was defined as VEAT. Increased VEAT was associated with advanced age, male sex, degree of metabolic alterations, a history of acute coronary syndrome (ACS) and the presence of total occlusions, and showed a positive correlation with the stenosis score (r=0.28, P=0.02) and the atheromatosis score (r=0.67). Increased VEAT was found to be the strongest independent determinant of the presence of total occlusions (odds ratio 4.64, P=0.02). VEAT correlates with the degree of metabolic alterations and coronary atheromatosis. Excessive accumulation of EAT might contribute to the development of ACS and coronary total occlusions. (author)

  16. Computational modelling and analysis of hippocampal-prefrontal information coding during a spatial decision-making task

    Directory of Open Access Journals (Sweden)

    Thomas eJahans-Price

    2014-03-01

    Full Text Available We introduce a computational model describing rat behaviour and the interactions of neural populations processing spatial and mnemonic information during a maze-based, decision-making task. The model integrates sensory input and implements a working memory to inform decisions at a choice point, reproducing rat behavioural data and predicting the occurrence of turn- and memory-dependent activity in neuronal networks supporting task performance. We tested these model predictions using a new software toolbox (Maze Query Language, MQL) to analyse activity of medial prefrontal cortical (mPFC) and dorsal hippocampal (dCA1) neurons recorded from 6 adult rats during task performance. The firing rates of dCA1 neurons discriminated context (i.e. the direction of the previous turn), whilst a subset of mPFC neurons was selective for current turn direction or context, with some conjunctively encoding both. mPFC turn-selective neurons displayed a ramping of activity on approach to the decision turn and turn-selectivity in mPFC was significantly reduced during error trials. These analyses complement data from neurophysiological recordings in non-human primates indicating that firing rates of cortical neurons correlate with integration of sensory evidence used to inform decision-making.

  17. Classification effects of real and imaginary movement selective attention tasks on a P300-based brain-computer interface

    Science.gov (United States)

    Salvaris, Mathew; Sepulveda, Francisco

    2010-10-01

    Brain-computer interfaces (BCIs) rely on various electroencephalography methodologies that allow the user to convey their desired control to the machine. Common approaches include the use of event-related potentials (ERPs) such as the P300 and modulation of the beta and mu rhythms. All of these methods have their benefits and drawbacks. In this paper, three different selective attention tasks were tested in conjunction with a P300-based protocol (i.e. the standard counting of target stimuli as well as the conduction of real and imaginary movements in sync with the target stimuli). The three tasks were performed by a total of 10 participants, with the majority (7 out of 10) of the participants having never before participated in imaginary movement BCI experiments. Channels and methods used were optimized for the P300 ERP and no sensory-motor rhythms were explicitly used. The classifier used was a simple Fisher's linear discriminant. Results were encouraging, showing that on average the imaginary movement achieved a P300 versus No-P300 classification accuracy of 84.53%. In comparison, mental counting, the standard selective attention task used in previous studies, achieved 78.9% and real movement 90.3%. Furthermore, multiple trial classification results were recorded and compared, with real movement reaching 99.5% accuracy after four trials (12.8 s), imaginary movement reaching 99.5% accuracy after five trials (16 s) and counting reaching 98.2% accuracy after ten trials (32 s).
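
    A minimal sketch of the classification step named in the abstract, with scikit-learn's LinearDiscriminantAnalysis standing in for Fisher's linear discriminant; the random arrays are placeholders for real epoch features (channels × time samples flattened per trial).

```python
# Sketch: single-trial P300 vs. no-P300 classification with a linear discriminant.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 120))        # placeholder epoch features (real data would go here)
y = rng.integers(0, 2, size=200)       # 1 = target stimulus (P300 expected), 0 = non-target

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated single-trial accuracy: {acc:.2f}")
```

    Averaging classifier scores over several repetitions of the same target is the usual way multiple-trial accuracies such as those reported above are obtained.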

  18. Computer Breakdown as a Stress Factor during Task Completion under Time Pressure: Identifying Gender Differences Based on Skin Conductance

    Directory of Open Access Journals (Sweden)

    René Riedl

    2013-01-01

    Full Text Available In today’s society, as computers, the Internet, and mobile phones pervade almost every corner of life, the impact of Information and Communication Technologies (ICT) on humans is dramatic. The use of ICT, however, may also have a negative side. Human interaction with technology may lead to notable stress perceptions, a phenomenon referred to as technostress. An investigation of the literature reveals that computer users’ gender has largely been ignored in technostress research, treating users as “gender-neutral.” To close this significant research gap, we conducted a laboratory experiment in which we investigated users’ physiological reaction to the malfunctioning of technology. Based on theories which explain that men, in contrast to women, are more sensitive to “achievement stress,” we predicted that male users would exhibit higher levels of stress than women in cases of system breakdown during the execution of a human-computer interaction task under time pressure, if compared to a breakdown situation without time pressure. Using skin conductance as a stress indicator, the hypothesis was confirmed. Thus, this study shows that user gender is crucial to better understanding the influence of stress factors such as computer malfunctions on physiological stress reactions.

  19. Task-based image quality evaluation of iterative reconstruction methods for low dose CT using computer simulations

    Science.gov (United States)

    Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.

    2015-04-01

    Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improve image quality or reduce radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 to 25% D0. A fixed size and contrast lesion was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20) in a signal known exactly (SKE), background known exactly but variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal performance criterion to derive the potential dose reduction factor of IR. In general, there was a good agreement in the relative AUC values of different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
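
    For readers unfamiliar with the CHO, a compact sketch of its decision variable and AUC computation is given below; the channel matrix and image stacks are placeholders, and the channel design (rotationally symmetric or oriented, as in the study) is left to the caller.

```python
# Sketch: channelized Hotelling observer (CHO) AUC for a signal-known-exactly detection task.
import numpy as np
from sklearn.metrics import roc_auc_score

def cho_auc(imgs_signal, imgs_noise, U):
    """imgs_*: (n_images, n_pixels) arrays of ROIs; U: (n_pixels, n_channels) channel matrix."""
    v_s = imgs_signal @ U                     # channel outputs, signal-present class
    v_n = imgs_noise @ U                      # channel outputs, signal-absent class
    S = 0.5 * (np.cov(v_s, rowvar=False) + np.cov(v_n, rowvar=False))
    w = np.linalg.solve(S, v_s.mean(axis=0) - v_n.mean(axis=0))   # Hotelling template
    t = np.concatenate([v_s @ w, v_n @ w])    # test statistics for every image
    labels = np.concatenate([np.ones(len(v_s)), np.zeros(len(v_n))])
    return roc_auc_score(labels, t)
```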

  20. Neutron radiography and X-ray computed tomography for quantifying weathering and water uptake processes inside porous limestone used as building material

    International Nuclear Information System (INIS)

    Dewanckele, J.; De Kock, T.; Fronteau, G.; Derluyn, H.; Vontobel, P.; Dierick, M.; Van Hoorebeke, L.; Jacobs, P.; Cnudde, V.

    2014-01-01

    Euville and Savonnières limestones were weathered by an acid test, which resulted in the formation of a gypsum crust. In order to characterize the crystallization pattern and the evolution of the pore structure below the crust, a combination of high resolution X-ray computed tomography and SEM–EDS was used. A time lapse sequence of the changing pore structure in both stones was obtained and afterwards quantified using image analysis. The difference in weathering of the two stones under the same process could be explained by the underlying microstructure and texture. Because water and moisture play a crucial role in the weathering processes, water uptake in weathered and non-weathered samples was characterized based on neutron radiography. In this way the water uptake was both visualized and quantified as a function of the height of the sample and of time. In general, the formation of a gypsum crust on limestone slows down the initial water uptake in the materials. - Highlights: • Time lapse sequence in 3D of changing pore structures inside limestone • A combination of X-ray CT, SEM and neutron radiography was used. • Quantification of water content as a function of time, height and weathering • Characterization of weathering processes due to gypsum crystallization

  1. Monkeys Wait to Begin a Computer Task when Waiting Makes Their Responses More Effective

    Directory of Open Access Journals (Sweden)

    Theodore A. Evans

    2014-02-01

    Full Text Available Rhesus monkeys (Macaca mulatta) and capuchin monkeys (Cebus apella) performed a computerized inhibitory control task modeled after an “escalating interest task” from a recent human study (Young, Webb, & Jacobs, 2011). In the original study, which utilized a first-person shooter game, human participants learned to inhibit firing their simulated weapon long enough for the weapon's damage potential to grow in effectiveness (up to 10 seconds in duration). In the present study, monkeys earned food pellets for eliminating arrays of target objects using a digital eraser. We assessed whether monkeys could suppress trial-initiating joystick movements long enough for the eraser to grow in size and speed, thereby making their eventual responses more effective. Monkeys of both species learned to inhibit moving the eraser for as long as 10 seconds, and they allowed the eraser to grow larger for successively larger target arrays. This study demonstrates an interesting parallel in behavioral inhibition between human and nonhuman participants and provides a method for future comparative testing of human and nonhuman test groups.

  2. Understanding and Mastering Dynamics in Computing Grids Processing Moldable Tasks with User-Level Overlay

    CERN Document Server

    Moscicki, Jakub Tomasz

    Scientific communities are using a growing number of distributed systems, from local batch systems, community-specific services and supercomputers to general-purpose, global grid infrastructures. Increasing the research capabilities for science is the raison d'être of such infrastructures which provide access to diversified computational, storage and data resources at large scales. Grids are rather chaotic, highly heterogeneous, decentralized systems where unpredictable workloads, component failures and variability of execution environments are commonplace. Understanding and mastering the heterogeneity and dynamics of such distributed systems is prohibitive for end users if they are not supported by appropriate methods and tools. The time cost to learn and use the interfaces and idiosyncrasies of different distributed environments is another challenge. Obtaining more reliable application execution times and boosting parallel speedup are important to increase the research capabilities of scientific communities. L...

  3. The effect of reinforcer magnitude on probability and delay discounting of experienced outcomes in a computer game task in humans.

    Science.gov (United States)

    Greenhow, Anna K; Hunt, Maree J; Macaskill, Anne C; Harper, David N

    2015-09-01

    Delay and uncertainty of receipt both reduce the subjective value of reinforcers. Delay has a greater impact on the subjective value of smaller reinforcers than of larger ones while the reverse is true for uncertainty. We investigated the effect of reinforcer magnitude on discounting of delayed and uncertain reinforcers using a novel approach: embedding relevant choices within a computer game. Participants made repeated choices between smaller, certain, immediate outcomes and larger, but delayed or uncertain outcomes while experiencing the result of each choice. Participants' choices were generally well described by the hyperbolic discounting function. Smaller numbers of points were discounted more steeply than larger numbers as a function of delay but not probability. The novel experiential choice task described is a promising approach to investigating both delay and probability discounting in humans. © Society for the Experimental Analysis of Behavior.
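
    The hyperbolic discounting function referred to here is V = A / (1 + kD); a brief sketch of fitting it to indifference points follows, with invented numbers standing in for the experiential-choice data.

```python
# Sketch: fitting V = A / (1 + k*D) to illustrative indifference points (A = 10 points).
import numpy as np
from scipy.optimize import curve_fit

delays = np.array([0, 5, 10, 20, 40, 80])                      # delay D (s), invented
subjective_value = np.array([10, 7.8, 6.4, 4.9, 3.4, 2.1])     # indifference points, invented

def hyperbolic(D, k, A=10.0):
    return A / (1.0 + k * D)

(k_hat,), _ = curve_fit(hyperbolic, delays, subjective_value, p0=[0.05])
print(f"estimated discounting rate k = {k_hat:.3f}")
```

    The same code with odds-against-receipt in place of D fits the probability-discounting version, which is how the two forms of discounting are usually put on a common footing.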

  4. The effect of physical and psychosocial loads on the trapezius muscle activity during computer keying tasks and rest periods

    DEFF Research Database (Denmark)

    Blangsted, Anne Katrine; Søgaard, Karen; Christensen, Hanne

    2004-01-01

    The overall aim was to investigate the effect of psychosocial loads on trapezius muscle activity during computer keying work and during short and long breaks. In 12 female subjects, surface electromyography (EMG) was recorded bilaterally from the upper trapezius muscle during a standardized one-hand keying task, interspaced with short (30 s) and long (4 min) breaks, in sessions with and without a combination of cognitive and emotional stressors. Adding psychosocial loads to the same physical work did not increase the activity of the trapezius muscle on either the keying or the control side ... resting level. During both short and long breaks, exposure to psychosocial loads also did not increase the activity of the trapezius muscle either on the side of the keying or the control hand. Of note is that during long breaks the muscle activity of the keying side as well as that of the control side

  5. Simulation of a Real-Time Brain Computer Interface for Detecting a Self-Paced Hitting Task.

    Science.gov (United States)

    Hammad, Sofyan H; Kamavuako, Ernest N; Farina, Dario; Jensen, Winnie

    2016-12-01

    An invasive brain-computer interface (BCI) is a promising neurorehabilitation device for severely disabled patients. Although some systems have been shown to work well in restricted laboratory settings, their utility must be tested in less controlled, real-time environments. Our objective was to investigate whether a specific motor task could be reliably detected from multiunit intracortical signals from freely moving animals in a simulated, real-time setting. Intracortical signals were first obtained from electrodes placed in the primary motor cortex of four rats that were trained to hit a retractable paddle (defined as a "Hit"). In the simulated real-time setting, the signal-to-noise-ratio was first increased by wavelet denoising. Action potentials were detected, and features were extracted (spike count, mean absolute values, entropy, and combination of these features) within pre-defined time windows (200 ms, 300 ms, and 400 ms) to classify the occurrence of a "Hit." We found higher detection accuracy of a "Hit" (73.1%, 73.4%, and 67.9% for the three window sizes, respectively) when the decision was made based on a combination of features rather than on a single feature. However, the duration of the window length was not statistically significant (p = 0.5). Our results showed the feasibility of detecting a motor task in real time in a less restricted environment compared to environments commonly applied within invasive BCI research, and they showed the feasibility of using information extracted from multiunit recordings, thereby avoiding the time-consuming and complex task of extracting and sorting single units. © 2016 International Neuromodulation Society.
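
    A compact sketch of the windowed multiunit features named in the abstract (spike count, mean absolute value, entropy); the window length, threshold rule and bin count are assumptions, not the authors' settings.

```python
# Sketch: threshold-based spike detection and per-window features for a 1-D neural signal x.
import numpy as np

def window_features(x, fs, win_s=0.2, thresh_sd=4.0):
    win = int(win_s * fs)
    feats = []
    for start in range(0, len(x) - win + 1, win):
        seg = x[start:start + win]
        thr = thresh_sd * np.median(np.abs(seg)) / 0.6745        # robust noise estimate
        spikes = np.flatnonzero((seg[1:] < -thr) & (seg[:-1] >= -thr))  # negative threshold crossings
        hist, _ = np.histogram(seg, bins=32, density=True)
        p = hist[hist > 0]
        p = p / p.sum()
        entropy = -np.sum(p * np.log2(p))                        # amplitude-distribution entropy
        feats.append([len(spikes), np.mean(np.abs(seg)), entropy])
    return np.asarray(feats)
```

    Feeding these per-window feature vectors to any standard classifier reproduces, in outline, the "Hit" versus no-"Hit" decision step described above.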

  6. Evaluation of subjective image quality in relation to diagnostic task for cone beam computed tomography with different fields of view.

    Science.gov (United States)

    Lofthag-Hansen, Sara; Thilander-Klang, Anne; Gröndahl, Kerstin

    2011-11-01

    To evaluate subjective image quality for two diagnostic tasks, periapical diagnosis and implant planning, for cone beam computed tomography (CBCT) using different exposure parameters and fields of view (FOVs). Examinations were performed in the posterior part of the jaws on a skull phantom with 3D Accuitomo (FOV 3 cm×4 cm) and 3D Accuitomo FPD (FOVs 4 cm×4 cm and 6 cm×6 cm). All combinations of 60, 65, 70, 75, 80 kV and 2, 4, 6, 8, 10 mA with a rotation of 180° and 360° were used. The dose-area product (DAP) value was determined for each combination. The images were presented, displaying the object in axial, cross-sectional and sagittal views, without scanning data in a random order for each FOV and jaw. Seven observers assessed image quality on a six-point rating scale. Intra-observer agreement was good (κw=0.76) and inter-observer agreement moderate (κw=0.52). Stepwise logistic regression showed kV, mA and diagnostic task to be the most important variables. Periapical diagnosis, regardless of jaw, required higher exposure parameters compared to implant planning. Implant planning in the lower jaw required higher exposure parameters compared to the upper jaw. Overall ranking of FOVs gave 4 cm×4 cm, 6 cm×6 cm followed by 3 cm×4 cm. This study has shown that exposure parameters should be adjusted according to diagnostic task. For this particular CBCT brand a rotation of 180° gave good subjective image quality, hence a substantial dose reduction can be achieved without loss of diagnostic information. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Evaluation of subjective image quality in relation to diagnostic task for cone beam computed tomography with different fields of view

    International Nuclear Information System (INIS)

    Lofthag-Hansen, Sara; Thilander-Klang, Anne; Groendahl, Kerstin

    2011-01-01

    Aims: To evaluate subjective image quality for two diagnostic tasks, periapical diagnosis and implant planning, for cone beam computed tomography (CBCT) using different exposure parameters and fields of view (FOVs). Materials and methods: Examinations were performed in the posterior part of the jaws on a skull phantom with 3D Accuitomo (FOV 3 cm × 4 cm) and 3D Accuitomo FPD (FOVs 4 cm × 4 cm and 6 cm × 6 cm). All combinations of 60, 65, 70, 75, 80 kV and 2, 4, 6, 8, 10 mA with a rotation of 180° and 360° were used. The dose-area product (DAP) value was determined for each combination. The images were presented, displaying the object in axial, cross-sectional and sagittal views, without scanning data in a random order for each FOV and jaw. Seven observers assessed image quality on a six-point rating scale. Results: Intra-observer agreement was good (κw = 0.76) and inter-observer agreement moderate (κw = 0.52). Stepwise logistic regression showed kV, mA and diagnostic task to be the most important variables. Periapical diagnosis, regardless of jaw, required higher exposure parameters compared to implant planning. Implant planning in the lower jaw required higher exposure parameters compared to the upper jaw. Overall ranking of FOVs gave 4 cm × 4 cm, 6 cm × 6 cm followed by 3 cm × 4 cm. Conclusions: This study has shown that exposure parameters should be adjusted according to diagnostic task. For this particular CBCT brand a rotation of 180° gave good subjective image quality, hence a substantial dose reduction can be achieved without loss of diagnostic information.

  8. Quantifying Transmission.

    Science.gov (United States)

    Woolhouse, Mark

    2017-07-01

    Transmissibility is the defining characteristic of infectious diseases. Quantifying transmission matters for understanding infectious disease epidemiology and designing evidence-based disease control programs. Tracing individual transmission events can be achieved by epidemiological investigation coupled with pathogen typing or genome sequencing. Individual infectiousness can be estimated by measuring pathogen loads, but few studies have directly estimated the ability of infected hosts to transmit to uninfected hosts. Individuals' opportunities to transmit infection are dependent on behavioral and other risk factors relevant given the transmission route of the pathogen concerned. Transmission at the population level can be quantified through knowledge of risk factors in the population or phylogeographic analysis of pathogen sequence data. Mathematical model-based approaches require estimation of the per capita transmission rate and basic reproduction number, obtained by fitting models to case data and/or analysis of pathogen sequence data. Heterogeneities in infectiousness, contact behavior, and susceptibility can have substantial effects on the epidemiology of an infectious disease, so estimates of only mean values may be insufficient. For some pathogens, super-shedders (infected individuals who are highly infectious) and super-spreaders (individuals with more opportunities to transmit infection) may be important. Future work on quantifying transmission should involve integrated analyses of multiple data sources.
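
    One concrete example of the model-based estimation mentioned here: fitting an exponential growth rate r to early case counts and converting it to a basic reproduction number with a mean generation interval Tg, using the SIR-type approximation R0 ≈ 1 + r·Tg. The case counts and Tg below are invented for illustration.

```python
# Sketch: R0 from the early exponential growth rate of (illustrative) daily case counts.
import numpy as np

cases = np.array([3, 4, 7, 11, 16, 25, 38, 60])   # invented early incidence
days = np.arange(len(cases))

r, _ = np.polyfit(days, np.log(cases), 1)          # growth rate per day (log-linear fit)
Tg = 5.0                                           # assumed mean generation interval (days)
R0 = 1.0 + r * Tg                                  # SIR-type approximation
print(f"growth rate r = {r:.3f}/day, R0 estimate = {R0:.2f}")
```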

  9. A binary motor imagery tasks based brain-computer interface for two-dimensional movement control

    Science.gov (United States)

    Xia, Bin; Cao, Lei; Maysam, Oladazimi; Li, Jie; Xie, Hong; Su, Caixia; Birbaumer, Niels

    2017-12-01

    Objective. Two-dimensional movement control is a popular issue in brain-computer interface (BCI) research and has many applications in the real world. In this paper, we introduce a combined control strategy to a binary class-based BCI system that allows the user to move a cursor in a two-dimensional (2D) plane. Users focus on a single moving vector to control 2D movement instead of controlling vertical and horizontal movement separately. Approach. Five participants took part in a fixed-target experiment and random-target experiment to verify the effectiveness of the combination control strategy under the fixed and random routine conditions. Both experiments were performed in a virtual 2D dimensional environment and visual feedback was provided on the screen. Main results. The five participants achieved an average hit rate of 98.9% and 99.4% for the fixed-target experiment and the random-target experiment, respectively. Significance. The results demonstrate that participants could move the cursor in the 2D plane effectively. The proposed control strategy is based only on a basic two-motor imagery BCI, which enables more people to use it in real-life applications.

  10. Preliminary results of BRAVO project: brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks.

    Science.gov (United States)

    Bergamasco, Massimo; Frisoli, Antonio; Fontana, Marco; Loconsole, Claudio; Leonardis, Daniele; Troncossi, Marco; Foumashi, Mohammad Mozaffari; Parenti-Castelli, Vincenzo

    2011-01-01

    This paper presents the preliminary results of the project BRAVO (Brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks). The objective of this project is to define a new approach to the development of assistive and rehabilitative robots for motor impaired users to perform complex visuomotor tasks that require a sequence of reaches, grasps and manipulations of objects. BRAVO aims at developing new robotic interfaces and HW/SW architectures for rehabilitation and regain/restoration of motor function in patients with upper limb sensorimotor impairment through extensive rehabilitation therapy and active assistance in the execution of Activities of Daily Living. The final system developed within this project will include a robotic arm exoskeleton and a hand orthosis that will be integrated together for providing force assistance. The main novelty that BRAVO introduces is the control of the robotic assistive device through the active prediction of intention/action. The system will actually integrate the information about the movement carried out by the user with a prediction of the performed action through an interpretation of current gaze of the user (measured through eye-tracking), brain activation (measured through BCI) and force sensor measurements. © 2011 IEEE

  11. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    Science.gov (United States)

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that this process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experiences. However, goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects some kinds of conflict, including a hypothetical cost-conflict one. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
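
    As a generic illustration of the dual-controller idea discussed here (not the paper's specific model), the sketch below pairs a model-free temporal-difference update with a model-based one-step lookahead and a simple weighting between them.

```python
# Sketch: habitual (model-free) and goal-directed (model-based) value estimates for a
# discrete task with S states and A actions; q_mf is an (S, A) array, T an (S, A, S)
# transition model and R an (S, A) reward model.
import numpy as np

def mf_update(q_mf, s, a, r, s_next, alpha=0.1, gamma=0.95):
    """Model-free Q-learning (habitual controller)."""
    td_error = r + gamma * np.max(q_mf[s_next]) - q_mf[s, a]
    q_mf[s, a] += alpha * td_error
    return td_error

def mb_value(T, R, q_mf, s, a, gamma=0.95):
    """Model-based one-step lookahead (goal-directed controller)."""
    return R[s, a] + gamma * np.sum(T[s, a] * np.max(q_mf, axis=1))

def combined_value(q_mf_sa, q_mb_sa, w=0.5):
    """Weighting of the two controllers; in the paper's spirit, w could be set by a
    conflict-monitoring signal rather than held fixed."""
    return w * q_mb_sa + (1.0 - w) * q_mf_sa
```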

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  14. A language for data-parallel and task parallel programming dedicated to multi-SIMD computers. Contributions to hydrodynamic simulation with lattice gases

    International Nuclear Information System (INIS)

    Pic, Marc Michel

    1995-01-01

    Parallel programming covers task-parallelism and data-parallelism. Many problems need both kinds of parallelism. Multi-SIMD computers allow a hierarchical approach to these parallelisms. The T++ language, based on C++, is dedicated to exploiting Multi-SIMD computers using a programming paradigm which is an extension of array programming to task management. Our language introduces arrays of independent tasks that are carried out separately (MIMD) on subsets of processors with identical behaviour (SIMD), in order to express the hierarchical inclusion of data-parallelism within task-parallelism. To manipulate tasks and data in a symmetrical way, we propose meta-operations which have the same behaviour on task arrays and on data arrays. We explain how to implement this language on our parallel computer SYMPHONIE in order to profit from the locally shared memory, the hardware virtualization, and the multiplicity of communication networks. We simultaneously analyse a typical application of such an architecture. Finite element schemes for fluid mechanics need powerful parallel computers and require large floating-point capabilities. Lattice gases are an alternative to such simulations. Boolean lattice gases are simple, stable and modular, and need no floating-point computation, but they include numerical noise. Boltzmann lattice gases offer high computational precision, but need floating-point arithmetic and are only locally stable. We propose a new scheme, called multi-bit, which keeps the advantages of each Boolean model to which it is applied, with high numerical precision and reduced noise. Experiments on viscosity, physical behaviour, noise reduction and spurious invariants are presented, and implementation techniques for parallel Multi-SIMD computers are detailed. (author) [fr

  15. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability. That means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  16. New Computational Model Based on Finite Element Method to Quantify Damage Evolution Due to External Sulfate Attack on Self-Compacting Concretes

    KAUST Repository

    Khelifa, Mohammed Rissel; Guessasma, Sofiane

    2012-01-01

    Abstract: This work combines experimental and numerical investigations to study the mechanical degradation of self-compacting concrete under accelerated aging conditions. Four different experimental treatments are tested; among them, constant immersion and immersion-drying protocols allow an efficient external sulfate attack of the material. Significant damage is observed due to interfacial ettringite. A predictive analysis is then adopted to quantify the relationship between ettringite growth and mechanical damage evolution during aging. Typical 3D microstructures representing the cement paste-aggregate structures are generated using a Monte Carlo scheme. These images are converted into a finite element model to predict the mechanical performance under different criteria of damage kinetics. The effect of ettringite is then associated to the development of an interphase of lower mechanical properties. Our results show that the observed time evolution of Young's modulus is best described by a linear increase of the interphase content. Our model results indicate also that the interphase regions grow at maximum stress regions rather than exclusively at interfaces. Finally, constant immersion predicts a rate of damage growth five times lower than that of the immersion-drying protocol. © 2012 Computer-Aided Civil and Infrastructure Engineering.

  17. New Computational Model Based on Finite Element Method to Quantify Damage Evolution Due to External Sulfate Attack on Self-Compacting Concretes

    KAUST Repository

    Khelifa, Mohammed Rissel

    2012-12-27

    Abstract: This work combines experimental and numerical investigations to study the mechanical degradation of self-compacting concrete under accelerated aging conditions. Four different experimental treatments are tested; among them, the constant immersion and immersion-drying protocols allow an efficient external sulfate attack of the material. Significant damage is observed due to interfacial ettringite. A predictive analysis is then adopted to quantify the relationship between ettringite growth and mechanical damage evolution during aging. Typical 3D microstructures representing the cement paste-aggregate structures are generated using a Monte Carlo scheme. These images are converted into a finite element model to predict the mechanical performance under different criteria of damage kinetics. The effect of ettringite is then associated with the development of an interphase of lower mechanical properties. Our results show that the observed time evolution of Young's modulus is best described by a linear increase of the interphase content. Our model results also indicate that the interphase regions grow at maximum stress regions rather than exclusively at interfaces. Finally, constant immersion predicts a rate of damage growth five times lower than that of the immersion-drying protocol. © 2012 Computer-Aided Civil and Infrastructure Engineering.

  18. Optimal design method for a digital human–computer interface based on human reliability in a nuclear power plant. Part 3: Optimization method for interface task layout

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Wang, Yiqun; Zhang, Li; Xie, Tian; Li, Min; Peng, Yuyuan; Wu, Daqing; Li, Peiyao; Ma, Congmin; Shen, Mengxu; Wu, Xing; Weng, Mengyun; Wang, Shiwei; Xie, Cen

    2016-01-01

    Highlights: • The authors present an optimization algorithm for interface task layout. • The performing process of the proposed algorithm was depicted. • The performance evaluation method adopted a neural network approach. • The optimized layouts of an event's interface tasks were obtained by experiments. - Abstract: This is the last in a series of papers describing the optimal design of a digital human–computer interface of a nuclear power plant (NPP) from three different points of view based on human reliability. The purpose of this series is to propose different optimization methods from varying perspectives to decrease human factor events that arise from defects of a human–computer interface. The present paper addresses the optimization question of how to effectively lay out interface tasks across different screens. The aim is to decrease human errors by reducing the distance that an operator moves among different screens in each operation. To resolve the problem, the authors propose an optimization process for interface task layout for the digital human–computer interface of an NPP. To automatically lay out each interface task onto one of the screens in each operation, the paper presents a shortest-moving-path optimization algorithm with a dynamic flag based on human reliability. To test the algorithm's performance, the evaluation method uses a neural network based on human reliability. The lower the human error probabilities are, the better the interface task layouts among different screens are. Thus, by analyzing the performance of each interface task layout, the optimization result is obtained. Finally, the optimized layouts of the spurious safety injection event interface tasks of the NPP are obtained in an experiment; the proposed method shows good accuracy and stability.
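
    The paper's own algorithm is only summarized above, so the following is a rough, hypothetical sketch of the general idea it describes: assigning each interface task in an operation sequence to a screen so that the operator's movement between screens stays small. The function names, the greedy strategy and the distance model are illustrative assumptions, not the authors' implementation.

      # Hypothetical greedy sketch: assign consecutive interface tasks to screens so
      # that the operator's total movement distance between screens stays small.
      from itertools import product

      def layout_tasks(task_sequence, screens, fits, distance):
          """task_sequence: ordered task ids for one operation.
          screens: list of screen ids.
          fits(task, screen) -> bool: whether the task's displays fit on that screen.
          distance(s1, s2) -> float: movement cost between two screens."""
          layout = {}
          current = None
          for task in task_sequence:
              candidates = [s for s in screens if fits(task, s)]
              # choose the feasible screen closest to where the operator already is
              best = min(candidates,
                         key=lambda s: 0.0 if current is None else distance(current, s))
              layout[task] = best
              current = best
          return layout

      # toy usage with three screens laid out in a row
      screens = ["S1", "S2", "S3"]
      dist = {(a, b): abs(screens.index(a) - screens.index(b))
              for a, b in product(screens, repeat=2)}
      print(layout_tasks(["t1", "t2", "t3"], screens,
                         fits=lambda t, s: True,
                         distance=lambda a, b: dist[(a, b)]))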

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. The Personality Trait of Intolerance to Uncertainty Affects Behavior in a Novel Computer-Based Conditioned Place Preference Task

    Directory of Open Access Journals (Sweden)

    Milen Radell

    2016-08-01

    Full Text Available Recent work has found that personality factors that confer vulnerability to addiction can also affect learning and economic decision making. One personality trait which has been implicated in vulnerability to addiction is intolerance to uncertainty (IU), i.e. a preference for familiar over unknown (possibly better) options. In animals, the motivation to obtain drugs is often assessed through conditioned place preference (CPP), which compares preference for contexts where drug reward was previously received. It is an open question whether participants with high IU also show heightened preference for previously-rewarded contexts. To address this question, we developed a novel computer-based CPP task for humans in which participants guide an avatar through a paradigm in which one room contains frequent reward and one contains less frequent reward. Following exposure to both contexts, subjects are assessed for preference to enter the previously-rich and previously-poor room. Individuals with low IU showed little bias to enter the previously-rich room first, and instead entered both rooms at about the same rate. By contrast, those with high IU showed a strong bias to enter the previously-rich room first. This suggests an increased tendency to chase reward in the intolerant group, consistent with previously observed behavior in opioid-addicted individuals. Thus, high IU may represent a pre-existing cognitive bias that provides a mechanism to promote decision-making processes that increase vulnerability to addiction.

  1. Connected Car: Quantified Self becomes Quantified Car

    Directory of Open Access Journals (Sweden)

    Melanie Swan

    2015-02-01

    Full Text Available The automotive industry could be facing a situation of profound change and opportunity in the coming decades. There are a number of influencing factors such as increasing urban and aging populations, self-driving cars, 3D parts printing, energy innovation, and new models of transportation service delivery (Zipcar, Uber). The connected car means that vehicles are now part of the connected world, continuously Internet-connected, generating and transmitting data, which on the one hand can be helpfully integrated into applications, like real-time traffic alerts broadcast to smartwatches, but also raises security and privacy concerns. This paper explores the automotive connected world, and describes five killer QS (Quantified Self)-auto sensor applications that link quantified-self sensors (sensors that measure the personal biometrics of individuals, like heart rate) and automotive sensors (sensors that measure driver and passenger biometrics or quantitative automotive performance metrics, like speed and braking activity). The applications are fatigue detection, real-time assistance for parking and accidents, anger management and stress reduction, keyless authentication and digital identity verification, and DIY diagnostics. These kinds of applications help to demonstrate the benefit of connected world data streams in the automotive industry and beyond where, more fundamentally for human progress, the automation of both physical and now cognitive tasks is underway.

  2. Interfractional variability of respiration-induced esophageal tumor motion quantified using fiducial markers and four-dimensional cone-beam computed tomography.

    Science.gov (United States)

    Jin, Peng; Hulshof, Maarten C C M; van Wieringen, Niek; Bel, Arjan; Alderliesten, Tanja

    2017-07-01

    To investigate the interfractional variability of respiration-induced esophageal tumor motion using fiducial markers and four-dimensional cone-beam computed tomography (4D-CBCT) and assess if a 4D-CT is sufficient for predicting the motion during the treatment. Twenty-four patients with 63 markers visible in the retrospectively reconstructed 4D-CBCTs were included. For each marker, we calculated the amplitude and trajectory of the respiration-induced motion. Possible time trends of the amplitude over the treatment course and the interfractional variability of amplitudes and trajectory shapes were assessed. Further, the amplitudes measured in the 4D-CT were compared to those in the 4D-CBCTs. The amplitude was largest in the cranial-caudal direction of the distal esophagus (mean: 7.1mm) and proximal stomach (mean: 7.8mm). No time trend was observed in the amplitude over the treatment course. The interfractional variability of amplitudes and trajectory shapes was limited (mean: ≤1.4mm). Moreover, small and insignificant deviation was found between the amplitudes quantified in the 4D-CT and in the 4D-CBCT (mean absolute difference: ≤1.0mm). The limited interfractional variability of amplitudes and trajectory shapes and small amplitude difference between 4D-CT-based and 4D-CBCT-based measurements imply that a single 4D-CT would be sufficient for predicting the respiration-induced esophageal tumor motion during the treatment course. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Visualizing and Quantifying Bioaccessible Pores in Field-Aged Petroleum Hydrocarbon-Contaminated Clay Soils Using Synchrotron-based X-ray Computed Tomography

    Science.gov (United States)

    Chang, W.; Kim, J.; Zhu, N.; McBeth, J. M.

    2015-12-01

    Microbial hydrocarbon degradation is environmentally significant and applicable to contaminated site remediation practices only when hydrocarbons (substrates) are physically bioaccessible to bacteria in soil matrices. The powerful X-rays produced by synchrotron radiation allow bioaccessible pores in soil (larger than 4 microns), where bacteria can be accommodated, colonize and remain active, to be visualized at a much higher resolution. This study visualized and quantified such bioaccessible pores in intact field-aged, oil-contaminated unsaturated soil fractions, and examined the relationship between the abundance of bioaccessible pores and hydrocarbon biodegradation. Using synchrotron-based X-ray Computed Tomography (CT) at the Canadian Light Source, a large dataset of soil particle characteristics, such as pore volumes, surface areas, number of pores and pore size distribution, was generated. Duplicate samples of five different soil fractions with different soil aggregate sizes and water contents (13, 18 and 25%) were examined. The method for calculating the number and distribution of bioaccessible pores using CT images was validated using the known porosity of Ottawa sand. This study indicated that the distribution of bioaccessible pore sizes in soil fractions is very closely related to microbial enhancement. A follow-up aerobic biodegradation experiment for the soils at 17 °C (average site temperature) over 90 days confirmed that a notable decrease in hydrocarbon concentrations occurred in soil fractions with abundant bioaccessible pores and with a larger number of pores between 10 and 100 μm. The hydrocarbon degradation in bioactive soil fractions extended to relatively high-molecular-weight hydrocarbons (C16-C34). This study provides quantitative information about how internal soil pore characteristics can influence bioremediation performance.
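
    As a hedged illustration of this kind of pore quantification, the sketch below labels connected pore regions in an already-segmented (binary) CT volume and counts those whose equivalent diameter exceeds 4 microns; the voxel size, threshold and toy data are assumptions, not values from the study.

      # Hypothetical sketch: count and size pores in an already-segmented (binary)
      # micro-CT volume; voxel size and threshold are illustrative assumptions.
      import numpy as np
      from scipy import ndimage

      def pore_size_distribution(pore_volume, voxel_um=2.0, min_diameter_um=4.0):
          labels, n_pores = ndimage.label(pore_volume)      # connected pore regions
          voxel_counts = np.bincount(labels.ravel())[1:]    # skip background label 0
          volumes_um3 = voxel_counts * voxel_um ** 3
          # equivalent spherical diameter of each pore
          diameters_um = (6.0 * volumes_um3 / np.pi) ** (1.0 / 3.0)
          n_bioaccessible = int((diameters_um >= min_diameter_um).sum())
          return n_pores, diameters_um, n_bioaccessible

      rng = np.random.default_rng(0)
      toy_volume = (rng.random((60, 60, 60)) > 0.7).astype(np.uint8)   # toy pore mask
      n_pores, diameters, n_accessible = pore_size_distribution(toy_volume)
      print(n_pores, n_accessible)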

  4. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    task (Bahrami et al 2010, Fusaroli et al. 2012), we extend to linguistic coordination dynamical measures of recurrence employed in the analysis of sensorimotor coordination (such as heart-rate (Konvalinka et al 2011), postural sway (Shockley 2005) and eye-movements (Dale, Richardson and Kirkham 2012)). We employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities...

  5. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  6. Multi-task transfer learning deep convolutional neural network: application to computer-aided diagnosis of breast cancer on mammograms

    Science.gov (United States)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Helvie, Mark A.; Cha, Kenny H.; Richter, Caleb D.

    2017-12-01

    Transfer learning in deep convolutional neural networks (DCNNs) is an important step in its application to medical imaging tasks. We propose a multi-task transfer learning DCNN with the aim of translating the ‘knowledge’ learned from non-medical images to medical diagnostic tasks through supervised training and increasing the generalization capabilities of DCNNs by simultaneously learning auxiliary tasks. We studied this approach in an important application: classification of malignant and benign breast masses. With Institutional Review Board (IRB) approval, digitized screen-film mammograms (SFMs) and digital mammograms (DMs) were collected from our patient files and additional SFMs were obtained from the Digital Database for Screening Mammography. The data set consisted of 2242 views with 2454 masses (1057 malignant, 1397 benign). In single-task transfer learning, the DCNN was trained and tested on SFMs. In multi-task transfer learning, SFMs and DMs were used to train the DCNN, which was then tested on SFMs. N-fold cross-validation with the training set was used for training and parameter optimization. On the independent test set, the multi-task transfer learning DCNN was found to have significantly (p  =  0.007) higher performance compared to the single-task transfer learning DCNN. This study demonstrates that multi-task transfer learning may be an effective approach for training DCNN in medical imaging applications when training samples from a single modality are limited.
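
    As a minimal sketch of the general multi-task transfer-learning pattern described above (a shared, ImageNet-pretrained backbone with one classification head per task), the following PyTorch fragment is illustrative only; the backbone choice, head sizes and training details are assumptions and not the authors' architecture.

      # Minimal multi-task transfer-learning sketch (PyTorch): one pretrained shared
      # backbone, one classification head per task. Illustrative only; not the
      # authors' DCNN. Loading pretrained weights requires a network connection.
      import torch
      import torch.nn as nn
      from torchvision import models

      class MultiTaskNet(nn.Module):
          def __init__(self, n_tasks=2, n_classes=2):
              super().__init__()
              backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
              feat_dim = backbone.fc.in_features
              backbone.fc = nn.Identity()        # keep the shared convolutional features
              self.backbone = backbone
              self.heads = nn.ModuleList(
                  [nn.Linear(feat_dim, n_classes) for _ in range(n_tasks)])

          def forward(self, x, task_id):
              return self.heads[task_id](self.backbone(x))

      model = MultiTaskNet()
      criterion = nn.CrossEntropyLoss()
      optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

      # one illustrative step per task (toy batches standing in for SFM and DM images)
      for task_id in range(2):
          images = torch.randn(4, 3, 224, 224)
          labels = torch.randint(0, 2, (4,))
          optimizer.zero_grad()
          loss = criterion(model(images, task_id), labels)
          loss.backward()
          optimizer.step()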

  7. The scopolamine-reversal paradigm in rats and monkeys: the importance of computer-assisted operant-conditioning memory tasks for screening drug candidates.

    Science.gov (United States)

    Buccafusco, Jerry J; Terry, Alvin V; Webster, Scott J; Martin, Daniel; Hohnadel, Elizabeth J; Bouchard, Kristy A; Warner, Samantha E

    2008-08-01

    The scopolamine-reversal model is enjoying a resurgence of interest in clinical studies as a reversible pharmacological model for Alzheimer's disease (AD). The cognitive impairment associated with scopolamine is similar to that in AD. The scopolamine model is not simply a cholinergic model, as it can be reversed by drugs that are noncholinergic cognition-enhancing agents. The objective of the study was to determine relevance of computer-assisted operant-conditioning tasks in the scopolamine-reversal model in rats and monkeys. Rats were evaluated for their acquisition of a spatial reference memory task in the Morris water maze. A separate cohort was proficient in performance of an automated delayed stimulus discrimination task (DSDT). Rhesus monkeys were proficient in the performance of an automated delayed matching-to-sample task (DMTS). The AD drug donepezil was evaluated for its ability to reverse the decrements in accuracy induced by scopolamine administration in all three tasks. In the DSDT and DMTS tasks, the effects of donepezil were delay (retention interval)-dependent, affecting primarily short delay trials. Donepezil produced significant but partial reversals of the scopolamine-induced impairment in task accuracies after 2 mg/kg in the water maze, after 1 mg/kg in the DSDT, and after 50 microg/kg in the DMTS task. The two operant-conditioning tasks (DSDT and DMTS) provided data most in keeping with those reported in clinical studies with these drugs. The model applied to nonhuman primates provides an excellent transitional model for new cognition-enhancing drugs before clinical trials.

  8. Can smartphones be used to bring computer-based tasks from the lab to the field? A mobile experience-sampling method study about the pace of life.

    Science.gov (United States)

    Stieger, Stefan; Lewetz, David; Reips, Ulf-Dietrich

    2017-12-06

    Researchers are increasingly using smartphones to collect scientific data. To date, most smartphone studies have collected questionnaire data or data from the built-in sensors. So far, few studies have analyzed whether smartphones can also be used to conduct computer-based tasks (CBTs). Using a mobile experience-sampling method study and a computer-based tapping task as examples (N = 246; twice a day for three weeks, 6,000+ measurements), we analyzed how well smartphones can be used to conduct a CBT. We assessed methodological aspects such as potential technologically induced problems, dropout, task noncompliance, and the accuracy of millisecond measurements. Overall, we found few problems: Dropout rate was low, and the time measurements were very accurate. Nevertheless, particularly at the beginning of the study, some participants did not comply with the task instructions, probably because they did not read the instructions before beginning the task. To summarize, the results suggest that smartphones can be used to transfer CBTs from the lab to the field, and that real-world variations across device manufacturers, OS types, and CPU load conditions did not substantially distort the results.

  9. Treatment Effect of Balloon Pulmonary Angioplasty in Chronic Thromboembolic Pulmonary Hypertension Quantified by Automatic Comparative Imaging in Computed Tomography Pulmonary Angiography.

    Science.gov (United States)

    Zhai, Zhiwei; Ota, Hideki; Staring, Marius; Stolk, Jan; Sugimura, Koichiro; Takase, Kei; Stoel, Berend C

    2018-05-01

    Balloon pulmonary angioplasty (BPA) in patients with inoperable chronic thromboembolic pulmonary hypertension (CTEPH) can have variable outcomes. To gain more insight into this variation, we designed a method for visualizing and quantifying changes in pulmonary perfusion by automatically comparing computed tomography (CT) pulmonary angiography before and after BPA treatment. We validated these quantifications of perfusion changes against hemodynamic changes measured with right-sided heart catheterization. We studied 14 consecutive CTEPH patients (12 women; age, 70.5 ± 24), who underwent CT pulmonary angiography and right-sided heart catheterization, before and after BPA. Posttreatment images were registered to pretreatment CT scans (using the Elastix toolbox) to obtain corresponding locations. Pulmonary vascular trees and their centerlines were detected using a graph cuts method and a distance transform method, respectively. Areas distal from vessels were defined as pulmonary parenchyma. Subsequently, the density changes within the vascular centerlines and parenchymal areas were calculated and corrected for inspiration level differences. For visualization, the densitometric changes were displayed in color-coded overlays. For quantification, the median and interquartile range of the density changes in the vascular and parenchymal areas (ΔVD and ΔPD) were calculated. The recorded changes in hemodynamic parameters, including changes in systolic, diastolic, and mean pulmonary artery pressure (ΔsPAP, ΔdPAP, and ΔmPAP, respectively) and vascular resistance (ΔPVR), were used as reference assessments of the treatment effect. Spearman correlation coefficients were employed to investigate the correlations between changes in perfusion and hemodynamic changes. Comparative imaging maps showed distinct patterns in perfusion changes among patients. Within pulmonary vessels, the interquartile range of ΔVD correlated significantly with ΔsPAP (R = -0.58, P = 0.03),
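
    A hedged sketch of the quantification step described above might look as follows: compute the median and interquartile range of the voxel-wise density change in vascular and parenchymal regions of the registered scans, then correlate the cohort's imaging metrics with hemodynamic changes using Spearman correlation. All arrays and values below are toy placeholders.

      # Hedged sketch of the quantification step: median/IQR of voxel-wise density
      # change in masked regions, then Spearman correlation across a cohort.
      # All arrays below are toy placeholders, not study data.
      import numpy as np
      from scipy import stats

      def density_change(pre_hu, post_hu, mask):
          delta = post_hu[mask] - pre_hu[mask]            # voxel-wise HU change
          q1, med, q3 = np.percentile(delta, [25, 50, 75])
          return med, q3 - q1                             # median and IQR

      rng = np.random.default_rng(1)
      pre = rng.normal(-700.0, 50.0, size=(32, 32, 32))   # registered pre-treatment HU
      post = pre + rng.normal(5.0, 10.0, size=pre.shape)  # registered post-treatment HU
      vessel_mask = rng.random(pre.shape) > 0.9           # stand-in vascular mask
      med_vd, iqr_vd = density_change(pre, post, vessel_mask)

      # cohort level: correlate an imaging metric with a hemodynamic change
      iqr_delta_vd = rng.normal(0.0, 1.0, 14)             # placeholder per-patient values
      delta_spap = rng.normal(0.0, 1.0, 14)
      rho, p = stats.spearmanr(iqr_delta_vd, delta_spap)
      print(med_vd, iqr_vd, rho, p)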

  10. Quantifying light pollution

    International Nuclear Information System (INIS)

    Cinzano, P.; Falchi, F.

    2014-01-01

    In this paper we review new available indicators useful to quantify and monitor light pollution, defined as the alteration of the natural quantity of light in the night environment due to introduction of manmade light. With the introduction of recent radiative transfer methods for the computation of light pollution propagation, several new indicators become available. These indicators represent a primary step in light pollution quantification, beyond the bare evaluation of the night sky brightness, which is an observational effect integrated along the line of sight and thus lacking the three-dimensional information. - Highlights: • We review new available indicators useful to quantify and monitor light pollution. • These indicators are a primary step in light pollution quantification. • These indicators allow to improve light pollution mapping from a 2D to a 3D grid. • These indicators allow carrying out a tomography of light pollution. • We show an application of this technique to an Italian region

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  12. Application of computational fluid dynamics and pedestrian-behavior simulations to the design of task-ambient air-conditioning systems of a subway station

    Energy Technology Data Exchange (ETDEWEB)

    Fukuyo, Kazuhiro [Graduate School of Innovation and Technology Management, Faculty of Engineering, Yamaguchi University, Tokiwadai 2-16-1, Ube, Yamaguchi 755-8611 (Japan)

    2006-04-15

    The effects of task-ambient (TA) air-conditioning systems on the air-conditioning loads in a subway station and the thermal comfort of passengers were studied using computational fluid dynamics (CFD) and pedestrian-behavior simulations. The pedestrian-behavior model was applied to a standard subway station. Task areas were set up to match with crowdedness as predicted by the pedestrian-behavior simulations. Subsequently, a variety of TA air-conditioning systems were designed to selectively control the microclimate of the task areas. Their effects on the thermal environment in the station in winter were predicted by CFD. The results were compared with those of a conventional air-conditioning system and evaluated in relation to the thermal comfort of subway users and the air-conditioning loads. The comparison showed that TA air-conditioning systems improved thermal comfort and decreased air-conditioning loads. (author)

  13. Text-Based Language Teaching and the Analysis of Tasks Presented in English Course Books for Students of Information Technology and Computing

    Directory of Open Access Journals (Sweden)

    Valerija Marina

    2011-04-01

    Full Text Available The paper describes the essential features of a connected text helping to raise learners’ awareness of its structure and organization and improve their skills of reading comprehension. Classroom applications of various approaches to handling texts and text-based activities are also discussed and their main advantages and disadvantages are outlined. Tasks based on text transformation and reconstruction found in the course books of English for students of computing and information technology are analysed and their types are determined. The efficiency of the tasks is determined by considering the experience of the authors gained in using text-based assignments provided in these course books with the students of the above specialities. Some problems encountered in classroom application of the considered text-based tasks are also outlined.

  14. Effects of Distracting Task with Different Mental Workload on Steady-State Visual Evoked Potential Based Brain Computer Interfaces—an Offline Study

    Directory of Open Access Journals (Sweden)

    Yawei Zhao

    2018-02-01

    Full Text Available Brain-computer interfaces (BCIs), independent of the brain's normal output pathways, are attracting an increasing amount of attention as devices that extract neural information. As a typical type of BCI system, steady-state visual evoked potential (SSVEP)-based BCIs possess a high signal-to-noise ratio and information transfer rate. However, current high-speed SSVEP-BCIs have been implemented with subjects concentrating on stimuli, and have intentionally avoided additional tasks as distractors. This paper aimed to investigate how a distracting simultaneous task, a verbal n-back task with different mental workload, would affect the performance of an SSVEP-BCI. The results from fifteen subjects revealed that the recognition accuracy of the SSVEP-BCI was significantly impaired by the distracting task, especially under a high mental workload. The average classification accuracy across all subjects dropped by 8.67% at most from 1- to 4-back, and there was a significant negative correlation (maximum r = −0.48, p < 0.001) between accuracy and the subjective mental workload evaluation of the distracting task. This study suggests a potential hindrance for daily use of SSVEP-BCIs, and improvements should be investigated in future studies.

  15. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  16. Task-oriented training with computer gaming in people with rheumatoid arthritisor osteoarthritis of the hand: study protocol of a randomized controlled pilot trial.

    Science.gov (United States)

    Srikesavan, Cynthia Swarnalatha; Shay, Barbara; Robinson, David B; Szturm, Tony

    2013-03-09

    Significant restriction in the ability to participate in home, work and community life results from pain, fatigue, joint damage, stiffness and reduced joint range of motion and muscle strength in people with rheumatoid arthritis or osteoarthritis of the hand. With modest evidence on the therapeutic effectiveness of conventional hand exercises, a task-oriented training program via real life object manipulations has been developed for people with arthritis. An innovative, computer-based gaming platform that allows a broad range of common objects to be seamlessly transformed into therapeutic input devices through instrumentation with a motion-sense mouse has also been designed. Personalized objects are selected to target specific training goals such as graded finger mobility, strength, endurance or fine/gross dexterous functions. The movements and object manipulation tasks that replicate common situations in everyday living will then be used to control and play any computer game, making practice challenging and engaging. The ongoing study is a 6-week, single-center, parallel-group, equally allocated and assessor-blinded pilot randomized controlled trial. Thirty people with rheumatoid arthritis or osteoarthritis affecting the hand will be randomized to receive either conventional hand exercises or the task-oriented training. The purpose is to determine a preliminary estimation of therapeutic effectiveness and feasibility of the task-oriented training program. Performance based and self-reported hand function, and exercise compliance are the study outcomes. Changes in outcomes (pre to post intervention) within each group will be assessed by paired Student t test or Wilcoxon signed-rank test and between groups (control versus experimental) post intervention using unpaired Student t test or Mann-Whitney U test. The study findings will inform decisions on the feasibility, safety and completion rate and will also provide preliminary data on the treatment effects of the task

  17. Using a computational model to quantify the potential impact of changing the placement of healthy beverages in stores as an intervention to "Nudge" adolescent behavior choice.

    Science.gov (United States)

    Wong, Michelle S; Nau, Claudia; Kharmats, Anna Yevgenyevna; Vedovato, Gabriela Milhassi; Cheskin, Lawrence J; Gittelsohn, Joel; Lee, Bruce Y

    2015-12-23

    Product placement influences consumer choices in retail stores. While sugar-sweetened beverage (SSB) manufacturers expend considerable effort and resources to determine how product placement may increase SSB purchases, the information is proprietary and not available to the public health and research community. This study aims to quantify the effect of non-SSB product placement in corner stores on adolescent beverage purchasing behavior. Corner stores are small privately owned retail stores that are important beverage providers in low-income neighborhoods--where adolescents have higher rates of obesity. Using data from a community-based survey in Baltimore and parameters from the marketing literature, we developed a decision-analytic model to simulate and quantify how the placement of healthy beverages (placement in the beverage cooler closest to the entrance, distance from the back of the store, and vertical placement within each cooler) affects the probability of adolescents purchasing non-SSBs. In our simulation, non-SSB purchases were 2.8 times higher when placed in the "optimal location"--on the second or third shelves of the front cooler--compared to the worst location on the bottom shelf of the cooler farthest from the entrance. Based on our model results and survey data, we project that moving non-SSBs from the worst to the optimal location would result in approximately 5.2 million more non-SSBs purchased by Baltimore adolescents annually. Our study is the first to quantify the potential impact of changing the placement of beverages in corner stores. Our findings suggest that this could be a low-cost, yet impactful strategy to nudge this population--highly susceptible to obesity--towards healthier beverage decisions.
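
    The study's calibrated model is not reproduced here, but a toy decision-analytic sketch of the same idea (placement acting as multipliers on a baseline purchase probability) is shown below; the multiplier values are illustrative assumptions chosen only to echo the reported 2.8-fold ratio.

      # Toy decision-analytic sketch: placement acts as multipliers on a baseline
      # purchase probability. Multiplier values are invented for illustration and
      # chosen only so that best/worst placement differ by roughly the reported 2.8x.
      cooler_multiplier = {"front_cooler": 1.6, "back_cooler": 1.0}
      shelf_multiplier = {"second_or_third_shelf": 1.75, "bottom_shelf": 1.0}

      def purchase_prob(base_prob, cooler, shelf):
          return min(base_prob * cooler_multiplier[cooler] * shelf_multiplier[shelf], 1.0)

      worst = purchase_prob(0.10, "back_cooler", "bottom_shelf")
      best = purchase_prob(0.10, "front_cooler", "second_or_third_shelf")
      print(round(best / worst, 2))   # ~2.8, echoing the ratio reported above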

  18. Automated generation of patient-tailored electronic care pathways by translating computer-interpretable guidelines into hierarchical task networks

    NARCIS (Netherlands)

    González-Ferrer, A.; ten Teije, A.C.M.; Fdez-Olivares, J.; Milian, K.

    OBJECTIVE: This paper describes a methodology which enables computer-aided support for the planning, visualization and execution of personalized patient treatments in a specific healthcare process, taking into account complex temporal constraints and the allocation of institutional resources. To

  19. CFD computations of wind turbine blade loads during standstill operation KNOW-BLADE, Task 3.1 report

    DEFF Research Database (Denmark)

    Sørensen, Niels N.; Johansen, Jeppe; Conway, S.

    2004-01-01

    Two rotor blades are computed during standstill conditions, using two different Navier-Stokes solvers, EDGE and EllipSys3D. Both steady and transient linear k-ω RANS turbulence models are applied, along with steady non-linear RANS and transient DES simulations. The STORK 5.0 WPX blade is computed at three different tip pitch angles, 0, 26 and 50 degrees tip pitch angle, while the NREL Phase-VI blade is computed at 90 degrees tip pitch angle. Generally the CFD codes reproduce the measured trends quite well and the two involved CFD codes give very similar results. The discrepancies observed can be explained by the difference in the applied turbulence models and the fact that the results from one of the solvers are presented as instantaneous values instead of averaged values. The comparison of steady and transient RANS results shows that the gain of using time-true computations is very limited for this case, with respect to mean quantities....

  20. Quantifying Appropriate PTV Setup Margins: Analysis of Patient Setup Fidelity and Intrafraction Motion Using Post-Treatment Megavoltage Computed Tomography Scans

    International Nuclear Information System (INIS)

    Drabik, Donata M.; MacKenzie, Marc A.; Fallone, Gino B.

    2007-01-01

    Purpose: To present a technique that can be implemented in-house to evaluate the efficacy of immobilization and image-guided setup of patients with different treatment sites on helical tomotherapy. This technique uses an analysis of alignment shifts between kilovoltage computed tomography and post-treatment megavoltage computed tomography images. The determination of the shifts calculated by the helical tomotherapy software for a given site can then be used to define appropriate planning target volume internal margins. Methods and Materials: Twelve patients underwent post-treatment megavoltage computed tomography scans on a helical tomotherapy machine to assess patient setup fidelity and net intrafraction motion. Shifts were studied for the prostate, head and neck, and glioblastoma multiforme. Analysis of these data was performed using automatic and manual registration of the kilovoltage computed tomography and post-megavoltage computed tomography images. Results: The shifts were largest for the prostate, followed by the head and neck, with glioblastoma multiforme having the smallest shifts in general. It appears that it might be more appropriate to use asymmetric planning target volume margins. Each margin value reported is equal to two standard deviations of the average shift in the given direction. Conclusion: This method could be applied using individual patient post-image scanning and combined with adaptive planning to reduce or increase the margins as appropriate
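
    The margin rule stated above (two standard deviations of the observed shifts per direction) can be sketched in a few lines; the shift values below are made up for illustration.

      # Minimal sketch of the stated margin rule: per direction, the margin is two
      # standard deviations of the observed kVCT-to-post-treatment-MVCT shifts.
      # Shift values are invented for illustration.
      import numpy as np

      def ptv_margins(shifts_mm):
          """shifts_mm: array of shape (n_fractions, 3) with LR, AP, SI shifts in mm."""
          return 2.0 * shifts_mm.std(axis=0, ddof=1)

      shifts = np.array([[1.2, -0.4, 2.1],
                         [0.8,  0.9, 1.5],
                         [-0.3, 0.2, 2.8],
                         [1.0, -0.7, 1.9]])
      print(ptv_margins(shifts))   # separate +/- analyses would give asymmetric margins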

  1. Modified CC-LR algorithm with three diverse feature sets for motor imagery tasks classification in EEG based brain-computer interface.

    Science.gov (United States)

    Siuly; Li, Yan; Paul Wen, Peng

    2014-03-01

    Motor imagery (MI) task classification provides an important basis for designing brain-computer interface (BCI) systems. If MI tasks are reliably distinguished by identifying typical patterns in electroencephalography (EEG) data, a motor-disabled person could communicate with a device by composing sequences of these mental states. In our earlier study, we developed a cross-correlation based logistic regression (CC-LR) algorithm for the classification of MI tasks for BCI applications, but its performance was not satisfactory. This study develops a modified version of the CC-LR algorithm that explores a suitable feature set which can improve the performance. The modified CC-LR algorithm uses the C3 electrode channel (in the international 10-20 system) as a reference channel for the cross-correlation (CC) technique and applies three diverse feature sets separately, as the input to the logistic regression (LR) classifier. The present algorithm investigates which feature set best characterizes the distribution of MI-task-based EEG data. This study also provides an insight into how to select a reference channel for the CC technique with EEG signals, considering the anatomical structure of the human brain. The proposed algorithm is compared with eight of the most recently reported well-known methods, including the BCI III Winner algorithm. The findings of this study indicate that the modified CC-LR algorithm has the potential to improve the identification performance of MI tasks in BCI systems. The results demonstrate that the proposed technique provides a classification improvement over the existing methods tested. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
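
    The exact feature sets evaluated in the paper are not reproduced here; the sketch below only illustrates the general CC-LR shape of the pipeline (cross-correlate each trial with a reference channel, summarize the cross-correlogram with a few statistics, and classify with logistic regression) on synthetic data.

      # Simplified CC-LR-style pipeline on synthetic data: cross-correlate each trial
      # with a reference channel, summarize the cross-correlogram with a few
      # statistics, and classify with logistic regression. Feature choices are
      # illustrative, not the three feature sets evaluated in the paper.
      import numpy as np
      from scipy.signal import correlate
      from scipy.stats import kurtosis, skew
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      def cc_features(trial, reference):
          cc = correlate(trial - trial.mean(), reference - reference.mean(), mode="full")
          cc = cc / (np.max(np.abs(cc)) + 1e-12)
          return np.array([cc.mean(), cc.std(), cc.max(), cc.min(), skew(cc), kurtosis(cc)])

      rng = np.random.default_rng(0)
      n_trials, n_samples = 80, 256
      t = np.linspace(0.0, 20.0 * np.pi, n_samples)
      reference = rng.normal(size=n_samples)        # stands in for the C3 channel
      X, y = [], []
      for i in range(n_trials):
          label = i % 2                             # two MI classes
          trial = rng.normal(size=n_samples) + label * 0.8 * np.sin(t)
          X.append(cc_features(trial, reference))
          y.append(label)
      score = cross_val_score(LogisticRegression(max_iter=1000), np.array(X), np.array(y), cv=5)
      print(score.mean())                           # demonstrates the pipeline, not real accuracy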

  2. CFD computations of wind turbine blade loads during standstill operation KNOW-BLADE, Task 3.1 report

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, N.N.; Johansen, J.; Conway, S.

    2004-06-01

    Two rotor blades are computed during standstill conditions, using two different Navier-Stokes solvers, EDGE and EllipSys3D. Both steady and transient linear k-ω RANS turbulence models are applied, along with steady non-linear RANS and transient DES simulations. The STORK 5.0 WPX blade is computed at three different tip pitch angles, 0, 26 and 50 degrees tip pitch angle, while the NREL Phase-VI blade is computed at 90 degrees tip pitch angle. Generally the CFD codes reproduce the measured trends quite well and the two involved CFD codes give very similar results. The discrepancies observed can be explained by the difference in the applied turbulence models and the fact that the results from one of the solvers are presented as instantaneous values instead of averaged values. The comparison of steady and transient RANS results shows that the gain of using time-true computations is very limited for this case, with respect to mean quantities. The same can be said for the RANS/DES comparison performed for the NREL rotor, even though the DES computation shows improved agreement at the tip and root sections. Finally, it is shown that the DES methodology provides a much more physical representation of the heavily stalled part of the flow over blades at high angles of attack. (au)

  3. Prospects of a mathematical theory of human behavior in complex man-machine systems tasks. [time sharing computer analogy of automobile driving

    Science.gov (United States)

    Johannsen, G.; Rouse, W. B.

    1978-01-01

    A hierarchy of human activities is derived by analyzing automobile driving in general terms. A structural description leads to a block diagram and a time-sharing computer analogy. The range of applicability of existing mathematical models is considered with respect to the hierarchy of human activities in actual complex tasks. Other mathematical tools so far not often applied to man machine systems are also discussed. The mathematical descriptions at least briefly considered here include utility, estimation, control, queueing, and fuzzy set theory as well as artificial intelligence techniques. Some thoughts are given as to how these methods might be integrated and how further work might be pursued.

  4. APA Summit on Medical Student Education Task Force on Informatics and Technology: learning about computers and applying computer technology to education and practice.

    Science.gov (United States)

    Hilty, Donald M; Hales, Deborah J; Briscoe, Greg; Benjamin, Sheldon; Boland, Robert J; Luo, John S; Chan, Carlyle H; Kennedy, Robert S; Karlinsky, Harry; Gordon, Daniel B; Yager, Joel; Yellowlees, Peter M

    2006-01-01

    This article provides a brief overview of important issues for educators regarding medical education and technology. The literature describes key concepts, prototypical technology tools, and model programs. A work group of psychiatric educators was convened three times by phone conference to discuss the literature. Findings were presented to and input was received from the 2005 Summit on Medical Student Education by APA and the American Directors of Medical Student Education in Psychiatry. Knowledge of, skills in, and attitudes toward medical informatics are important to life-long learning and modern medical practice. A needs assessment is a starting place, since student, faculty, institution, and societal factors bear consideration. Technology needs to "fit" into a curriculum in order to facilitate learning and teaching. Learning about computers and applying computer technology to education and clinical care are key steps in computer literacy for physicians.

  5. Development of a PBPK model of thiocyanate in rats with an extrapolation to humans: A computational study to quantify the mechanism of action of thiocyanate kinetics in thyroid

    International Nuclear Information System (INIS)

    Willemin, Marie-Emilie; Lumen, Annie

    2016-01-01

    Thyroid homeostasis can be disturbed due to thiocyanate exposure from the diet or tobacco smoke. Thiocyanate inhibits both thyroidal uptake of iodide, via the sodium-iodide symporter (NIS), and thyroid hormone (TH) synthesis in the thyroid, via thyroid peroxidase (TPO), but the mode of action of thiocyanate is poorly quantified in the literature. The characterization of the link between intra-thyroidal thiocyanate concentrations and dose of exposure is crucial for assessing the risk of thyroid perturbations due to thiocyanate exposure. We developed a PBPK model for thiocyanate that describes its kinetics in the whole-body up to daily doses of 0.15 mmol/kg, with a mechanistic description of the thyroidal kinetics including NIS, passive diffusion, and TPO. The model was calibrated in a Bayesian framework using published studies in rats. Goodness-of-fit was satisfactory, especially for intra-thyroidal thiocyanate concentrations. Thiocyanate kinetic processes were quantified in vivo, including the metabolic clearance by TPO. The passive diffusion rate was found to be greater than NIS-mediated uptake rate. The model captured the dose-dependent kinetics of thiocyanate after acute and chronic exposures. Model behavior was evaluated using a Morris screening test. The distribution of thiocyanate into the thyroid was found to be determined primarily by the partition coefficient, followed by NIS and passive diffusion; the impact of the latter two mechanisms appears to increase at very low doses. Extrapolation to humans resulted in good predictions of thiocyanate kinetics during chronic exposure. The developed PBPK model can be used in risk assessment to quantify dose-response effects of thiocyanate on TH. - Highlights: • A PBPK model of thiocyanate (SCN − ) was calibrated in rats in a Bayesian framework. • The intra-thyroidal kinetics of thiocyanate including NIS and TPO was modeled. • Passive diffusion rate for SCN − seemed to be greater than the NIS

  6. Development of a PBPK model of thiocyanate in rats with an extrapolation to humans: A computational study to quantify the mechanism of action of thiocyanate kinetics in thyroid

    Energy Technology Data Exchange (ETDEWEB)

    Willemin, Marie-Emilie; Lumen, Annie, E-mail: Annie.Lumen@fda.hhs.gov

    2016-09-15

    Thyroid homeostasis can be disturbed due to thiocyanate exposure from the diet or tobacco smoke. Thiocyanate inhibits both thyroidal uptake of iodide, via the sodium-iodide symporter (NIS), and thyroid hormone (TH) synthesis in the thyroid, via thyroid peroxidase (TPO), but the mode of action of thiocyanate is poorly quantified in the literature. The characterization of the link between intra-thyroidal thiocyanate concentrations and dose of exposure is crucial for assessing the risk of thyroid perturbations due to thiocyanate exposure. We developed a PBPK model for thiocyanate that describes its kinetics in the whole-body up to daily doses of 0.15 mmol/kg, with a mechanistic description of the thyroidal kinetics including NIS, passive diffusion, and TPO. The model was calibrated in a Bayesian framework using published studies in rats. Goodness-of-fit was satisfactory, especially for intra-thyroidal thiocyanate concentrations. Thiocyanate kinetic processes were quantified in vivo, including the metabolic clearance by TPO. The passive diffusion rate was found to be greater than NIS-mediated uptake rate. The model captured the dose-dependent kinetics of thiocyanate after acute and chronic exposures. Model behavior was evaluated using a Morris screening test. The distribution of thiocyanate into the thyroid was found to be determined primarily by the partition coefficient, followed by NIS and passive diffusion; the impact of the latter two mechanisms appears to increase at very low doses. Extrapolation to humans resulted in good predictions of thiocyanate kinetics during chronic exposure. The developed PBPK model can be used in risk assessment to quantify dose-response effects of thiocyanate on TH. - Highlights: • A PBPK model of thiocyanate (SCN−) was calibrated in rats in a Bayesian framework. • The intra-thyroidal kinetics of thiocyanate including NIS and TPO was modeled. • Passive diffusion rate for SCN− seemed to be greater than the NIS
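
    As a toy illustration of the model structure described in this record (saturable NIS-like uptake, passive diffusion and TPO-mediated clearance), the following two-compartment ODE sketch uses invented parameter values and is not the calibrated PBPK model from the paper.

      # Toy two-compartment ODE sketch (plasma and thyroid) with NIS-like saturable
      # uptake, passive diffusion and TPO-mediated clearance. Structure and all
      # parameter values are illustrative assumptions, not the calibrated model.
      from scipy.integrate import solve_ivp

      Vmax_nis, Km_nis = 0.5, 10.0        # saturable NIS-like uptake: umol/h, umol/L
      PA_diff = 0.02                      # passive diffusion permeability-area product, L/h
      k_tpo = 0.1                         # TPO-mediated clearance in the thyroid, 1/h
      k_elim = 0.05                       # elimination from plasma, 1/h
      V_plasma, V_thyroid = 0.15, 0.001   # compartment volumes, L

      def rhs(t, y):
          a_p, a_t = y                                 # amounts, umol
          c_p, c_t = a_p / V_plasma, a_t / V_thyroid   # concentrations, umol/L
          uptake = Vmax_nis * c_p / (Km_nis + c_p) + PA_diff * (c_p - c_t)   # umol/h
          return [-uptake - k_elim * a_p,              # plasma amount
                  uptake - k_tpo * a_t]                # thyroid amount

      sol = solve_ivp(rhs, (0.0, 48.0), y0=[5.0, 0.0], max_step=0.5)
      print(sol.y[:, -1])   # plasma and thyroid amounts after 48 h of a toy single dose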

  7. Mobile computing with special reference to readability task under the impact of vibration, colour combination and gender.

    Science.gov (United States)

    Mallick, Zulquernain; Siddiquee, Arshad Noor; Haleem, Abid

    2008-12-01

    The last 20 years have seen tremendous growth in the field of computing, with special reference to mobile computing. Ergonomic issues pertaining to this theme remain unexplored. With special reference to readability in mobile computing, an experimental study was conducted to examine the gender effect on human performance under the impact of vibration in a human-computer interaction environment. Fourteen subjects (7 males and 7 females) participated in the study. Three independent variables, namely gender, level of vibration and screen text/background colour, were selected for the experimental investigation, while the dependent variable was the number of characters read per minute. The data collected were analyzed statistically through an experimental design for repeated measures. Results indicated that gender as an organismic variable, the level of vibration and screen text/background colour revealed statistically significant differences. However, the second-order interaction was found to be statistically non-significant. These findings are discussed in light of previous studies undertaken on the topic.

  8. Job tasks, computer use, and the decreasing part-time pay penalty for women in the UK

    NARCIS (Netherlands)

    Elsayed, A.E.A.; de Grip, A.; Fouarge, D.

    2014-01-01

    Using data from the UK Skills Surveys, we show that the part-time pay penalty for female workers within low- and medium-skilled occupations decreased significantly over the period 1997-2006. The convergence in computer use between part-time and full-time workers within these occupations explains a

  9. Effects of the Memorization of Rule Statements on Performance, Retention, and Transfer in a Computer-Based Learning Task.

    Science.gov (United States)

    Towle, Nelson J.

    Research sought to determine whether memorization of rule statements before, during or after instruction in rule application skills would facilitate the acquisition and/or retention of rule-governed behavior as compared to no-rule statement memorization. A computer-assisted instructional (CAI) program required high school students to learn to a…

  10. User Experience May be Producing Greater Heart Rate Variability than Motor Imagery Related Control Tasks during the User-System Adaptation in Brain-Computer Interfaces

    Science.gov (United States)

    Alonso-Valerdi, Luz M.; Gutiérrez-Begovich, David A.; Argüello-García, Janet; Sepulveda, Francisco; Ramírez-Mendoza, Ricardo A.

    2016-01-01

    Brain-computer interface (BCI) is technology that is developing fast, but it remains inaccurate, unreliable and slow due to the difficulty to obtain precise information from the brain. Consequently, the involvement of other biosignals to decode the user control tasks has risen in importance. A traditional way to operate a BCI system is via motor imagery (MI) tasks. As imaginary movements activate similar cortical structures and vegetative mechanisms as a voluntary movement does, heart rate variability (HRV) has been proposed as a parameter to improve the detection of MI related control tasks. However, HR is very susceptible to body needs and environmental demands, and as BCI systems require high levels of attention, perceptual processing and mental workload, it is important to assess the practical effectiveness of HRV. The present study aimed to determine if brain and heart electrical signals (HRV) are modulated by MI activity used to control a BCI system, or if HRV is modulated by the user perceptions and responses that result from the operation of a BCI system (i.e., user experience). For this purpose, a database of 11 participants who were exposed to eight different situations was used. The sensory-cognitive load (intake and rejection tasks) was controlled in those situations. Two electrophysiological signals were utilized: electroencephalography and electrocardiography. From those biosignals, event-related (de-)synchronization maps and event-related HR changes were respectively estimated. The maps and the HR changes were cross-correlated in order to verify if both biosignals were modulated due to MI activity. The results suggest that HR varies according to the experience undergone by the user in a BCI working environment, and not because of the MI activity used to operate the system. PMID:27458384

  11. Job tasks, computer use, and the decreasing part-time pay penalty for women in the UK

    OpenAIRE

    Elsayed, A.E.A.; de Grip, A.; Fouarge, D.

    2014-01-01

    Using data from the UK Skills Surveys, we show that the part-time pay penalty for female workers within low- and medium-skilled occupations decreased significantly over the period 1997-2006. The convergence in computer use between part-time and full-time workers within these occupations explains a large share of the decrease in the part-time pay penalty. However, the lower part-time pay penalty is also related to lower wage returns to reading and writing which are performed more intensively b...

  12. The use of EEG to measure cerebral changes during computer-based motion-sickness-inducing tasks

    Science.gov (United States)

    Strychacz, Christopher; Viirre, Erik; Wing, Shawn

    2005-05-01

    Motion sickness (MS) is a stressor commonly attributed with causing serious navigational and performance errors. The distinct nature of MS suggests this state may have distinct neural markers distinguishable from other states known to affect performance (e.g., stress, fatigue, sleep deprivation, high workload). This pilot study used new high-resolution electro-encephalograph (EEG) technologies to identify distinct neuronal activation changes that occur during MS. Brain EEG activity was monitored while subjects performed a ball-tracking task and viewed stimuli on a projection screen intended to induce motion sickness/spatial disorientation. Results show the presence of EEG spectral changes in all subjects who developed motion sickness when compared to baseline levels. These changes included: 1) low frequency (1 to 10 Hz) changes that may reflect oculomotor movements rather than intra-cerebral sources; 2) increased spectral power across all frequencies (attributable to increased scalp conductivity related to sweating), 3) local increases of power spectra in the 20-50 Hz range (likely attributable to external muscles on the skull) and; 4) a central posterior (occipital) independent component that shows suppression of a 20 Hz peak in the MS condition when compared to baseline. Further research is necessary to refine neural markers, characterize their origin and physiology, to distinguish between motion sickness and other states and to enable markers to be used for operator state monitoring and the designing of interventions for motion sickness.
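
    A rough sketch of the kind of spectral comparison reported above (Welch power spectra for a baseline segment and a motion-sickness segment, then band-power ratios) is given below; the sampling rate, band edges and data are assumptions for illustration only.

      # Rough sketch of a baseline vs. motion-sickness band-power comparison using
      # Welch spectra; sampling rate, band edges and data are assumptions only.
      import numpy as np
      from scipy.signal import welch

      fs = 256                                    # assumed sampling rate, Hz
      rng = np.random.default_rng(0)
      baseline = rng.normal(size=fs * 60)         # 60 s of one EEG channel (toy data)
      sickness = 1.3 * rng.normal(size=fs * 60)   # broadband power increase, as reported

      def band_power(x, lo, hi):
          f, pxx = welch(x, fs=fs, nperseg=fs * 2)
          return pxx[(f >= lo) & (f < hi)].mean()

      for name, (lo, hi) in {"1-10 Hz": (1, 10), "20-50 Hz": (20, 50)}.items():
          ratio = band_power(sickness, lo, hi) / band_power(baseline, lo, hi)
          print(name, round(ratio, 2))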

  13. Quantifying the uncertainty in heritability.

    Science.gov (United States)

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.
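
    A hedged sketch of the two approaches on a deliberately simplified model is given below: with a standardized phenotype and total variance fixed to 1, y ~ N(0, h2*K + (1-h2)*I), the maximum of the likelihood grid gives the frequentist point estimate, while a flat prior turns the same grid into a posterior whose spread quantifies the uncertainty. Real analyses additionally estimate fixed effects and the total variance.

      # Hedged sketch: likelihood of h2 on a grid for a standardized phenotype with
      # total variance fixed to 1, y ~ N(0, h2*K + (1-h2)*I). The grid maximum is the
      # frequentist point estimate; with a flat prior the normalized grid is the
      # posterior, whose spread quantifies the uncertainty. Toy data throughout.
      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(0)
      n = 200
      G = rng.normal(size=(n, 500))
      K = G @ G.T / G.shape[1]                    # toy genetic relatedness matrix
      h2_true = 0.5
      y = rng.multivariate_normal(np.zeros(n), h2_true * K + (1 - h2_true) * np.eye(n))

      h2_grid = np.linspace(0.01, 0.99, 99)
      loglik = np.array([multivariate_normal.logpdf(y, mean=np.zeros(n),
                                                    cov=h2 * K + (1 - h2) * np.eye(n))
                         for h2 in h2_grid])

      h2_ml = h2_grid[np.argmax(loglik)]          # maximum-likelihood estimate
      post = np.exp(loglik - loglik.max())        # flat prior: posterior on the grid
      post /= post.sum()
      post_mean = (h2_grid * post).sum()
      post_sd = np.sqrt(((h2_grid - post_mean) ** 2 * post).sum())
      print(h2_ml, post_mean, post_sd)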

  14. A task-based parallelism and vectorized approach to 3D Method of Characteristics (MOC) reactor simulation for high performance computing architectures

    Science.gov (United States)

    Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.

    2016-05-01

    In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
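
    The proxy applications themselves are optimized compiled codes, but the structure described above (tracks handed out dynamically to workers, with a wide vectorizable loop over energy groups inside each track sweep) can be sketched schematically as follows; the transport update and all data are illustrative.

      # Schematic sketch: independent tracks pulled dynamically by worker threads for
      # load balance, with a wide vectorizable inner loop over energy groups in each
      # track sweep. Python threads are used only to mirror the structure; the real
      # proxy applications are optimized compiled codes.
      import numpy as np
      from concurrent.futures import ThreadPoolExecutor

      N_GROUPS = 64                       # energy groups: the SIMD-friendly dimension

      def sweep_track(track, sigma_t, q):
          """Characteristic sweep along one track; updates are vectorized over groups."""
          psi = np.zeros(N_GROUPS)
          for seg_len in track:                                # segments along the track
              atten = np.exp(-sigma_t * seg_len)               # vector over all groups
              psi = psi * atten + q * (1.0 - atten) / sigma_t  # flat-source MOC update
          return psi

      rng = np.random.default_rng(0)
      tracks = [rng.uniform(0.1, 1.0, size=rng.integers(50, 200)) for _ in range(1000)]
      sigma_t = rng.uniform(0.5, 2.0, size=N_GROUPS)
      q = rng.uniform(0.1, 1.0, size=N_GROUPS)

      # map() hands tracks to whichever worker is free next (dynamic load balancing)
      with ThreadPoolExecutor(max_workers=8) as pool:
          fluxes = list(pool.map(lambda trk: sweep_track(trk, sigma_t, q), tracks))
      print(np.mean(fluxes, axis=0)[:4])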

  15. Rejection Positivity Predicts Trial-to-Trial Reaction Times in an Auditory Selective Attention Task: A Computational Analysis of Inhibitory Control

    Directory of Open Access Journals (Sweden)

    Sufen eChen

    2014-08-01

    Full Text Available A series of computer simulations using variants of a formal model of attention (Melara & Algom, 2003) probed the role of rejection positivity (RP), a slow-wave electroencephalographic (EEG) component, in the inhibitory control of distraction. Behavioral and EEG data were recorded as participants performed auditory selective attention tasks. Simulations that modulated processes of distractor inhibition accounted well for reaction-time (RT) performance, whereas those that modulated target excitation did not. A model that incorporated RP from actual EEG recordings in estimating distractor inhibition was superior in predicting changes in RT as a function of distractor salience across conditions. A model that additionally incorporated momentary fluctuations in EEG as the source of trial-to-trial variation in performance precisely predicted individual RTs within each condition. The results lend support to the linking proposition that RP controls the speed of responding to targets through the inhibitory control of distractors.

  16. Video and computer-based interactive exercises are safe and improve task-specific balance in geriatric and neurological rehabilitation: a randomised trial

    Directory of Open Access Journals (Sweden)

    Maayken van den Berg

    2016-01-01

    Full Text Available Question: Does adding video/computer-based interactive exercises to inpatient geriatric and neurological rehabilitation improve mobility outcomes? Is it feasible and safe? Design: Randomised trial. Participants: Fifty-eight rehabilitation inpatients. Intervention: Physiotherapist-prescribed, tailored, video/computer-based interactive exercises for 1 hour on weekdays, mainly involving stepping and weight-shifting exercises. Outcome measures: The primary outcome was the Short Physical Performance Battery (0 to 3) at 2 weeks. Secondary outcomes were: Maximal Balance Range (mm); Step Test (step count); Rivermead Mobility Index (0 to 15); activity levels; Activity Measure for Post Acute Care Basic Mobility (18 to 72) and Daily Activity (15 to 60); Falls Efficacy Scale (10 to 40); EQ-5D utility score (0 to 1); Reintegration to Normal Living Index (0 to 100); System Usability Scale (0 to 100) and Physical Activity Enjoyment Scale (0 to 126). Safety was determined from adverse events during the intervention. Results: At 2 weeks the between-group difference in the primary outcome (0.1, 95% CI –0.2 to 0.3) was not statistically significant. The intervention group performed significantly better than usual care for Maximal Balance Range (38 mm difference after baseline adjustment, 95% CI 6 to 69). Other secondary outcomes were not statistically significant. Fifty-eight (55%) of the eligible patients agreed to participate, 25/29 (86%) completed the intervention and 10 (39%) attended > 70% of sessions, with a mean of 5.6 sessions (SD 3.3) attended and an overall average duration of 4.5 hours (SD 3.1). Average scores were 62 (SD 21) for the System Usability Scale and 62 (SD 8) for the Physical Activity Enjoyment Scale. There were no adverse events. Conclusion: The addition of video/computer-based interactive exercises to usual rehabilitation is a safe and feasible way to increase exercise dose, but is not suitable for all. Adding the exercises to usual rehabilitation resulted in task

  17. Volume of Lytic Vertebral Body Metastatic Disease Quantified Using Computed Tomography–Based Image Segmentation Predicts Fracture Risk After Spine Stereotactic Body Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Thibault, Isabelle [Department of Radiation Oncology, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario (Canada); Department of Radiation Oncology, Centre Hospitalier de L' Universite de Québec–Université Laval, Quebec, Quebec (Canada); Whyne, Cari M. [Orthopaedic Biomechanics Laboratory, Sunnybrook Research Institute, Department of Surgery, University of Toronto, Toronto, Ontario (Canada); Zhou, Stephanie; Campbell, Mikki [Department of Radiation Oncology, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario (Canada); Atenafu, Eshetu G. [Department of Biostatistics, University Health Network, University of Toronto, Toronto, Ontario (Canada); Myrehaug, Sten; Soliman, Hany; Lee, Young K. [Department of Radiation Oncology, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario (Canada); Ebrahimi, Hamid [Orthopaedic Biomechanics Laboratory, Sunnybrook Research Institute, Department of Surgery, University of Toronto, Toronto, Ontario (Canada); Yee, Albert J.M. [Division of Orthopaedic Surgery, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario (Canada); Sahgal, Arjun, E-mail: arjun.sahgal@sunnybrook.ca [Department of Radiation Oncology, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario (Canada)

    2017-01-01

    Purpose: To determine a threshold of vertebral body (VB) osteolytic or osteoblastic tumor involvement that would predict vertebral compression fracture (VCF) risk after stereotactic body radiation therapy (SBRT), using volumetric image-segmentation software. Methods and Materials: A computational semiautomated skeletal metastasis segmentation process refined in our laboratory was applied to the pretreatment planning CT scan of 100 vertebral segments in 55 patients treated with spine SBRT. Each VB was segmented and the percentage of lytic and/or blastic disease by volume determined. Results: The cumulative incidence of VCF at 3 and 12 months was 14.1% and 17.3%, respectively. The median follow-up was 7.3 months (range, 0.6-67.6 months). In all, 56% of segments were determined lytic, 23% blastic, and 21% mixed, according to clinical radiologic determination. Within these 3 clinical cohorts, the segmentation-determined mean percentages of lytic and blastic tumor were 8.9% and 6.0%, 0.2% and 26.9%, and 3.4% and 15.8% by volume, respectively. On the basis of the entire cohort (n=100), a significant association was observed for the osteolytic percentage measures and the occurrence of VCF (P<.001) but not for the osteoblastic measures. The most significant lytic disease threshold was observed at ≥11.6% (odds ratio 37.4, 95% confidence interval 9.4-148.9). On multivariable analysis, ≥11.6% lytic disease (P<.001), baseline VCF (P<.001), and SBRT with ≥20 Gy per fraction (P=.014) were predictive. Conclusions: Pretreatment lytic VB disease volumetric measures, independent of the blastic component, predict for SBRT-induced VCF. Larger-scale trials evaluating our software are planned to validate the results.

  18. Quantifying loopy network architectures.

    Directory of Open Access Journals (Sweden)

    Eleni Katifori

    Full Text Available Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes from the metric topology (connectivity and edge weight and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
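
    Once a loopy graph has been mapped to a binary tree, the tree metrics named in the abstract are straightforward to compute. The sketch below (not the paper's implementation; the tree encoding is an assumption) computes Strahler numbers and a simple mean subtree-asymmetry index for a binary tree given as a dict mapping each node to its pair of children, or None for leaves.

      def strahler(tree, node):
          # Strahler number: leaves are 1; equal-order children increment the order.
          children = tree[node]
          if children is None:
              return 1
          a, b = (strahler(tree, c) for c in children)
          return a + 1 if a == b else max(a, b)

      def subtree_size(tree, node):
          children = tree[node]
          if children is None:
              return 1
          return 1 + sum(subtree_size(tree, c) for c in children)

      def mean_asymmetry(tree):
          # Mean normalized size difference between sibling subtrees (0 = fully balanced).
          diffs = []
          for node, children in tree.items():
              if children is None:
                  continue
              left, right = (subtree_size(tree, c) for c in children)
              diffs.append(abs(left - right) / (left + right))
          return sum(diffs) / len(diffs)

      # Tiny example: "root" has an internal child "a" and a leaf child "b"
      tree = {"root": ("a", "b"), "a": ("c", "d"), "b": None, "c": None, "d": None}
      print(strahler(tree, "root"), mean_asymmetry(tree))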

  19. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    Science.gov (United States)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructures (e.g. the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed on desktops, laptops or computing clusters so that they effectively leverage national and regional scale cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 'Applied Cyberinfrastructure Concepts', a project-based learning course (ISTA 420/520) at the University of Arizona, focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m-2 yr-1) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g. solar insolation). The course comprised 25 students with varying levels of computational skill and no prior domain background in the geosciences, who collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA Campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g. DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS and SAGA and the Makeflow and Work Queue task management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code. The students were able to identify all necessary dependencies, recompile source onto the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over

  20. GPScheDVS: A New Paradigm of the Autonomous CPU Speed Control for Commodity-OS-based General-Purpose Mobile Computers with a DVS-friendly Task Scheduling

    OpenAIRE

    Kim, Sookyoung

    2008-01-01

    This dissertation studies the problem of increasing battery life-time and reducing CPU heat dissipation without degrading system performance in commodity-OS-based general-purpose (GP) mobile computers using the dynamic voltage scaling (DVS) function of modern CPUs. The dissertation especially focuses on the impact of task scheduling on the effectiveness of DVS in achieving this goal. The task scheduling mechanism used in most contemporary general-purpose operating systems (GPOS) prioritizes t...

  1. A computational linguistic measure of clustering behavior on semantic verbal fluency task predicts risk of future dementia in the nun study.

    Science.gov (United States)

    Pakhomov, Serguei V S; Hemmy, Laura S

    2014-06-01

    Generative semantic verbal fluency (SVF) tests show early and disproportionate decline relative to other abilities in individuals developing Alzheimer's disease. Optimal performance on SVF tests depends on the efficiency of using the clustered organization of semantically related items and the ability to switch between clusters. Traditional approaches to clustering and switching have relied on manual determination of clusters. We evaluated a novel automated computational linguistic approach for quantifying clustering behavior. Our approach is based on Latent Semantic Analysis (LSA) for computing the strength of semantic relatedness between pairs of words produced in response to the SVF test. The mean size of semantic clusters (MCS) and semantic chains (MChS) are calculated based on pairwise relatedness values between words. We evaluated the predictive validity of these measures on a set of 239 participants in the Nun Study, a longitudinal study of aging. All were cognitively intact at the baseline assessment, measured with the Consortium to Establish a Registry for Alzheimer's Disease (CERAD) battery, and were followed in 18-month waves for up to 20 years. The onset of either dementia or memory impairment was used as the outcome in Cox proportional hazards models adjusted for age and education and censored at follow-up waves 5 (6.3 years) and 13 (16.96 years). Higher MCS was associated with a 38% reduction in dementia risk at wave 5 and a 26% reduction at wave 13, but not with the onset of memory impairment. Higher [+1 standard deviation (SD)] MChS was associated with a 39% dementia risk reduction at wave 5 but not at wave 13, and the association with memory impairment was not significant. Higher traditional SVF scores were associated with a 22-29% reduction in memory-impairment risk and a 35-40% reduction in dementia risk. SVF scores were not correlated with either MCS or MChS. Our study suggests that an automated approach to measuring clustering behavior can be used to estimate dementia risk in cognitively normal
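
    A minimal sketch of the clustering measure (the exact cluster definition and the 0.3 threshold are assumptions; the authors' MCS/MChS definitions may differ): consecutive responses whose LSA-style cosine relatedness stays above a threshold are treated as one cluster, and the mean run length is reported.

      import numpy as np

      def cosine(u, v):
          return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

      def mean_cluster_size(word_vectors, threshold=0.3):
          """word_vectors: list of LSA-style vectors in production order."""
          sizes, current = [], 1
          for prev, nxt in zip(word_vectors, word_vectors[1:]):
              if cosine(prev, nxt) >= threshold:
                  current += 1              # the next word continues the current cluster
              else:
                  sizes.append(current)     # switch: close the cluster
                  current = 1
          sizes.append(current)
          return sum(sizes) / len(sizes)

      # Toy usage with random vectors standing in for LSA word vectors
      rng = np.random.default_rng(3)
      vecs = [rng.standard_normal(50) for _ in range(12)]
      print(mean_cluster_size(vecs))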

  2. Five-Year-Olds’ Systematic Errors in Second-Order False Belief Tasks Are Due to First-Order Theory of Mind Strategy Selection: A Computational Modeling Study

    Science.gov (United States)

    Arslan, Burcu; Taatgen, Niels A.; Verbrugge, Rineke

    2017-01-01

    The focus of studies on second-order false belief reasoning generally was on investigating the roles of executive functions and language with correlational studies. Different from those studies, we focus on the question how 5-year-olds select and revise reasoning strategies in second-order false belief tasks by constructing two computational cognitive models of this process: an instance-based learning model and a reinforcement learning model. Unlike the reinforcement learning model, the instance-based learning model predicted that children who fail second-order false belief tasks would give answers based on first-order theory of mind (ToM) reasoning as opposed to zero-order reasoning. This prediction was confirmed with an empirical study that we conducted with 72 5- to 6-year-old children. The results showed that 17% of the answers were correct and 83% of the answers were wrong. In line with our prediction, 65% of the wrong answers were based on a first-order ToM strategy, while only 29% of them were based on a zero-order strategy (the remaining 6% of subjects did not provide any answer). Based on our instance-based learning model, we propose that when children get feedback “Wrong,” they explicitly revise their strategy to a higher level instead of implicitly selecting one of the available ToM strategies. Moreover, we predict that children’s failures are due to lack of experience and that with exposure to second-order false belief reasoning, children can revise their wrong first-order reasoning strategy to a correct second-order reasoning strategy. PMID:28293206

  3. Five-Year-Olds' Systematic Errors in Second-Order False Belief Tasks Are Due to First-Order Theory of Mind Strategy Selection: A Computational Modeling Study.

    Science.gov (United States)

    Arslan, Burcu; Taatgen, Niels A; Verbrugge, Rineke

    2017-01-01

    The focus of studies on second-order false belief reasoning generally was on investigating the roles of executive functions and language with correlational studies. Different from those studies, we focus on the question how 5-year-olds select and revise reasoning strategies in second-order false belief tasks by constructing two computational cognitive models of this process: an instance-based learning model and a reinforcement learning model. Unlike the reinforcement learning model, the instance-based learning model predicted that children who fail second-order false belief tasks would give answers based on first-order theory of mind (ToM) reasoning as opposed to zero-order reasoning. This prediction was confirmed with an empirical study that we conducted with 72 5- to 6-year-old children. The results showed that 17% of the answers were correct and 83% of the answers were wrong. In line with our prediction, 65% of the wrong answers were based on a first-order ToM strategy, while only 29% of them were based on a zero-order strategy (the remaining 6% of subjects did not provide any answer). Based on our instance-based learning model, we propose that when children get feedback "Wrong," they explicitly revise their strategy to a higher level instead of implicitly selecting one of the available ToM strategies. Moreover, we predict that children's failures are due to lack of experience and that with exposure to second-order false belief reasoning, children can revise their wrong first-order reasoning strategy to a correct second-order reasoning strategy.

  4. Quantifiers for quantum logic

    OpenAIRE

    Heunen, Chris

    2008-01-01

    We consider categorical logic on the category of Hilbert spaces. More generally, in fact, any pre-Hilbert category suffices. We characterise closed subobjects, and prove that they form orthomodular lattices. This shows that quantum logic is just an incarnation of categorical logic, enabling us to establish an existential quantifier for quantum logic, and conclude that there cannot be a universal quantifier.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  6. Use of redundant sets of landmark information by humans (Homo sapiens) in a goal-searching task in an open field and on a computer screen.

    Science.gov (United States)

    Sekiguchi, Katsuo; Ushitani, Tomokazu; Sawa, Kosuke

    2018-05-01

    Landmark-based goal-searching tasks that were similar to those for pigeons (Ushitani & Jitsumori, 2011) were provided to human participants to investigate whether they could learn and use multiple sources of spatial information that redundantly indicate the position of a hidden target in both an open field (Experiment 1) and on a computer screen (Experiments 2 and 3). During the training in each experiment, participants learned to locate a target in 1 of 25 objects arranged in a 5 × 5 grid, using two differently colored, arrow-shaped (Experiments 1 and 2) or asymmetrically shaped (Experiment 3) landmarks placed adjacent to the goal and pointing to the goal location. The absolute location and directions of the landmarks varied across trials, but the constant configuration of the goal and the landmarks enabled participants to find the goal using both global configural information and local vector information (pointing to the goal by each individual landmark). On subsequent test trials, the direction was changed for one of the landmarks to conflict with the global configural information. Results of Experiment 1 indicated that participants used vector information from a single landmark but not configural information. Further examinations revealed that the use of global (metric) information was enhanced remarkably by goal searching with nonarrow-shaped landmarks on the computer monitor (Experiment 3) but much less so with arrow-shaped landmarks (Experiment 2). The General Discussion focuses on a comparison between humans in the current study and pigeons in the previous study. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  8. Relationships among Individual Task Self-Efficacy, Self-Regulated Learning Strategy Use and Academic Performance in a Computer-Supported Collaborative Learning Environment

    Science.gov (United States)

    Wilson, Kimberly; Narayan, Anupama

    2016-01-01

    This study investigates relationships between self-efficacy, self-regulated learning strategy use and academic performance. Participants were 96 undergraduate students working on projects with three subtasks (idea generation task, methodical task and data collection) in a blended learning environment. Task self-efficacy was measured with…

  9. Operating and maintenance experience with computer-based systems in nuclear power plants - A report by the PWG-1 Task Group on Computer-based Systems Important to Safety

    International Nuclear Information System (INIS)

    1998-01-01

    This report was prepared by the Task Group on Computer-based Systems Important to Safety of the Principal Working Group No. 1. Canada had a leading role in this study. Operating and Maintenance Experience with Computer-based Systems in nuclear power plants is essential for improving and upgrading against potential failures. The present report summarises the observations and findings related to the use of digital technology in nuclear power plants. It also makes recommendations for future activities in Member Countries. Continued expansion of digital technology in nuclear power reactors has resulted in new safety and licensing issues, since the existing licensing review criteria were mainly based on the analogue devices used when the plants were designed. On the industry side, a consensus approach is needed to help stabilise and standardise the treatment of digital installations and upgrades while ensuring safety and reliability. On the regulatory side, new guidelines and regulatory requirements are needed to assess digital upgrades. Upgrades or new installation issues always involve the potential for system failures. They are addressed specifically in the 'hazard' or 'failure' analysis, and it is in this context that they ultimately are resolved in the design and addressed in licensing. Failure Analysis is normally performed in parallel with the design, verification and validation (V and V), and implementation activities of the upgrades. Current standards and guidelines in France, the U.S. and Canada recognise the importance of failure analysis in computer-based system design. Thus failure analysis is an integral part of the design and implementation process and is aimed at evaluating potential failure modes and causes of system failures. In this context, it is essential to define 'System' as the plant system affected by the upgrade, not the 'Computer' system. The identified failures would provide input to the design process in the form of design requirements or design

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  11. Task Group on Computer/Communication Protocols for Bibliographic Data Exchange. Interim Report = Groupe de Travail sur les Protocoles de Communication/Ordinateurs pour l'Exchange de Donnees Bibliographiques. Rapport d'Etape. May 1983.

    Science.gov (United States)

    Canadian Network Papers, 1983

    1983-01-01

    This preliminary report describes the work to date of the Task Group on Computer/Communication protocols for Bibliographic Data Interchange, which was formed in 1980 to develop a set of protocol standards to facilitate communication between heterogeneous library and information systems within the framework of Open Systems Interconnection (OSI). A…

  12. Assessing Changes in High School Students' Conceptual Understanding through Concept Maps before and after the Computer-Based Predict-Observe-Explain (CB-POE) Tasks on Acid-Base Chemistry at the Secondary Level

    Science.gov (United States)

    Yaman, Fatma; Ayas, Alipasa

    2015-01-01

    Although concept maps have been used as alternative assessment methods in education, there has been an ongoing debate on how to evaluate students' concept maps. This study discusses how to evaluate students' concept maps as an assessment tool before and after 15 computer-based Predict-Observe-Explain (CB-POE) tasks related to acid-base chemistry.…

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride, edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  20. Fine and Gross Motor Task Performance When Using Computer-Based Video Models by Students with Autism and Moderate Intellectual Disability

    Science.gov (United States)

    Mechling, Linda C.; Swindle, Catherine O.

    2013-01-01

    This investigation examined the effects of video modeling on the fine and gross motor task performance by three students with a diagnosis of moderate intellectual disability (Group 1) and by three students with a diagnosis of autism spectrum disorder (Group 2). Using a multiple probe design across three sets of tasks, the study examined the…

  1. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transfer¬ring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  3. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  6. Thermosensory reversal effect quantified

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Kappers, A.M.L.

    2008-01-01

    At room temperature, some materials feel colder than others due to differences in thermal conductivity, heat capacity and geometry. When the ambient temperature is well above skin temperature, the roles of 'cold' and 'warm' materials are reversed. In this paper, this effect is quantified by

  7. Thermosensory reversal effect quantified

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Kappers, A.M.L.

    2008-01-01

    At room temperature, some materials feel colder than others due to differences in thermal conductivity, heat capacity and geometry. When the ambient temperature is well above skin temperature, the roles of ‘cold’ and ‘warm’ materials are reversed. In this paper, this effect is quantified by

  8. Quantifying requirements volatility effects

    NARCIS (Netherlands)

    Kulk, G.P.; Verhoef, C.

    2008-01-01

    In an organization operating in the bancassurance sector we identified a low-risk IT subportfolio of 84 IT projects comprising together 16,500 function points, each project varying in size and duration, for which we were able to quantify its requirements volatility. This representative portfolio

  9. The quantified relationship

    NARCIS (Netherlands)

    Danaher, J.; Nyholm, S.R.; Earp, B.

    2018-01-01

    The growth of self-tracking and personal surveillance has given rise to the Quantified Self movement. Members of this movement seek to enhance their personal well-being, productivity, and self-actualization through the tracking and gamification of personal data. The technologies that make this

  10. Quantifying IT estimation risks

    NARCIS (Netherlands)

    Kulk, G.P.; Peters, R.J.; Verhoef, C.

    2009-01-01

    A statistical method is proposed for quantifying the impact of factors that influence the quality of the estimation of costs for IT-enabled business projects. We call these factors risk drivers as they influence the risk of the misestimation of project costs. The method can effortlessly be

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  12. Quantifying Distributional Model Risk via Optimal Transport

    OpenAIRE

    Blanchet, Jose; Murthy, Karthyek R. A.

    2016-01-01

    This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality, in particular, we provide examples involving path dependent expectations of stochastic processes. Our approach consists in computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms ...
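
    The quantity at the heart of the abstract can be written as a worst-case expectation over all models within a prescribed optimal-transport distance of a baseline model; the display below is a sketch of the standard strong-duality form of such bounds (notation assumed, not a restatement of the paper's exact theorem), with transport cost c, budget delta and quantity of interest f.

      \[
        \sup_{P \,:\, D_c(P, P_0) \le \delta} \mathbb{E}_{P}[f(X)]
        \;=\;
        \inf_{\lambda \ge 0} \Big\{ \lambda \delta
          + \mathbb{E}_{P_0}\big[\, \sup_{y} \{\, f(y) - \lambda\, c(y, X) \,\} \big] \Big\},
      \]
      where $D_c$ denotes the optimal-transport (Wasserstein-type) discrepancy induced by the cost $c$.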

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  14. ComPLuS Model: A New Insight in Pupils' Collaborative Talk, Actions and Balance during a Computer-Mediated Music Task

    Science.gov (United States)

    Nikolaidou, Georgia N.

    2012-01-01

    This exploratory work describes and analyses the collaborative interactions that emerge during computer-based music composition in the primary school. The study draws on socio-cultural theories of learning, originated within Vygotsky's theoretical context, and proposes a new model, namely Computer-mediated Praxis and Logos under Synergy (ComPLuS).…

  15. Defining Elemental Imitation Mechanisms: A Comparison of Cognitive and Motor-Spatial Imitation Learning across Object- and Computer-Based Tasks

    Science.gov (United States)

    Subiaul, Francys; Zimmermann, Laura; Renner, Elizabeth; Schilder, Brian; Barr, Rachel

    2016-01-01

    During the first 5 years of life, the versatility, breadth, and fidelity with which children imitate change dramatically. Currently, there is no model to explain what underlies such significant changes. To that end, the present study examined whether task-independent but domain-specific--elemental--imitation mechanism explains performance across…

  16. Effectiveness of ESL Students' Performance by Computational Assessment and Role of Reading Strategies in Courseware-Implemented Business Translation Tasks

    Science.gov (United States)

    Tsai, Shu-Chiao

    2017-01-01

    This study reports on investigating students' English translation performance and their use of reading strategies in an elective English writing course offered to senior students of English as a Foreign Language for 100 minutes per week for 12 weeks. A courseware-implemented instruction combined with a task-based learning approach was adopted.…

  17. Job Management and Task Bundling

    Science.gov (United States)

    Berkowitz, Evan; Jansen, Gustav R.; McElvain, Kenneth; Walker-Loud, André

    2018-03-01

    High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.
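
    A hedged sketch of the bundling idea (not METAQ or mpi_jm themselves; node counts and runtimes are made up): many small, independent tasks are packed into one large allocation, and idle nodes are backfilled greedily as earlier tasks finish. The sketch assumes every task fits on the partition.

      from dataclasses import dataclass, field
      import heapq

      @dataclass(order=True)
      class Running:
          finish_time: float
          nodes: int = field(compare=False)

      def bundle_schedule(tasks, total_nodes):
          """tasks: list of (nodes_needed, runtime). Returns the makespan of the bundle."""
          pending = sorted(tasks, key=lambda t: -t[0])   # biggest-first helps packing
          free, clock, running = total_nodes, 0.0, []
          while pending or running:
              # Backfill: start every pending task that fits in the currently idle nodes.
              still_pending = []
              for nodes, runtime in pending:
                  if nodes <= free:
                      heapq.heappush(running, Running(clock + runtime, nodes))
                      free -= nodes
                  else:
                      still_pending.append((nodes, runtime))
              pending = still_pending
              # Advance to the next task completion and reclaim its nodes.
              if running:
                  done = heapq.heappop(running)
                  clock = done.finish_time
                  free += done.nodes
          return clock

      # Toy usage: 100 small measurement-style tasks bundled onto a 512-node partition
      tasks = [(16, 1.0)] * 80 + [(64, 2.5)] * 20
      print(bundle_schedule(tasks, total_nodes=512))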

  18. Job Management and Task Bundling

    Directory of Open Access Journals (Sweden)

    Berkowitz Evan

    2018-01-01

    Full Text Available High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users’ current workflows or executables.

  19. Quantifying global exergy resources

    International Nuclear Information System (INIS)

    Hermann, Weston A.

    2006-01-01

    Exergy is used as a common currency to assess and compare the reservoirs of theoretically extractable work we call energy resources. Resources consist of matter or energy with properties different from the predominant conditions in the environment. These differences can be classified as physical, chemical, or nuclear exergy. This paper identifies the primary exergy reservoirs that supply exergy to the biosphere and quantifies the intensive and extensive exergy of their derivative secondary reservoirs, or resources. The interconnecting accumulations and flows among these reservoirs are illustrated to show the path of exergy through the terrestrial system from input to its eventual natural or anthropogenic destruction. The results are intended to assist in evaluation of current resource utilization, help guide fundamental research to enable promising new energy technologies, and provide a basis for comparing the resource potential of future energy options that is independent of technology and cost

  20. Researching participants taking IELTS Academic Writing Task 2 (AWT2) in paper mode and in computer mode in terms of score equivalence, cognitive validity and other factors

    OpenAIRE

    Chan, Sathena; Bax, Stephen; Weir, Cyril

    2017-01-01

    Computer-based (CB) assessment is becoming more common in most university disciplines, and international language testing bodies now routinely use computers for many areas of English language assessment. Given that, in the near future, IELTS also will need to move towards offering CB options alongside traditional paper-based (PB) modes, the research reported here prepares for that possibility, building on research carried out some years ago which investigated the statistical comparability of ...

  1. Quantifying the Adaptive Cycle.

    Directory of Open Access Journals (Sweden)

    David G Angeler

    Full Text Available The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994-2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  2. Quantifying Anthropogenic Dust Emissions

    Science.gov (United States)

    Webb, Nicholas P.; Pierre, Caroline

    2018-02-01

    Anthropogenic land use and land cover change, including local environmental disturbances, moderate rates of wind-driven soil erosion and dust emission. These human-dust cycle interactions impact ecosystems and agricultural production, air quality, human health, biogeochemical cycles, and climate. While the impacts of land use activities and land management on aeolian processes can be profound, the interactions are often complex and assessments of anthropogenic dust loads at all scales remain highly uncertain. Here, we critically review the drivers of anthropogenic dust emission and current evaluation approaches. We then identify and describe opportunities to: (1) develop new conceptual frameworks and interdisciplinary approaches that draw on ecological state-and-transition models to improve the accuracy and relevance of assessments of anthropogenic dust emissions; (2) improve model fidelity and capacity for change detection to quantify anthropogenic impacts on aeolian processes; and (3) enhance field research and monitoring networks to support dust model applications to evaluate the impacts of disturbance processes on local to global-scale wind erosion and dust emissions.

  3. Mobile Thread Task Manager

    Science.gov (United States)

    Clement, Bradley J.; Estlin, Tara A.; Bornstein, Benjamin J.

    2013-01-01

    The Mobile Thread Task Manager (MTTM) is being applied to parallelizing existing flight software to understand the benefits and to develop new techniques and architectural concepts for adapting software to multicore architectures. It allocates and load-balances tasks for a group of threads that migrate across processors to improve cache performance. In order to balance load across threads, the MTTM augments a basic map-reduce strategy to draw jobs from a global queue. In a multicore processor, memory may be "homed" to the cache of a specific processor and must be accessed from that processor. The MTTM architecture wraps access to data with thread management to move threads to the home processor for that data so that the computation follows the data in an attempt to avoid L2 cache misses. Cache homing is also handled by a memory manager that translates identifiers to processor IDs where the data will be homed (according to rules defined by the user). The user can also specify the number of threads and processors separately, which is important for tuning performance for different patterns of computation and memory access. MTTM efficiently processes tasks in parallel on a multiprocessor computer. It also provides an interface to make it easier to adapt existing software to a multiprocessor environment.
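
    A structural sketch of the scheme described above (not the MTTM code; the hash-based homing rule and worker layout are illustrative assumptions): each datum is homed to one worker, and jobs are routed to the queue of the worker that owns their data, so the computation follows the data.

      import queue
      import threading

      N_WORKERS = 4
      home_queues = [queue.Queue() for _ in range(N_WORKERS)]
      results = queue.Queue()

      def home_of(data_id):
          return hash(data_id) % N_WORKERS          # simple homing rule (assumption)

      def worker(wid):
          while True:
              item = home_queues[wid].get()
              if item is None:                      # poison pill: shut down
                  break
              data_id, payload = item
              results.put((data_id, sum(payload)))  # the computation follows the data
              home_queues[wid].task_done()

      threads = [threading.Thread(target=worker, args=(w,)) for w in range(N_WORKERS)]
      for t in threads: t.start()

      jobs = {f"block{i}": list(range(i, i + 10)) for i in range(16)}
      for data_id, payload in jobs.items():
          home_queues[home_of(data_id)].put((data_id, payload))   # route job to its home

      for q in home_queues: q.put(None)
      for t in threads: t.join()
      while not results.empty():
          print(results.get())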

  4. Integrated cosmological probes: concordance quantified

    Energy Technology Data Exchange (ETDEWEB)

    Nicola, Andrina; Amara, Adam; Refregier, Alexandre, E-mail: andrina.nicola@phys.ethz.ch, E-mail: adam.amara@phys.ethz.ch, E-mail: alexandre.refregier@phys.ethz.ch [Department of Physics, ETH Zürich, Wolfgang-Pauli-Strasse 27, CH-8093 Zürich (Switzerland)

    2017-10-01

    Assessing the consistency of parameter constraints derived from different cosmological probes is an important way to test the validity of the underlying cosmological model. In an earlier work [1], we computed constraints on cosmological parameters for ΛCDM from an integrated analysis of CMB temperature anisotropies and CMB lensing from Planck, galaxy clustering and weak lensing from SDSS, weak lensing from DES SV as well as Type Ia supernovae and Hubble parameter measurements. In this work, we extend this analysis and quantify the concordance between the derived constraints and those derived by the Planck Collaboration as well as WMAP9, SPT and ACT. As a measure for consistency, we use the Surprise statistic [2], which is based on the relative entropy. In the framework of a flat ΛCDM cosmological model, we find all data sets to be consistent with one another at a level of less than 1σ. We highlight that the relative entropy is sensitive to inconsistencies in the models that are used in different parts of the analysis. In particular, inconsistent assumptions for the neutrino mass break its invariance on the parameter choice. When consistent model assumptions are used, the data sets considered in this work all agree with each other and ΛCDM, without evidence for tensions.
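
    A minimal sketch of the ingredient underlying the Surprise statistic (not the papers' pipeline; the Gaussian-posterior approximation and the toy numbers are assumptions): the relative entropy between two Gaussian parameter posteriors, which the Surprise then compares against its expectation under the hypothesis that both data sets measure the same parameters.

      import numpy as np

      def gaussian_relative_entropy(mu1, cov1, mu2, cov2):
          """D_KL( N(mu2, cov2) || N(mu1, cov1) ) in nats."""
          k = len(mu1)
          inv1 = np.linalg.inv(cov1)
          diff = np.asarray(mu2) - np.asarray(mu1)
          return 0.5 * (np.trace(inv1 @ cov2)
                        + diff @ inv1 @ diff
                        - k
                        + np.log(np.linalg.det(cov1) / np.linalg.det(cov2)))

      # Toy usage: two 2-parameter constraints that mostly agree
      mu_a, cov_a = np.array([0.30, 0.80]), np.diag([0.02, 0.03]) ** 2
      mu_b, cov_b = np.array([0.31, 0.83]), np.diag([0.03, 0.04]) ** 2
      d_obs = gaussian_relative_entropy(mu_a, cov_a, mu_b, cov_b)
      print(f"relative entropy: {d_obs:.2f} nats")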

  5. Crowdsourcing for quantifying transcripts: An exploratory study.

    Science.gov (United States)

    Azzam, Tarek; Harman, Elena

    2016-02-01

    This exploratory study attempts to demonstrate the potential utility of crowdsourcing as a supplemental technique for quantifying transcribed interviews. Crowdsourcing is the harnessing of the abilities of many people to complete a specific task or a set of tasks. In this study multiple samples of crowdsourced individuals were asked to rate and select supporting quotes from two different transcripts. The findings indicate that the different crowdsourced samples produced nearly identical ratings of the transcripts, and were able to consistently select the same supporting text from the transcripts. These findings suggest that crowdsourcing, with further development, can potentially be used as a mixed method tool to offer a supplemental perspective on transcribed interviews. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  7. Analytical model for computing transient pressures and forces in the safety/relief valve discharge line. Mark I Containment Program, task number 7.1.2

    International Nuclear Information System (INIS)

    Wheeler, A.J.

    1978-02-01

    An analytical model is described that computes the transient pressures, velocities and forces in the safety/relief valve discharge line immediately after safety/relief valve opening. Equations of motion are defined for the gas-flow and water-flow models. Results are not only verified by comparing them with an earlier version of the model, but also with Quad Cities and Monticello plant data. The model shows reasonable agreement with the earlier model and the plant data

  8. The Fallacy of Quantifying Risk

    Science.gov (United States)

    2012-09-01

    Defense AT&L: September–October 2012 18 The Fallacy of Quantifying Risk David E. Frick, Ph.D. Frick is a 35-year veteran of the Department of...a key to risk analysis was “choosing the right technique” of quantifying risk . The weakness in this argument stems not from the assertion that one...of information about the enemy), yet achiev- ing great outcomes. Attempts at quantifying risk are not, in and of themselves, objectionable. Prudence

  9. Development and evaluation of a computer-based medical work assessment programme

    Directory of Open Access Journals (Sweden)

    Spallek Michael

    2008-12-01

    Full Text Available Abstract Background There are several ways to conduct a job task analysis in medical work environments, including pencil-and-paper observations, interviews and questionnaires. However, these methods entail bias problems such as high inter-individual deviations and risks of misjudgement. Computer-based observation helps to reduce these problems. The aim of this paper is to give an overview of the development process of a computer-based job task analysis instrument for real-time observations to quantify the job tasks performed by physicians working in different medical settings. In addition, reliability and validity data for this instrument are presented. Methods The instrument was developed in consecutive steps. First, lists comprising tasks performed by physicians in different care settings were compiled and classified. Afterwards, the content validity of the task lists was verified. After establishing the final task categories, computer software was programmed and implemented on a mobile personal computer. Finally, inter-observer reliability was evaluated: two trained observers simultaneously recorded the tasks of the same physician. Results Content validity of the task lists was confirmed by observations and experienced specialists of each medical area. The development process of the job task analysis instrument was completed successfully, and simultaneous records showed adequate inter-rater reliability. Conclusion Initial results of this analysis supported the validity and reliability of the developed method for assessing physicians' working routines as well as organizational context factors. Based on results obtained with this method, possible improvements for health professionals' work organisation can be identified.
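
    One common way to express the inter-observer reliability mentioned above for categorical task codes is Cohen's kappa; the index actually used in the study is not stated here, so the sketch below is only a hedged illustration with made-up task categories.

    ```python
    # Hedged illustration: Cohen's kappa for two observers assigning task categories
    # to the same sequence of observation intervals. The category labels are made up.
    from collections import Counter

    obs_a = ["documentation", "patient_talk", "phone", "documentation", "exam", "exam"]
    obs_b = ["documentation", "patient_talk", "documentation", "documentation", "exam", "exam"]

    def cohens_kappa(a, b):
        assert len(a) == len(b)
        n = len(a)
        p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
        freq_a, freq_b = Counter(a), Counter(b)
        p_exp = sum(freq_a[c] * freq_b[c] for c in set(a) | set(b)) / n ** 2  # chance agreement
        return (p_obs - p_exp) / (1 - p_exp)

    print(f"kappa = {cohens_kappa(obs_a, obs_b):.2f}")
    ```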

  10. Improving our understanding of multi-tasking in healthcare: Drawing together the cognitive psychology and healthcare literature.

    Science.gov (United States)

    Douglas, Heather E; Raban, Magdalena Z; Walter, Scott R; Westbrook, Johanna I

    2017-03-01

    Multi-tasking is an important skill for clinical work which has received limited research attention. Its impacts on clinical work are poorly understood. In contrast, there is substantial multi-tasking research in cognitive psychology, driver distraction, and human-computer interaction. This review synthesises evidence of the extent and impacts of multi-tasking on efficiency and task performance from healthcare and non-healthcare literature, to compare and contrast approaches, identify implications for clinical work, and to develop an evidence-informed framework for guiding the measurement of multi-tasking in future healthcare studies. The results showed healthcare studies using direct observation have focused on descriptive studies to quantify concurrent multi-tasking and its frequency in different contexts, with limited study of impact. In comparison, non-healthcare studies have applied predominantly experimental and simulation designs, focusing on interleaved and concurrent multi-tasking, and testing theories of the mechanisms by which multi-tasking impacts task efficiency and performance. We propose a framework to guide the measurement of multi-tasking in clinical settings that draws together lessons from these siloed research efforts. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Quantum tasks in Minkowski space

    International Nuclear Information System (INIS)

    Kent, Adrian

    2012-01-01

    The fundamental properties of quantum information and its applications to computing and cryptography have been greatly illuminated by considering information-theoretic tasks that are provably possible or impossible within non-relativistic quantum mechanics. I describe here a general framework for defining tasks within (special) relativistic quantum theory and illustrate it with examples from relativistic quantum cryptography and relativistic distributed quantum computation. The framework gives a unified description of all tasks previously considered and also defines a large class of new questions about the properties of quantum information in relation to Minkowski causality. It offers a way of exploring interesting new fundamental tasks and applications, and also highlights the scope for a more systematic understanding of the fundamental information-theoretic properties of relativistic quantum theory. (paper)

  12. The effects of stimulus modality and task integrality: Predicting dual-task performance and workload from single-task levels

    Science.gov (United States)

    Hart, S. G.; Shively, R. J.; Vidulich, M. A.; Miller, R. C.

    1986-01-01

    The influence of stimulus modality and task difficulty on workload and performance was investigated. The goal was to quantify the cost (in terms of response time and experienced workload) incurred when essentially serial task components shared common elements (e.g., the response to one initiated the other) which could be accomplished in parallel. The experimental tasks were based on the Fittsberg paradigm; the solution to a Sternberg-type memory task determines which of two identical Fitts targets is acquired. Previous research suggested that such functionally integrated dual tasks are performed with substantially less workload and faster response times than would be predicted by summing single-task components when both are presented in the same stimulus modality (visual). The physical integration of task elements was varied (although their functional relationship remained the same) to determine whether dual-task facilitation would persist if task components were presented in different sensory modalities. Again, it was found that the cost of performing the two-stage task was considerably less than the sum of component single-task levels when both were presented visually. Less facilitation was found when task elements were presented in different sensory modalities. These results suggest the importance of distinguishing between concurrent tasks that compete for limited resources and those that beneficially share common resources when selecting the stimulus modalities for information displays.

  13. Multidominance, ellipsis, and quantifier scope

    NARCIS (Netherlands)

    Temmerman, Tanja Maria Hugo

    2012-01-01

    This dissertation provides a novel perspective on the interaction between quantifier scope and ellipsis. It presents a detailed investigation of the scopal interaction between English negative indefinites, modals, and quantified phrases in ellipsis. One of the crucial observations is that a negative

  14. Unsupervised segmentation of task activated regions in fMRI

    DEFF Research Database (Denmark)

    Røge, Rasmus; Madsen, Kristoffer Hougaard; Schmidt, Mikkel Nørgaard

    2015-01-01

    Functional Magnetic Resonance Imaging has become a central measuring modality to quantify functional activation of the brain in both task and rest. Most analyses used to quantify functional activation require supervised approaches, as employed in statistical parametric mapping (SPM), to extract... framework for the analysis of task fMRI and resting-state data in general where strong knowledge of how the task induces a BOLD response is missing....

  15. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  16. Respiratory sinus arrhythmia responses to cognitive tasks : effects of task factors and RSA indices

    NARCIS (Netherlands)

    Overbeek, T.; Boxtel, van Anton; Westerink, J.H.D.M.

    2014-01-01

    Many studies show that respiratory sinus arrhythmia (RSA) decreases while performing cognitive tasks. However, there is uncertainty about the role of contaminating factors such as physical activity and stress-inducing task variables. Different methods to quantify RSA may also contribute to variable

  17. Respiratory sinus arrhythmia responses to cognitive tasks: Effects of task factors and RSA indices

    NARCIS (Netherlands)

    Overbeek, T.J.M.; van Boxtel, A.; Westerink, J.H.D.M.

    2014-01-01

    Many studies show that respiratory sinus arrhythmia (RSA) decreases while performing cognitive tasks. However, there is uncertainty about the role of contaminating factors such as physical activity and stress-inducing task variables. Different methods to quantify RSA may also contribute to variable

  18. Human computer interactions in next-generation of aircraft smart navigation management systems: task analysis and architecture under an agent-oriented methodological approach.

    Science.gov (United States)

    Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B

    2015-03-04

    The limited efficiency of current air traffic systems will require a next-generation of Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However efforts for developing such tools need to be inspired on a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper is focused on airborne HCI into SATS where cockpit inputs came from aircraft navigation systems, surrounding traffic situation, controllers' indications, etc. So the HCI is intended to enhance situation awareness and decision-making through pilot cockpit. This work approach considers SATS as a system distributed on a large-scale with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.

  19. Human Computer Interactions in Next-Generation of Aircraft Smart Navigation Management Systems: Task Analysis and Architecture under an Agent-Oriented Methodological Approach

    Science.gov (United States)

    Canino-Rodríguez, José M.; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G.; Travieso-González, Carlos; Alonso-Hernández, Jesús B.

    2015-01-01

    The limited efficiency of current air traffic systems will require a next-generation of Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However efforts for developing such tools need to be inspired on a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper is focused on airborne HCI into SATS where cockpit inputs came from aircraft navigation systems, surrounding traffic situation, controllers’ indications, etc. So the HCI is intended to enhance situation awareness and decision-making through pilot cockpit. This work approach considers SATS as a system distributed on a large-scale with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications. PMID:25746092

  20. Human Computer Interactions in Next-Generation of Aircraft Smart Navigation Management Systems: Task Analysis and Architecture under an Agent-Oriented Methodological Approach

    Directory of Open Access Journals (Sweden)

    José M. Canino-Rodríguez

    2015-03-01

    Full Text Available The limited efficiency of current air traffic systems will require a next-generation of Smart Air Traffic System (SATS that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI for performing these activities is a key element of SATS. However efforts for developing such tools need to be inspired on a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper is focused on airborne HCI into SATS where cockpit inputs came from aircraft navigation systems, surrounding traffic situation, controllers’ indications, etc. So the HCI is intended to enhance situation awareness and decision-making through pilot cockpit. This work approach considers SATS as a system distributed on a large-scale with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.

  1. A P300-based Brain-Computer Interface with Stimuli on Moving Objects: Four-Session Single-Trial and Triple-Trial Tests with a Game-Like Task Design

    Science.gov (United States)

    Ganin, Ilya P.; Shishkin, Sergei L.; Kaplan, Alexander Y.

    2013-01-01

    Brain-computer interfaces (BCIs) are tools for controlling computers and other devices without using muscular activity, employing user-controlled variations in signals recorded from the user’s brain. One of the most efficient noninvasive BCIs is based on the P300 wave of the brain’s response to stimuli and is therefore referred to as the P300 BCI. Many modifications of this BCI have been proposed to further improve the BCI’s characteristics or to better adapt the BCI to various applications. However, in the original P300 BCI and in all of its modifications, the spatial positions of stimuli were fixed relative to each other, which can impose constraints on designing applications controlled by this BCI. We designed and tested a P300 BCI with stimuli presented on objects that were freely moving on a screen at a speed of 5.4°/s. Healthy participants practiced a game-like task with this BCI in either single-trial or triple-trial mode within four sessions. At each step, the participants were required to select one of nine moving objects. The mean online accuracy of BCI-based selection was 81% in the triple-trial mode and 65% in the single-trial mode. A relatively high P300 amplitude was observed in response to targets in most participants. Self-rated interest in the task was high and stable over the four sessions (the medians in the 1st/4th sessions were 79/84% and 76/71% in the groups practicing in the single-trial and triple-trial modes, respectively). We conclude that the movement of stimulus positions relative to each other may not prevent the efficient use of the P300 BCI by people controlling their gaze, e.g., in robotic devices and in video games. PMID:24302977

  2. Kokkos' Task DAG Capabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Harold C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ibanez, Daniel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report documents the ASC/ATDM Kokkos deliverable "Production Portable Dynamic Task DAG Capability." This capability enables applications to create and execute a dynamic task DAG: a collection of heterogeneous computational tasks with a directed acyclic graph (DAG) of "execute after" dependencies, where tasks and their dependencies are dynamically created and destroyed as tasks execute. The Kokkos task scheduler executes the dynamic task DAG on the target execution resource, e.g. a multicore CPU, a manycore CPU such as Intel's Knights Landing (KNL), or an NVIDIA GPU. Several major technical challenges had to be addressed during development of Kokkos' Task DAG capability: (1) portability to a GPU with its simplified hardware and micro-runtime, (2) thread-scalable memory allocation and deallocation from a bounded pool of memory, (3) a thread-scalable scheduler for the dynamic task DAG, and (4) usability by applications.
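
    As a conceptual companion to the capability described above (this is not the Kokkos C++ API, only a minimal Python sketch of the idea), a dynamic task DAG lets tasks declare "execute after" dependencies and spawn further tasks while they run.

    ```python
    # Minimal sketch of a dynamic task DAG scheduler: tasks declare "execute after"
    # dependencies and may spawn new tasks during execution. Conceptual only.
    from collections import deque

    class Scheduler:
        def __init__(self):
            self.waiting = {}       # task_id -> (fn, set of unfinished dependencies)
            self.ready = deque()
            self.done = set()
            self._next_id = 0

        def spawn(self, fn, after=()):
            tid = self._next_id
            self._next_id += 1
            deps = {d for d in after if d not in self.done}
            if deps:
                self.waiting[tid] = (fn, deps)
            else:
                self.ready.append((tid, fn))
            return tid

        def run(self):
            while self.ready or self.waiting:
                if not self.ready:
                    raise RuntimeError("cyclic or unsatisfiable dependencies")
                tid, fn = self.ready.popleft()
                fn(self)                            # a task may spawn more tasks here
                self.done.add(tid)
                for wid, (wfn, deps) in list(self.waiting.items()):
                    deps.discard(tid)
                    if not deps:                    # all dependencies finished
                        del self.waiting[wid]
                        self.ready.append((wid, wfn))

    sched = Scheduler()
    a = sched.spawn(lambda s: print("A"))
    b = sched.spawn(lambda s: print("B after A"), after=(a,))
    sched.spawn(lambda s: (print("C after A and B"),
                           s.spawn(lambda s2: print("D spawned by C"))),
                after=(a, b))
    sched.run()
    ```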

  3. A passive brain-computer interface application for the mental workload assessment on professional air traffic controllers during realistic air traffic control tasks.

    Science.gov (United States)

    Aricò, P; Borghini, G; Di Flumeri, G; Colosimo, A; Pozzi, S; Babiloni, F

    2016-01-01

    In the last decades, the passive brain-computer interface (p-BCI) has been a fast-growing concept in the neuroscience field. p-BCI systems make it possible to improve the human-machine interaction (HMI) in operational environments by using the covert brain activity (eg, mental workload) of the operator. However, p-BCI technology could suffer from some practical issues when used outside the laboratories. In particular, one of the most important limitations is the necessity to recalibrate the p-BCI system each time before its use, to avoid a significant reduction of its reliability in the detection of the considered mental states. The objective of the proposed study was to provide an example of p-BCIs used to evaluate the users' mental workload in a real operational environment. For this purpose, through the facilities provided by the École Nationale de l'Aviation Civile of Toulouse (France), the cerebral activity of 12 professional air traffic control officers (ATCOs) was recorded while they performed highly realistic air traffic management scenarios. From the analysis of the ATCOs' brain activity (electroencephalographic signal-EEG) and the subjective workload perception (instantaneous self-assessment) provided by both the examined ATCOs and external air traffic control experts, it was possible to estimate and evaluate the variation of the mental workload under which the controllers were operating. The results showed (i) a highly significant correlation between the neurophysiological and the subjective workload assessments, and (ii) a high reliability over time (up to a month) of the proposed algorithm, which was also able to maintain high discrimination accuracies by using a low number of EEG electrodes (~3 EEG channels). In conclusion, the proposed methodology demonstrated the suitability of p-BCI systems in operational environments and the advantages of the neurophysiological measures with respect to the subjective ones. © 2016 Elsevier B.V. All rights reserved.

  4. Quantifiers in Russian Sign Language

    NARCIS (Netherlands)

    Kimmelman, V.; Paperno, D.; Keenan, E.L.

    2017-01-01

    After presenting some basic genetic, historical and typological information about Russian Sign Language, this chapter outlines the quantification patterns it expresses. It illustrates various semantic types of quantifiers, such as generalized existential, generalized universal, proportional,

  5. Quantified Self in de huisartsenpraktijk

    NARCIS (Netherlands)

    de Groot, Martijn; Timmers, Bart; Kooiman, Thea; van Ittersum, Miriam

    2015-01-01

    Quantified Self stands for the self-measuring person. The number of people who walk into the care process with self-generated health data is going to grow in the coming years. Various kinds of activity trackers and health applications for the smartphone make it relatively easy to ... personal

  6. Quantifying information leakage of randomized protocols

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Legay, Axel; Malacaria, Pasquale

    2015-01-01

    The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the usage of Markovian processes to model deterministic and probabilistic systems. By using a methodology generalizing the lattice of information approach we model refined attackers... capable of observing the internal behavior of the system, and quantify the information leakage of such systems. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed and non-timed attacks...
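
    For a fixed prior and channel, Shannon-entropy leakage reduces to the mutual information between the secret and the observable. The sketch below illustrates that quantity only; it does not reproduce the paper's Markovian models or its channel-capacity algorithm, and the numbers are invented.

    ```python
    # Hedged sketch: Shannon-entropy leakage of a probabilistic channel, computed
    # as mutual information I(secret; observation) for a given prior and channel.
    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def leakage(prior, channel):
        """prior[i] = P(secret=i); channel[i, j] = P(observation=j | secret=i)."""
        joint = prior[:, None] * channel                 # P(secret, observation)
        return entropy(prior) + entropy(joint.sum(axis=0)) - entropy(joint)

    prior = np.array([0.25, 0.25, 0.25, 0.25])           # uniform 2-bit secret
    channel = np.array([[0.9, 0.1],                      # noisy observation of one bit
                        [0.9, 0.1],
                        [0.1, 0.9],
                        [0.1, 0.9]])
    print(f"leakage = {leakage(prior, channel):.3f} of {entropy(prior):.0f} bits")
    ```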

  7. Influence of dual-tasking with different levels of attention diversion on characteristics of the movement-related cortical potential.

    Science.gov (United States)

    Aliakbaryhosseinabadi, Susan; Kamavuako, Ernest Nlandu; Jiang, Ning; Farina, Dario; Mrachacz-Kersting, Natalie

    2017-11-01

    Dual tasking is defined as performing two tasks concurrently and has been shown to have a significant effect on attention directed to the performance of the main task. In this study, an attention diversion task with two different levels was administered while participants had to complete a cue-based motor task consisting of foot dorsiflexion. An auditory oddball task with two levels of complexity was implemented to divert the user's attention. Electroencephalographic (EEG) recordings were made from nine single channels. Event-related potentials (ERPs) confirmed that the oddball task of counting a sequence of two tones decreased the auditory P300 amplitude more than the oddball task of counting one target tone among three different tones. Pre-movement features quantified from the movement-related cortical potential (MRCP) changed significantly between single and dual-task conditions in motor and fronto-central channels. There was a significant delay in movement detection for the case of single tone counting in two motor channels only (237.1-247.4 ms). For the task of sequence counting, motor cortex and frontal channels showed a significant delay in MRCP detection (232.1-250.5 ms). This study investigated the effect of attention diversion in dual-task conditions by analysing both ERPs and MRCPs in single channels. The higher attention diversion led to a significant reduction in specific MRCP features of the motor task. These results suggest that attention division in dual-tasking situations plays an important role in movement execution and detection. This has important implications for designing real-time brain-computer interface systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Quantifying Pollutant Emissions from Office Equipment Phase IReport

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.; McKone, T.E.; Perino, C.

    2006-12-01

    Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment

  9. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  10. Quantification of baseline pupillary response and task-evoked pupillary response during constant and incremental task load.

    Science.gov (United States)

    Mosaly, Prithima R; Mazur, Lukasz M; Marks, Lawrence B

    2017-10-01

    The methods employed to quantify the baseline pupil size and task-evoked pupillary response (TEPR) may affect the overall study results. To test this hypothesis, the objective of this study was to assess variability in baseline pupil size and TEPR during two basic working memory tasks: a constant load of 3-letter memorisation-recall (10 trials), and incremental load memorisation-recall (two trials of each load level), using two commonly used methods: (1) change from a trial/load-specific baseline, and (2) change from a constant baseline. Results indicated that there was a significant shift in baseline between the trials for constant load, and between the load levels for incremental load. The TEPR was independent of shifts in baseline using method 1 only for constant load, and method 2 only for higher levels of the incremental load condition. These important findings suggest that the assessment of both the baseline and the methods used to quantify TEPR is critical in ergonomics applications, especially in studies with a small number of trials per subject per condition. Practitioner Summary: Quantification of TEPR can be affected by shifts in baseline pupil size that are most likely affected by non-cognitive factors when other external factors are kept constant. Therefore, the quantification methods employed to compute both baseline and TEPR are critical to understanding the information processing of humans in practical ergonomics settings.
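
    A minimal sketch of the two baseline conventions compared in the record, applied to synthetic pupil traces (array shapes, window lengths, and drift values are assumptions), is shown below; with a slow baseline drift, the trial-specific baseline removes the drift while the constant baseline absorbs it into the TEPR.

    ```python
    # Hedged sketch: trial-specific vs. constant baseline correction of the
    # task-evoked pupillary response (TEPR). All numbers are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, n_samples = 10, 200
    baseline_win = 50                         # samples treated as pre-stimulus baseline
    trials = 3.0 + 0.05 * np.arange(n_trials)[:, None] \
             + rng.normal(0, 0.02, (n_trials, n_samples))   # slow drift across trials
    trials[:, baseline_win:] += 0.15                         # task-evoked dilation

    # Method 1: change from a trial-specific baseline (each trial's own pre-window).
    trial_baseline = trials[:, :baseline_win].mean(axis=1, keepdims=True)
    tepr_trialwise = (trials[:, baseline_win:] - trial_baseline).mean(axis=1)

    # Method 2: change from one constant baseline (here, the first trial's pre-window).
    constant_baseline = trials[0, :baseline_win].mean()
    tepr_constant = (trials[:, baseline_win:] - constant_baseline).mean(axis=1)

    print("trial-specific baseline TEPR:", np.round(tepr_trialwise, 3))
    print("constant baseline TEPR:     ", np.round(tepr_constant, 3))
    ```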

  11. Respiratory sinus arrhythmia responses to cognitive tasks: effects of task factors and RSA indices.

    Science.gov (United States)

    Overbeek, Thérèse J M; van Boxtel, Anton; Westerink, Joyce H D M

    2014-05-01

    Many studies show that respiratory sinus arrhythmia (RSA) decreases while performing cognitive tasks. However, there is uncertainty about the role of contaminating factors such as physical activity and stress-inducing task variables. Different methods to quantify RSA may also contribute to variable results. In 83 healthy subjects, we studied RSA responses to a working memory task requiring varying levels of cognitive control and a perceptual attention task not requiring strong cognitive control. RSA responses were quantified in the time and frequency domain and were additionally corrected for differences in mean interbeat interval and respiration rate, resulting in eight different RSA indices. The two tasks were clearly differentiated by heart rate and facial EMG reference measures. Cognitive control induced inhibition of RSA whereas perceptual attention generally did not. However, the results show several differences between different RSA indices, emphasizing the importance of methodological variables. Age and sex did not influence the results. Copyright © 2014 Elsevier B.V. All rights reserved.
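
    The record contrasts time- and frequency-domain RSA indices. A hedged sketch of two common choices (a peak-valley estimate per breath and high-frequency heart-rate-variability power), using synthetic interbeat intervals and an assumed 4-second breathing cycle, is given below.

    ```python
    # Hedged sketch of two RSA indices from an evenly resampled interbeat-interval
    # (IBI) series. The IBI series and breath length below are synthetic assumptions.
    import numpy as np
    from scipy.signal import welch

    fs = 4.0                                     # Hz, resampling rate of the IBI series
    t = np.arange(0, 300, 1 / fs)                # 5 minutes of data
    ibi = 0.85 + 0.04 * np.sin(2 * np.pi * 0.25 * t) \
          + 0.01 * np.random.default_rng(1).normal(size=t.size)   # RSA at 0.25 Hz

    # Frequency-domain index: power in the HF band (0.15-0.40 Hz).
    f, pxx = welch(ibi - ibi.mean(), fs=fs, nperseg=256)
    band = (f >= 0.15) & (f <= 0.40)
    hf_power = np.trapz(pxx[band], f[band])

    # Time-domain index: mean peak-valley IBI difference per (assumed) 4-s breath.
    breath = int(4 * fs)
    segments = ibi[: ibi.size // breath * breath].reshape(-1, breath)
    peak_valley = np.mean(segments.max(axis=1) - segments.min(axis=1))

    print(f"HF power ~ {hf_power:.5f} s^2, peak-valley RSA ~ {peak_valley * 1000:.1f} ms")
    ```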

  12. AUTOMATION PROGRAM FOR RECOGNITION OF ALGORITHM SOLUTION OF MATHEMATIC TASK

    Directory of Open Access Journals (Sweden)

    Denis N. Butorin

    2014-01-01

    Full Text Available The article describes a technology for managing testing tasks in a computer program developed to recognize the algorithm by which a mathematical task is solved. The use of a hierarchical structure for a special set of testing questions is justified, and the implementation of the described tasks in the computer program openSEE is presented.

  13. AUTOMATION PROGRAM FOR RECOGNITION OF ALGORITHM SOLUTION OF MATHEMATIC TASK

    OpenAIRE

    Denis N. Butorin

    2014-01-01

    The article describes a technology for managing testing tasks in a computer program developed to recognize the algorithm by which a mathematical task is solved. The use of a hierarchical structure for a special set of testing questions is justified, and the implementation of the described tasks in the computer program openSEE is presented.

  14. Quantifier spreading: children misled by ostensive cues

    Directory of Open Access Journals (Sweden)

    Katalin É. Kiss

    2017-04-01

    Full Text Available This paper calls attention to a methodological problem of acquisition experiments. It shows that the economy of the stimulus employed in child language experiments may lend an increased ostensive effect to the message communicated to the child. Thus, when the visual stimulus in a sentence-picture matching task is a minimal model abstracting away from the details of the situation, children often regard all the elements of the stimulus as ostensive clues to be represented in the corresponding sentence. The use of such minimal stimuli is mistaken when the experiment aims to test whether or not a certain element of the stimulus is relevant for the linguistic representation or interpretation. The paper illustrates this point by an experiment involving quantifier spreading. It is claimed that children find a universally quantified sentence like 'Every girl is riding a bicycle' to be a false description of a picture showing three girls riding bicycles and a solo bicycle because they are misled to believe that all the elements in the visual stimulus are relevant, hence all of them are to be represented by the corresponding linguistic description. When the iconic drawings were replaced by photos taken in a natural environment rich in accidental details, the occurrence of quantifier spreading was radically reduced. It is shown that an extra object in the visual stimulus can lead to the rejection of the sentence also in the case of sentences involving no quantification, which gives further support to the claim that the source of the problem is not (or not only) the grammatical or cognitive difficulty of quantification but the unintended ostensive effect of the extra object. This article is part of the special collection: Acquisition of Quantification

  15. Quantifying Leg Movement Activity During Sleep.

    Science.gov (United States)

    Ferri, Raffaele; Fulda, Stephany

    2016-12-01

    Currently, 2 sets of similar rules for recording and scoring leg movement (LM) exist, including periodic LM during sleep (PLMS) and periodic LM during wakefulness. The former were published in 2006 by a task force of the International Restless Legs Syndrome Study Group, and the second in 2007 by the American Academy of Sleep Medicine. This article reviews the basic recording methods, scoring rules, and computer-based programs for PLMS. Less frequent LM activities, such as alternating leg muscle activation, hypnagogic foot tremor, high-frequency LMs, and excessive fragmentary myoclonus are briefly described. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Effects of damping-off caused by Rhizoctonia solani anastomosis group 2-1 on roots of wheat and oil seed rape quantified using X-ray Computed Tomography and real-time PCR

    Directory of Open Access Journals (Sweden)

    Craig J. Sturrock

    2015-06-01

    Full Text Available Rhizoctonia solani is a plant pathogenic fungus that causes significant establishment and yield losses to several important food crops globally. This is the first application of high resolution X-ray micro Computed Tomography (X-ray µCT and real-time PCR to study host-pathogen interactions in situ and elucidate the mechanism of Rhizoctonia damping-off disease over a 6-day period caused by R. solani, anastomosis group (AG 2-1 in wheat (Triticum aestivum cv. Gallant and oil seed rape (OSR, Brassica napus cv. Marinka. Temporal, non-destructive analysis of root system architectures was performed using RooTrak and validated by the destructive method of root washing. Disease was assessed visually and related to pathogen DNA quantification in soil using real-time PCR. R. solani AG2-1 at similar initial DNA concentrations in soil was capable of causing significant damage to the developing root systems of both wheat and OSR. Disease caused reductions in primary root number, root volume, root surface area and convex hull which were affected less in the monocotyledonous host. Wheat was more tolerant to the pathogen, exhibited fewer symptoms and developed more complex root system. In contrast, R. solani caused earlier damage and maceration of the taproot of the dicot, OSR. Disease severity was related to pathogen DNA accumulation in soil only for OSR, however reductions in root traits were significantly associated with both disease and pathogen DNA. The method offers the first steps in advancing current understanding of soil-borne pathogen behaviour in situ at the pore scale, which may lead to the development of mitigation measures to combat disease influence in the field.

  17. Satellite Tasking via a Tablet Computer

    Science.gov (United States)

    2015-09-01

    Nomenclature (excerpt): velocity vector of point B with respect to frame A; angular momentum vector; rate of change of the angular momentum vector; ... external torque vector; spacecraft angular velocity vector; spacecraft angular acceleration vector; total CMG momentum. ... and Google's Android uses Java. Moreover, an application developed for one operating system cannot be used on the other. There was a choice between

  18. Quantifier spreading in child eye movements: A case of the Russian quantifier kazhdyj ‘every'

    Directory of Open Access Journals (Sweden)

    Irina A. Sekerina

    2017-07-01

    Full Text Available Extensive cross-linguistic work has documented that children up to the age of 9-10 make errors when performing a sentence-picture verification task that pairs spoken sentences with the universal quantifier 'every' and pictures with entities in partial one-to-one correspondence. These errors stem from children's difficulties in restricting the domain of a universal quantifier to the appropriate noun phrase and are referred to in the literature as 'quantifier-spreading' (q-spreading). We adapted the task to be performed in conjunction with eye-movement recordings using the Visual World Paradigm. Russian-speaking 5-to-6-year-old children (N = 31) listened to sentences like 'Kazhdyj alligator lezhit v vanne' ('Every alligator is lying in a bathtub') and viewed pictures with three alligators, each in a bathtub, and two extra empty bathtubs. Non-spreader children (N = 12) were adult-like in their accuracy, whereas q-spreading ones (N = 19) were only 43% correct in interpreting such sentences compared to the control sentences. Eye movements of q-spreading children revealed that more looks to the extra containers (two empty bathtubs) correlated with higher error rates, reflecting the processing pattern of q-spreading. In contrast, more looks to the distractors in control sentences did not lead to errors in interpretation. We argue that q-spreading errors are caused by interference from the extra entities in the visual context, and our results support the processing-difficulty account of the acquisition of quantification. Interference results in cognitive overload as children have to integrate multiple sources of information, i.e., the visual context with salient extra entities and the spoken sentence in which these entities are mentioned, in real-time processing. This article is part of the special collection: Acquisition of Quantification

  19. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  20. Rules and more rules: the effects of multiple tasks, extensive training, and aging on task-switching performance.

    Science.gov (United States)

    Buchler, Norbou G; Hoyer, William J; Cerella, John

    2008-06-01

    Task-switching performance was assessed in young and older adults as a function of the number of task sets to be actively maintained in memory (varied from 1 to 4) over the course of extended training (5 days). Each of the four tasks required the execution of a simple computational algorithm, which was instantaneously cued by the color of the two-digit stimulus. Tasks were presented in pure (task set size 1) and mixed blocks (task set sizes 2, 3, 4), and the task sequence was unpredictable. By considering task switching beyond two tasks, we found evidence for a cognitive control system that is not overwhelmed by task set size load manipulations. Extended training eliminated age effects in task-switching performance, even when the participants had to manage the execution of up to four tasks. The results are discussed in terms of current theories of cognitive control, including task set inertia and production system postulates.
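
    The pure versus mixed blocks described above lend themselves to the cost measures conventionally used in task-switching work; the exact formulas used in this study are not given here, so the sketch below is only a hedged illustration with synthetic reaction times.

    ```python
    # Hedged illustration: switch cost and mixing cost from trial-level reaction
    # times. Data values and the single synthetic participant are assumptions.
    import pandas as pd

    trials = pd.DataFrame({
        "block":  ["pure"] * 4 + ["mixed"] * 6,
        "switch": [False] * 4 + [False, True, True, False, True, False],
        "rt_ms":  [610, 590, 605, 600, 720, 840, 860, 705, 855, 715],
    })

    pure_rt   = trials.loc[trials["block"] == "pure", "rt_ms"].mean()
    repeat_rt = trials.loc[(trials["block"] == "mixed") & (~trials["switch"]), "rt_ms"].mean()
    switch_rt = trials.loc[(trials["block"] == "mixed") & (trials["switch"]), "rt_ms"].mean()

    print(f"mixing cost = {repeat_rt - pure_rt:.0f} ms")    # cost of holding extra task sets
    print(f"switch cost = {switch_rt - repeat_rt:.0f} ms")  # cost of changing task on this trial
    ```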

  1. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2014-01-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of the dependence on the final human error probability (HEP). This paper presents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP) method. The proposed method is able to quantify the subjective judgment from the experts and improve the transparency in the HEP evaluation process. Two examples are illustrated to show the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
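
    The CHEP computation referenced above derives from THERP's dependence equations; the sketch below reproduces only those standard textbook expressions for the five conventional dependence levels, not the paper's factor-weighting model.

    ```python
    # Minimal sketch: THERP conditional human error probability (CHEP) given a
    # nominal HEP and an assessed dependence level.
    def therp_chep(hep, dependence):
        formulas = {
            "zero":     lambda p: p,
            "low":      lambda p: (1 + 19 * p) / 20,
            "moderate": lambda p: (1 + 6 * p) / 7,
            "high":     lambda p: (1 + p) / 2,
            "complete": lambda p: 1.0,
        }
        return formulas[dependence](hep)

    nominal_hep = 0.003
    for level in ("zero", "low", "moderate", "high", "complete"):
        print(f"{level:9s} dependence -> CHEP = {therp_chep(nominal_hep, level):.4f}")
    ```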

  2. Quantifying and simulating human sensation

    DEFF Research Database (Denmark)

    Quantifying and simulating human sensation – relating science and technology of indoor climate research. In his doctoral thesis from 1970, civil engineer Povl Ole Fanger proposed that the understanding of indoor climate should focus on the comfort of the individual rather than averaged... this understanding of human sensation was adjusted to technology. I will look into the construction of the equipment, what it measures and the relationship between theory, equipment and tradition.

  3. Quantifying emissions from spontaneous combustion

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-09-01

    Spontaneous combustion can be a significant problem in the coal industry, not only due to the obvious safety hazard and the potential loss of valuable assets, but also with respect to the release of gaseous pollutants, especially CO2, from uncontrolled coal fires. This report reviews methodologies for measuring emissions from spontaneous combustion and discusses methods for quantifying, estimating and accounting for the purpose of preparing emission inventories.

  4. Quantifying and mapping spatial variability in simulated forest plots

    Science.gov (United States)

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands were developed of regularly-spaced plantations of loblolly pine (Pinus taeda L.). We assumed no affects of competition or mortality, but random variability was added to individual tree characteristics...

  5. Cognitive task analysis

    NARCIS (Netherlands)

    Schraagen, J.M.C.

    2000-01-01

    Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the

  6. Convolutional neural networks and face recognition task

    Science.gov (United States)

    Sochenkova, A.; Sochenkov, I.; Makovetskii, A.; Vokhmintsev, A.; Melnikov, A.

    2017-09-01

    Computer vision tasks have remained very important over the last several years. One of the most complicated problems in computer vision is face recognition, which can be used in security systems to provide safety and to identify a person among others. There is a variety of different approaches to solving this task, but there is still no universal solution that gives adequate results in all cases. The current paper presents the following approach. First, we extract an area containing a face; then we apply a Canny edge detector. In the next stage we use convolutional neural networks (CNN) to finally solve the face recognition and person identification task.
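
    A hedged sketch of such a pipeline (not the authors' implementation): OpenCV provides the face crop and Canny edge map, and a small PyTorch CNN classifies identities. The network shape, input size, and identity count are illustrative assumptions, and the weights are untrained.

    ```python
    # Hedged sketch: face crop -> Canny edges -> small CNN classifier (PyTorch).
    import cv2
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyFaceNet(nn.Module):
        def __init__(self, n_identities=10):           # identity count is an assumption
            super().__init__()
            self.conv1 = nn.Conv2d(1, 16, 3, padding=1)
            self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
            self.fc = nn.Linear(32 * 16 * 16, n_identities)

        def forward(self, x):                           # x: (batch, 1, 64, 64) edge maps
            x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # -> (batch, 16, 32, 32)
            x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # -> (batch, 32, 16, 16)
            return self.fc(x.flatten(1))

    def edge_input(image_bgr, face_rect):
        """Crop a face box (from any detector), resize, and return its Canny edge map."""
        x, y, w, h = face_rect
        gray = cv2.cvtColor(image_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(cv2.resize(gray, (64, 64)), 100, 200)
        return torch.from_numpy(edges).float().div(255).unsqueeze(0).unsqueeze(0)

    # Usage (untrained weights, illustrative only):
    # logits = TinyFaceNet()(edge_input(cv2.imread("person.jpg"), (30, 40, 120, 120)))
    ```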

  7. Planning and task management in Parkinson's disease: differential emphasis in dual-task performance.

    Science.gov (United States)

    Bialystok, Ellen; Craik, Fergus I M; Stefurak, Taresa

    2008-03-01

    Seventeen patients diagnosed with Parkinson's disease completed a complex computer-based task that involved planning and management while also performing an attention-demanding secondary task. The tasks were performed concurrently, but it was necessary to switch from one to the other. Performance was compared to a group of healthy age-matched control participants and a group of young participants. Parkinson's patients performed better than the age-matched controls on almost all measures and as well as the young controls in many cases. However, the Parkinson's patients achieved this by paying relatively less attention to the secondary task and focusing attention more on the primary task. Thus, Parkinson's patients can apparently improve their performance on some aspects of a multidimensional task by simplifying task demands. This benefit may occur as a consequence of their inflexible exaggerated attention to some aspects of a complex task to the relative neglect of other aspects.

  8. Single-Task and Dual-Task Gait Among Collegiate Athletes of Different Sport Classifications: Implications for Concussion Management.

    Science.gov (United States)

    Howell, David R; Oldham, Jessie R; DiFabio, Melissa; Vallabhajosula, Srikant; Hall, Eric E; Ketcham, Caroline J; Meehan, William P; Buckley, Thomas A

    2017-02-01

    Gait impairments have been documented following sport-related concussion. Whether preexisting gait pattern differences exist among athletes who participate in different sport classifications, however, remains unclear. Dual-task gait examinations probe the simultaneous performance of everyday tasks (ie, walking and thinking), and can quantify gait performance using inertial sensors. The purpose of this study was to compare the single-task and dual-task gait performance of collision/contact and noncontact athletes. A group of collegiate athletes (n = 265) were tested before their season at 3 institutions (mean age= 19.1 ± 1.1 years). All participants stood still (single-task standing) and walked while simultaneously completing a cognitive test (dual-task gait), and completed walking trials without the cognitive test (single-task gait). Spatial-temporal gait parameters were compared between collision/contact and noncontact athletes using MANCOVAs; cognitive task performance was compared using ANCOVAs. No significant single-task or dual-task gait differences were found between collision/contact and noncontact athletes. Noncontact athletes demonstrated higher cognitive task accuracy during single-task standing (P = .001) and dual-task gait conditions (P = .02) than collision/contact athletes. These data demonstrate the utility of a dual-task gait assessment outside of a laboratory and suggest that preinjury cognitive task performance during dual-tasks may differ between athletes of different sport classifications.

  9. Quantifying Quantum-Mechanical Processes.

    Science.gov (United States)

    Hsieh, Jen-Hsiang; Chen, Shih-Hsuan; Li, Che-Ming

    2017-10-19

    The act of describing how a physical process changes a system is the basis for understanding observed phenomena. For quantum-mechanical processes in particular, the effect of processes on quantum states profoundly advances our knowledge of the natural world, from understanding counter-intuitive concepts to the development of wholly quantum-mechanical technology. Here, we show that quantum-mechanical processes can be quantified using a generic classical-process model through which any classical strategies of mimicry can be ruled out. We demonstrate the success of this formalism using fundamental processes postulated in quantum mechanics, the dynamics of open quantum systems, quantum-information processing, the fusion of entangled photon pairs, and the energy transfer in a photosynthetic pigment-protein complex. Since our framework does not depend on any specifics of the states being processed, it reveals a new class of correlations in the hierarchy between entanglement and Einstein-Podolsky-Rosen steering and paves the way for the elaboration of a generic method for quantifying physical processes.

  10. Task-focused modeling in automated agriculture

    Science.gov (United States)

    Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack

    1993-01-01

    Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.

  11. Data Used in Quantified Reliability Models

    Science.gov (United States)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

    Data is the crux of developing quantitative risk and reliability models; without the data there is no quantification. The means to find and identify reliability data or failure numbers to quantify fault tree models during conceptual and design phases is often the quagmire that precludes early decision makers' consideration of potential risk drivers that will influence design. The analyst tasked with addressing a system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have available data similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only place checked. A common philosophy is "I have a number - that is good enough". But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data was derived, and interpreting the information and details associated with the data, is as important as the data itself.

  12. Task demand, task management, and teamwork

    Energy Technology Data Exchange (ETDEWEB)

    Braarud, Per Oeivind; Brendryen, Haavar

    2001-03-15

    The current approach to mental workload assessment in process control was evaluated in 3 previous HAMMLAB studies, by analysing the relationship between workload-related measures and performance. The results showed that subjective task complexity rating was related to the team's control room performance, that mental effort (NASA-TLX) was weakly related to performance, and that overall activity level was unrelated to performance. The results support the argument that general cognitive measures, i.e., mental workload, are weakly related to performance in the process control domain. This implies that other workload concepts than general mental workload are needed for valid assessment of human reliability and for valid assessment of control room configurations. An assessment of task load in process control suggested that how effort is used to handle task demand is more important than the level of effort invested to solve the task. The report suggests two main workload-related concepts with a potential as performance predictors in process control: task requirements, and the work style describing how effort is invested to solve the task. The task requirements are seen as composed of individual task demand and team demand. In a similar way, work style is seen as composed of individual task management and teamwork style. A framework for the development of the concepts is suggested based on a literature review and experiences from HAMMLAB research. It is suggested that operational definitions of workload concepts should be based on observable control room behaviour, to assure a potential for developing performance-shaping factors. Finally, an explorative analysis of teamwork measures and performance in one study indicated that teamwork concepts are related to performance. This lends support to the suggested development of team demand and teamwork style as elements of a framework for the analysis of workload in process control. (Author)

  13. Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track

    Science.gov (United States)

    2015-11-20

    Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track Paul N. Bennett Microsoft Research Redmond, USA pauben...anchor text graph has proven useful in the general realm of query reformulation [2], we sought to quantify the value of extracting key phrases from...anchor text in the broader setting of the task understanding track. Given a query, our approach considers a simple method for identifying a relevant

  14. Project Tasks in Robotics

    DEFF Research Database (Denmark)

    Sørensen, Torben; Hansen, Poul Erik

    1998-01-01

    Description of the compulsory project tasks to be carried out as a part of DTU course 72238 Robotics.

  15. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.

  16. Power plant process computer

    International Nuclear Information System (INIS)

    Koch, R.

    1982-01-01

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line with respect to real-time requirements but not closed-loop with respect to control. The general scope of tasks is: - alarm annunciation on CRTs - data logging - data recording for post-trip reviews and plant behaviour analysis - nuclear data computation - graphic displays. Process computers are additionally used for dedicated tasks such as the aeroball measuring system and the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)

  17. Objective threshold for distinguishing complicated tasks

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    Estimating the likelihood of human error in a reliable manner is essential for enhancing the safety of large process control systems such as Nuclear Power Plants (NPPs). In this regard, from the point of view of Probabilistic Safety Assessment (PSA), various kinds of Human Reliability Analysis (HRA) methods have been used for several decades to systematically evaluate the effect of human error on the safety of NPPs. However, one recurring issue is how to determine the level of an important Performance Shaping Factor (PSF) in a clear and objective manner with respect to the context of a given task. Unfortunately, no such criterion exists for certain PSFs, such as the complexity of a task. For this reason, in this study, an objective criterion that helps identify a complicated task is suggested based on the Task Complexity (TACOM) measure. To this end, subjective difficulty scores rated by high speed train drivers are collected and then compared with the associated TACOM scores quantified for the tasks those drivers have to conduct. As a result, it is expected that high speed train drivers feel significant difficulty when they are faced with tasks whose TACOM scores are greater than 4.2. Since the TACOM measure is a general tool to quantify the complexity of tasks to be done by human operators, it is reasonable to conclude that this value can be regarded as a common threshold representing what a complicated task is.
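
    The drivers' rating data are not reproduced here; a sketch of how a difficulty threshold on a complexity score could be located from such data (synthetic scores centred on the reported 4.2, and a simple balanced-accuracy sweep rather than the authors' actual statistical procedure) is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: TACOM-like complexity scores and binary "felt difficult" labels.
tacom = rng.uniform(2.0, 6.0, size=200)
p_difficult = 1.0 / (1.0 + np.exp(-3.0 * (tacom - 4.2)))   # assumed ground truth centred at 4.2
difficult = rng.random(200) < p_difficult

# Sweep candidate thresholds and keep the one with the best balanced accuracy.
best_thr, best_score = None, -1.0
for thr in np.arange(2.0, 6.0, 0.05):
    pred = tacom > thr
    sens = np.mean(pred[difficult])        # hit rate on tasks rated difficult
    spec = np.mean(~pred[~difficult])      # correct rejection rate on the rest
    score = 0.5 * (sens + spec)
    if score > best_score:
        best_thr, best_score = thr, score

print(f"estimated difficulty threshold ~ {best_thr:.2f} (balanced accuracy {best_score:.2f})")
```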

  18. Task assignment and coaching

    OpenAIRE

    Dominguez-Martinez, S.

    2009-01-01

    An important task of a manager is to motivate her subordinates. One way in which a manager can give incentives to junior employees is through the assignment of tasks. How a manager allocates tasks in an organization provides information to the junior employees about his ability. Without coaching from a manager, the junior employee only has information about his past performance. Based on his past performance, a talented junior who has performed a difficult task sometimes decides to leave the...

  19. Functional Task Test (FTT)

    Science.gov (United States)

    Bloomberg, Jacob J.; Mulavara, Ajitkumar; Peters, Brian T.; Rescheke, Millard F.; Wood, Scott; Lawrence, Emily; Koffman, Igor; Ploutz-Snyder, Lori; Spiering, Barry A.; Feeback, Daniel L.; hide

    2009-01-01

    This slide presentation reviews the Functional Task Test (FTT), an interdisciplinary testing regimen that has been developed to evaluate astronaut postflight functional performance and related physiological changes. The objectives of the project are: (1) to develop a set of functional tasks that represent critical mission tasks for the Constellation Program, (2) to determine the ability to perform these tasks after space flight, (3) to identify the key physiological factors that contribute to functional decrements, and (4) to use this information to develop targeted countermeasures.

  20. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on a comparison of the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies as well as with the increase in the number of tasks for whose effective solution computational thinking is required. The author substantiates the affirmation that including «computational thinking» in the set of pedagogical concepts which are used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  1. Task assignment and coaching

    NARCIS (Netherlands)

    Dominguez-Martinez, S.

    2009-01-01

    An important task of a manager is to motivate her subordinates. One way in which a manager can give incentives to junior employees is through the assignment of tasks. How a manager allocates tasks in an organization provides information to the junior employees about his ability. Without coaching

  2. Discovery of high-level tasks in the operating room

    NARCIS (Netherlands)

    Bouarfa, L.; Jonker, P.P.; Dankelman, J.

    2010-01-01

    Recognizing and understanding surgical high-level tasks from sensor readings is important for surgical workflow analysis. Surgical high-level task recognition is also a challenging task in ubiquitous computing because of the inherent uncertainty of sensor data and the complexity of the operating

  3. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for the support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task.

  4. Quantifying Evaporation in a Permeable Pavement System

    Science.gov (United States)

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...

  5. Quantifying sound quality in loudspeaker reproduction

    NARCIS (Netherlands)

    Beerends, John G.; van Nieuwenhuizen, Kevin; van den Broek, E.L.

    2016-01-01

    We present PREQUEL: Perceptual Reproduction Quality Evaluation for Loudspeakers. Instead of quantifying the loudspeaker system itself, PREQUEL quantifies the overall loudspeakers' perceived sound quality by assessing their acoustic output using a set of music signals. This approach introduces a

  6. A formalization of computational trust

    NARCIS (Netherlands)

    Güven - Ozcelebi, C.; Holenderski, M.J.; Ozcelebi, T.; Lukkien, J.J.

    2018-01-01

    Computational trust aims to quantify trust and is studied by many disciplines including computer science, social sciences and business science. We propose a formal computational trust model, including its parameters and operations on these parameters, as well as a step by step guide to compute trust

  7. Quantifier Scope in Categorical Compositional Distributional Semantics

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Sadrzadeh

    2016-08-01

    Full Text Available In previous work with J. Hedges, we formalised a generalised quantifiers theory of natural language in categorical compositional distributional semantics with the help of bialgebras. In this paper, we show how quantifier scope ambiguity can be represented in that setting and how this representation can be generalised to branching quantifiers.

  8. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  9. A Heuristic Task Scheduling Algorithm for Heterogeneous Virtual Clusters

    OpenAIRE

    Weiwei Lin; Wentai Wu; James Z. Wang

    2016-01-01

    Cloud computing provides on-demand computing and storage services with high performance and high scalability. However, the rising energy consumption of cloud data centers has become a prominent problem. In this paper, we first introduce an energy-aware framework for task scheduling in virtual clusters. The framework consists of a task resource requirements prediction module, an energy estimate module, and a scheduler with a task buffer. Secondly, based on this framework, we propose a virtual ...
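
    The prediction and energy-estimate modules are only named in this abstract; a toy greedy scheduler that assigns each task to the virtual machine with the lowest estimated incremental energy (all machine and task parameters below are hypothetical) could look like:

```python
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    idle_power_w: float          # power drawn even when idle (unused in this toy model)
    joules_per_mi: float         # energy per million instructions (efficiency)
    queued_mi: float = 0.0       # work already assigned

@dataclass
class Task:
    name: str
    mi: float                    # predicted task size in million instructions

def incremental_energy(vm: VM, task: Task) -> float:
    # Energy to execute the task itself; idle power is deliberately ignored here.
    return task.mi * vm.joules_per_mi

def schedule(tasks, vms):
    plan = []
    for t in sorted(tasks, key=lambda task: task.mi, reverse=True):   # largest tasks first
        # Small load-balancing term breaks ties so one VM does not absorb everything.
        best = min(vms, key=lambda vm: incremental_energy(vm, t) + 0.001 * vm.queued_mi)
        best.queued_mi += t.mi
        plan.append((t.name, best.name))
    return plan

vms = [VM("vm-a", 60, 0.9), VM("vm-b", 45, 1.2)]
tasks = [Task("t1", 500), Task("t2", 120), Task("t3", 800)]
print(schedule(tasks, vms))
```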

  10. Quantifying the vitamin D economy.

    Science.gov (United States)

    Heaney, Robert P; Armas, Laura A G

    2015-01-01

    Vitamin D enters the body through multiple routes and in a variety of chemical forms. Utilization varies with input, demand, and genetics. Vitamin D and its metabolites are carried in the blood on a Gc protein that has three principal alleles with differing binding affinities and ethnic prevalences. Three major metabolites are produced, which act via two routes, endocrine and autocrine/paracrine, and in two compartments, extracellular and intracellular. Metabolic consumption is influenced by physiological controls, noxious stimuli, and tissue demand. When administered as a supplement, varying dosing schedules produce major differences in serum metabolite profiles. To understand vitamin D's role in human physiology, it is necessary both to identify the foregoing entities, mechanisms, and pathways and, specifically, to quantify them. This review was performed to delineate the principal entities and transitions involved in the vitamin D economy, summarize the status of present knowledge of the applicable rates and masses, draw inferences about functions that are implicit in these quantifications, and point out implications for the determination of adequacy. © The Author(s) 2014. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  11. Quantify the complexity of turbulence

    Science.gov (United States)

    Tao, Xingtian; Wu, Huixuan

    2017-11-01

    Many researchers have used Reynolds stress, power spectrum and Shannon entropy to characterize a turbulent flow, but few have measured the complexity of turbulence. Yet as this study shows, conventional turbulence statistics and Shannon entropy have limits when quantifying the flow complexity. Thus, it is necessary to introduce new complexity measures, such as topology complexity and excess information, to describe turbulence. Our test flow is a classic turbulent cylinder wake at Reynolds number 8100. Along the stream-wise direction, the flow becomes more isotropic and the magnitudes of the normal Reynolds stresses decrease monotonically. These observations seem to indicate that the flow dynamics becomes simpler downstream. However, the Shannon entropy keeps increasing along the flow direction and the dynamics seems to become more complex, because the large-scale vortices cascade to small eddies, and the flow is less correlated and more unpredictable. In fact, these two contradictory observations each partially describe the complexity of a turbulent wake. Our measurements (up to 40 diameters downstream of the cylinder) show that the flow's degree of complexity actually increases at first and then becomes constant (or drops slightly) along the stream-wise direction. University of Kansas General Research Fund.

  12. Quantifying Cancer Risk from Radiation.

    Science.gov (United States)

    Keil, Alexander P; Richardson, David B

    2017-12-06

    Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposures. We describe and illustrate an approach to estimate population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projecting outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on the use of modeling assumptions. © 2017 Society for Risk Analysis.

  13. Quantifying China's regional economic complexity

    Science.gov (United States)

    Gao, Jian; Zhou, Tao

    2018-02-01

    China has experienced an outstanding economic expansion during the past decades; however, literature on non-monetary metrics that reveal the status of China's regional economic development is still lacking. In this paper, we fill this gap by quantifying the economic complexity of China's provinces through analyzing 25 years of firm data. First, we estimate the regional economic complexity index (ECI) and show that the overall time evolution of the provinces' ECI is relatively stable and slow. Then, after linking ECI to economic development and income inequality, we find that the explanatory power of ECI is positive for the former but negative for the latter. Next, we compare different measures of economic diversity and explore their relationships with monetary macroeconomic indicators. Results show that the ECI and the non-linear-iteration-based Fitness index are comparable, and both have stronger explanatory power than other benchmark measures. Further multivariate regressions suggest the robustness of our results after controlling for other socioeconomic factors. Our work moves a step forward towards a better understanding of China's regional economic development and of non-monetary macroeconomic indicators.
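
    The firm-level dataset is not available here; a standard method-of-reflections sketch of an ECI-like score on a toy region-by-product presence matrix (not necessarily the authors' exact estimator) is:

```python
import numpy as np

def eci_method_of_reflections(M: np.ndarray, iterations: int = 20) -> np.ndarray:
    """Simplified economic-complexity score for regions from a binary region-by-product matrix."""
    diversity = M.sum(axis=1).astype(float)   # k_{c,0}: how many products each region has
    ubiquity = M.sum(axis=0).astype(float)    # k_{p,0}: how many regions have each product
    kc, kp = diversity.copy(), ubiquity.copy()
    for _ in range(iterations):               # alternate averaging over the bipartite network
        kc_new = (M @ kp) / diversity
        kp_new = (M.T @ kc) / ubiquity
        kc, kp = kc_new, kp_new
    return (kc - kc.mean()) / kc.std()        # standardised, ECI-like ranking of regions

# Toy presence matrix: 4 regions x 5 products.
M = np.array([[1, 1, 1, 1, 0],
              [1, 1, 0, 0, 0],
              [0, 1, 1, 0, 1],
              [1, 0, 0, 0, 0]])
print(eci_method_of_reflections(M))
```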

  14. Quantifying and Reducing Light Pollution

    Science.gov (United States)

    Gokhale, Vayujeet; Caples, David; Goins, Jordan; Herdman, Ashley; Pankey, Steven; Wren, Emily

    2018-06-01

    We describe the current level of light pollution in and around Kirksville, Missouri and around Anderson Mesa near Flagstaff, Arizona. We quantify the amount of light that is projected up towards the sky, instead of the ground, using Unihedron sky quality meters installed at various locations. We also present results from DSLR photometry of several standard stars, and compare the photometric quality of the data collected at locations with varying levels of light pollution. Presently, light fixture shields and ‘warm-colored’ lights are being installed on Truman State University’s campus in order to reduce light pollution. We discuss the experimental procedure we use to test the effectiveness of the different light fixture shields in a controlled setting inside the Del and Norma Robison Planetarium. Apart from negatively affecting the quality of the night sky for astronomers, light pollution adversely affects migratory patterns of some animals and sleep patterns in humans, increases our carbon footprint, and wastes resources and money. This problem threatens to get particularly acute with the increasing use of outdoor LED lamps. We conclude with a call to action to all professional and amateur astronomers to act against the growing nuisance of light pollution.

  15. Quantifying meniscal kinematics in dogs.

    Science.gov (United States)

    Park, Brian H; Banks, Scott A; Pozzi, Antonio

    2017-11-06

    The dog has been used extensively as an experimental model to study meniscal treatments such as meniscectomy, meniscal repair, transplantation, and regeneration. However, there is very little information on meniscal kinematics in the dog. This study used MR imaging to quantify in vitro meniscal kinematics in loaded dog knees in four distinct poses: extension, flexion, internal, and external rotation. A new method was used to track the meniscal poses along the convex and posteriorly tilted tibial plateau. Meniscal displacements were large, displacing 13.5 and 13.7 mm posteriorly on average for the lateral and medial menisci during flexion (p = 0.90). The medial anterior horn and lateral posterior horns were the most mobile structures, showing average translations of 15.9 and 15.1 mm, respectively. Canine menisci are highly mobile and exhibit movements that correlate closely with the relative tibiofemoral positions. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res.

  16. Quantifying the invasiveness of species

    Directory of Open Access Journals (Sweden)

    Robert Colautti

    2014-04-01

    Full Text Available The success of invasive species has been explained by two contrasting but non-exclusive views: (i) intrinsic factors make some species inherently good invaders; (ii) species become invasive as a result of extrinsic ecological and genetic influences such as release from natural enemies, hybridization or other novel ecological and evolutionary interactions. These viewpoints are rarely distinguished but hinge on distinct mechanisms leading to different management scenarios. To improve tests of these hypotheses of invasion success we introduce a simple mathematical framework to quantify the invasiveness of species along two axes: (i) interspecific differences in performance among native and introduced species within a region, and (ii) intraspecific differences between populations of a species in its native and introduced ranges. Applying these equations to a sample dataset of occurrences of 1,416 plant species across Europe, Argentina, and South Africa, we found that many species are common in their native range but become rare following introduction; only a few introduced species become more common. Biogeographical factors limiting spread (e.g. biotic resistance, time of invasion) therefore appear more common than those promoting invasion (e.g. enemy release). Invasiveness, as measured by occurrence data, is better explained by inter-specific variation in invasion potential than by biogeographical changes in performance. We discuss how applying these comparisons to more detailed performance data would improve hypothesis testing in invasion biology and potentially lead to more efficient management strategies.

  17. Cross-linguistic patterns in the acquisition of quantifiers

    Science.gov (United States)

    Cummins, Chris; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K.; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-01-01

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier’s specific meaning. We investigate competence with the expressions for “all,” “none,” “some,” “some…not,” and “most” in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation. PMID:27482119

  18. The Urban Forest Effects (UFORE) model: quantifying urban forest structure and functions

    Science.gov (United States)

    David J. Nowak; Daniel E. Crane

    2000-01-01

    The Urban Forest Effects (UFORE) computer model was developed to help managers and researchers quantify urban forest structure and functions. The model quantifies species composition and diversity, diameter distribution, tree density and health, leaf area, leaf biomass, and other structural characteristics; hourly volatile organic compound emissions (emissions that...

  19. Quantifying the Cumulative Impact of Differences in Care on Prostate Cancer Outcomes

    National Research Council Canada - National Science Library

    Fesinmeyer, Megan

    2007-01-01

    ... the continuum of care contribute to disparity. The second layer of this proposal is the development of a computer model that integrates the complex patterns of care and differences by race identified in the first phase in order to quantify...

  20. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    Science.gov (United States)

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  1. Robot Task Commander with Extensible Programming Environment

    Science.gov (United States)

    Hart, Stephen W (Inventor); Yamokoski, John D. (Inventor); Wightman, Brian J (Inventor); Dinh, Duy Paul (Inventor); Gooding, Dustin R (Inventor)

    2014-01-01

    A system for developing distributed robot application-level software includes a robot having an associated control module which controls motion of the robot in response to a commanded task, and a robot task commander (RTC) in networked communication with the control module over a network transport layer (NTL). The RTC includes a script engine(s) and a GUI, with a processor and a centralized library of library blocks constructed from an interpretive computer programming code and having input and output connections. The GUI provides access to a Visual Programming Language (VPL) environment and a text editor. In executing a method, the VPL is opened, a task for the robot is built from the code library blocks, and data is assigned to input and output connections identifying input and output data for each block. A task sequence(s) is sent to the control module(s) over the NTL to command execution of the task.

  2. Neural basis for generalized quantifier comprehension.

    Science.gov (United States)

    McMillan, Corey T; Clark, Robin; Moore, Peachie; Devita, Christian; Grossman, Murray

    2005-01-01

    Generalized quantifiers like "all cars" are semantically well understood, yet we know little about their neural representation. Our model of quantifier processing includes a numerosity device, operations that combine number elements and working memory. Semantic theory posits two types of quantifiers: first-order quantifiers identify a number state (e.g. "at least 3") and higher-order quantifiers additionally require maintaining a number state actively in working memory for comparison with another state (e.g. "less than half"). We used BOLD fMRI to test the hypothesis that all quantifiers recruit inferior parietal cortex associated with numerosity, while only higher-order quantifiers recruit prefrontal cortex associated with executive resources like working memory. Our findings showed that first-order and higher-order quantifiers both recruit right inferior parietal cortex, suggesting that a numerosity component contributes to quantifier comprehension. Moreover, only probes of higher-order quantifiers recruited right dorsolateral prefrontal cortex, suggesting involvement of executive resources like working memory. We also observed activation of thalamus and anterior cingulate that may be associated with selective attention. Our findings are consistent with a large-scale neural network centered in frontal and parietal cortex that supports comprehension of generalized quantifiers.

  3. Quantifying collective attention from tweet stream.

    Directory of Open Access Journals (Sweden)

    Kazutoshi Sasahara

    Full Text Available Online social media are increasingly facilitating our social interactions, thereby making available a massive "digital fossil" of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of "collective attention" on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted the attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or "tweets." Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attention, including one related to the Tohoku-oki earthquake. "Retweet" networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era.
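
    The tweet corpus itself is not reproduced; the core measurement, a Jensen-Shannon divergence between a regular (circadian) distribution of hourly tweet counts and an observed bursty one, can be sketched with synthetic counts as:

```python
import numpy as np

def jensen_shannon(p: np.ndarray, q: np.ndarray) -> float:
    """Jensen-Shannon divergence in bits between two discrete distributions over the same bins."""
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Synthetic hourly tweet counts over a day: a circadian baseline vs. a bursty day.
hours = np.arange(24)
baseline = 100 + 80 * np.sin((hours - 6) / 24 * 2 * np.pi).clip(min=0)
bursty = baseline.copy()
bursty[14:17] += 500            # a burst of collective attention in the afternoon

print(f"JSD(baseline, baseline) = {jensen_shannon(baseline, baseline):.4f} bits")
print(f"JSD(baseline, bursty)   = {jensen_shannon(baseline, bursty):.4f} bits")
```

    A larger divergence from the baseline would, in this toy setting, correspond to a more intense episode of collective attention.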

  4. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  5. Quantifying uncertainties in the structural response of SSME blades

    Science.gov (United States)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer and no real experiments were conducted. The structural response has been evaluated in terms of three variables which are natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response. Uncertainties in material properties have insignificant effects.

  6. Quantifying and Mapping Global Data Poverty.

    Science.gov (United States)

    Leidig, Mathias; Teeuw, Richard M

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.

  7. Quantifying and Mapping Global Data Poverty.

    Directory of Open Access Journals (Sweden)

    Mathias Leidig

    Full Text Available Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.
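
    The exact weighting behind the DPI is not given in these abstracts; a minimal composite-index sketch that min-max normalises a few such indicators and averages them (the country values below are invented) would be:

```python
import numpy as np

# Hypothetical indicator values for three countries:
# [internet speed (Mbit/s), internet users (%), mobile coverage (%), tertiary enrolment (%)]
countries = ["A", "B", "C"]
X = np.array([[45.0, 88.0, 99.0, 60.0],
              [ 8.0, 35.0, 80.0, 20.0],
              [ 2.0, 10.0, 55.0,  5.0]])

# Min-max normalise each indicator to [0, 1], then average into a composite score.
X_norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
composite = X_norm.mean(axis=1)          # higher score = better access to digital data

for name, score in zip(countries, composite):
    print(f"country {name}: data-access score {score:.2f} (data poverty {1 - score:.2f})")
```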

  8. Integrating human and machine intelligence in galaxy morphology classification tasks

    Science.gov (United States)

    Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl

    2018-06-01

    Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
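
    Neither the SWAP posteriors nor the GZ2 feature set are reproduced here; a toy version of the delegation idea, in which confident crowd posteriors label galaxies directly and uncertain ones are passed to a Random Forest trained on morphology-indicator stand-ins, might read:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 1000

# Synthetic stand-ins: non-parametric morphology indicators and true labels.
features = rng.normal(size=(n, 4))                       # e.g. concentration, asymmetry, ...
labels = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)

# Synthetic crowd (SWAP-like) posteriors: mostly informative, sometimes uncertain.
posterior = np.clip(labels + rng.normal(0, 0.35, size=n), 0, 1)

confident = (posterior < 0.2) | (posterior > 0.8)        # the crowd decides these
machine_pool = ~confident                                # delegate the rest to the machine

# Train the machine on the crowd-labelled confident subset.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features[confident], (posterior[confident] > 0.5).astype(int))

final = np.empty(n, dtype=int)
final[confident] = (posterior[confident] > 0.5).astype(int)
final[machine_pool] = clf.predict(features[machine_pool])

print(f"crowd handled {confident.sum()} galaxies, machine handled {machine_pool.sum()}")
print(f"overall agreement with true labels: {(final == labels).mean():.3f}")
```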

  9. Transport Task Force Leadership, Task 4

    International Nuclear Information System (INIS)

    Callen, J.D.

    1991-07-01

    The Transport Task Force (TTF) was initiated as a broad-based US magnetic fusion community activity during the fall of 1988 to focus attention on and encourage development of an increased understanding of anomalous transport in tokamaks. The overall TTF goal is to make progress on Characterizing, Understanding and Identifying how to Reduce plasma transport in tokamaks -- to CUIR transport

  10. "Photographing money" task pricing

    Science.gov (United States)

    Jia, Zhongxiang

    2018-05-01

    "Photographing money" [1]is a self-service model under the mobile Internet. The task pricing is reasonable, related to the success of the commodity inspection. First of all, we analyzed the position of the mission and the membership, and introduced the factor of membership density, considering the influence of the number of members around the mission on the pricing. Multivariate regression of task location and membership density using MATLAB to establish the mathematical model of task pricing. At the same time, we can see from the life experience that membership reputation and the intensity of the task will also affect the pricing, and the data of the task success point is more reliable. Therefore, the successful point of the task is selected, and its reputation, task density, membership density and Multiple regression of task positions, according to which a nhew task pricing program. Finally, an objective evaluation is given of the advantages and disadvantages of the established model and solution method, and the improved method is pointed out.

  11. Board Task Performance

    DEFF Research Database (Denmark)

    Minichilli, Alessandro; Zattoni, Alessandro; Nielsen, Sabina

    2012-01-01

    identify three board processes as micro-level determinants of board effectiveness. Specifically, we focus on effort norms, cognitive conflicts and the use of knowledge and skills as determinants of board control and advisory task performance. Further, we consider how two different institutional settings ... The findings show that: (i) board processes have a larger potential than demographic variables to explain board task performance; (ii) board task performance differs significantly between boards operating in different contexts; and (iii) national context moderates the relationships between board processes and board task performance.

  12. Entropy generation method to quantify thermal comfort

    Science.gov (United States)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves developing an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite element based KSU human thermal computer model is utilized as a "Computational Environmental Chamber" to conduct a series of simulations examining the human thermal responses to different environmental conditions. The outputs of the simulation, which include the human thermal responses, together with the input data consisting of environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The OTCI values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values. The PMV values are generated by using the corresponding air temperatures and vapor pressures from the computer simulation in the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study

  13. Quantum computing with trapped ions

    International Nuclear Information System (INIS)

    Haeffner, H.; Roos, C.F.; Blatt, R.

    2008-01-01

    Quantum computers hold the promise of solving certain computational tasks much more efficiently than classical computers. We review recent experimental advances towards a quantum computer with trapped ions. In particular, various implementations of qubits, quantum gates and some key experiments are discussed. Furthermore, we review some implementations of quantum algorithms such as a deterministic teleportation of quantum information and an error correction scheme

  14. Task based synthesis of serial manipulators

    Directory of Open Access Journals (Sweden)

    Sarosh Patel

    2015-05-01

    Full Text Available Computing the optimal geometric structure of manipulators is one of the most intricate problems in contemporary robot kinematics. Robotic manipulators are designed and built to perform certain predetermined tasks. There is a very close relationship between the structure of the manipulator and its kinematic performance. It is therefore important to incorporate such task requirements during the design and synthesis of the robotic manipulators. Such task requirements and performance constraints can be specified in terms of the required end-effector positions, orientations and velocities along the task trajectory. In this work, we present a comprehensive method to develop the optimal geometric structure (DH parameters) of a non-redundant six degree of freedom serial manipulator from task descriptions. Specifically, we define, develop and test a methodology to design optimal manipulator configurations based on task descriptions. This methodology is devised to investigate all possible manipulator configurations that can satisfy the task performance requirements under the imposed joint constraints. Out of all the possible structures, the structures that can reach all the task points with the required orientations are selected. Next, these candidate structures are tested to see whether they can attain end-effector velocities in arbitrary directions within the user-defined joint constraints, so that they can deliver the best kinematic performance. Additionally, the least power consuming configurations are also identified.
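
    The synthesis procedure itself is not reproduced in the abstract; the test it repeatedly relies on, forward kinematics from a candidate set of DH parameters followed by a reachability check against task points, can be sketched as follows (the 3R structure and task points are hypothetical):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [ 0,       sa,       ca,      d],
                     [ 0,        0,        0,      1]])

def end_effector(joint_angles, dh_rows):
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_rows):
        T = T @ dh_transform(theta, d, a, alpha)
    return T[:3, 3]                      # position of the end effector

# A candidate 3R structure (d, a, alpha per link) and a set of task points.
dh_rows = [(0.3, 0.0, np.pi / 2), (0.0, 0.4, 0.0), (0.0, 0.3, 0.0)]
task_points = [np.array([0.5, 0.2, 0.4]), np.array([0.9, 0.9, 0.9])]

def closest_approach(point, samples=20000, seed=0):
    """Crude Monte Carlo estimate of how closely this structure can approach a task point."""
    rng = np.random.default_rng(seed)
    best = np.inf
    for _ in range(samples):
        q = rng.uniform(-np.pi, np.pi, len(dh_rows))
        best = min(best, np.linalg.norm(end_effector(q, dh_rows) - point))
    return best

for p in task_points:
    print(p, f"closest approach ~ {closest_approach(p):.3f} m")
```

    A full synthesis loop would wrap a search over candidate DH parameter sets around checks of this kind, adding orientation and velocity tests.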

  15. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. ""Quantum computing"" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  16. Quantifying chaos for ecological stoichiometry.

    Science.gov (United States)

    Duarte, Jorge; Januário, Cristina; Martins, Nuno; Sardanyés, Josep

    2010-09-01

    The theory of ecological stoichiometry considers ecological interactions among species with different chemical compositions. Both experimental and theoretical investigations have shown the importance of species composition in the outcome of the population dynamics. A recent study of a theoretical three-species food chain model considering stoichiometry [B. Deng and I. Loladze, Chaos 17, 033108 (2007)] shows that coexistence between two consumers predating on the same prey is possible via chaos. In this work we study the topological and dynamical measures of the chaotic attractors found in such a model under ecological relevant parameters. By using the theory of symbolic dynamics, we first compute the topological entropy associated with unimodal Poincaré return maps obtained by Deng and Loladze from a dimension reduction. With this measure we numerically prove chaotic competitive coexistence, which is characterized by positive topological entropy and positive Lyapunov exponents, achieved when the first predator reduces its maximum growth rate, as happens at increasing δ1. However, for higher values of δ1 the dynamics become again stable due to an asymmetric bubble-like bifurcation scenario. We also show that a decrease in the efficiency of the predator sensitive to prey's quality (increasing parameter ζ) stabilizes the dynamics. Finally, we estimate the fractal dimension of the chaotic attractors for the stoichiometric ecological model.
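
    The Poincaré return maps of the stoichiometric model are not reproduced here; as an illustration of the kind of computation involved, the Lyapunov exponent of a generic unimodal map (the logistic map) can be estimated by averaging log|f'(x)| along an orbit:

```python
import numpy as np

def lyapunov_logistic(r: float, n_transient: int = 1000, n_iter: int = 100000) -> float:
    """Lyapunov exponent of x -> r*x*(1-x), estimated as the orbit average of log|f'(x)|."""
    x = 0.5
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        acc += np.log(abs(r * (1 - 2 * x)))
    return acc / n_iter

for r in (3.2, 3.5, 3.9):                  # periodic, periodic, chaotic regimes
    print(f"r = {r}: Lyapunov exponent ~ {lyapunov_logistic(r):+.3f}")
```

    A positive exponent (as for r = 3.9) signals chaos, mirroring how positive Lyapunov exponents and positive topological entropy are used in the study to certify chaotic coexistence.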

  17. Supporting complex search tasks

    DEFF Research Database (Denmark)

    Gäde, Maria; Hall, Mark; Huurdeman, Hugo

    2015-01-01

    , is fragmented at best. The workshop addressed the many open research questions: What are the obvious use cases and applications of complex search? What are essential features of work tasks and search tasks to take into account? And how do these evolve over time? With a multitude of information, varying from...

  18. Task leaders reports

    International Nuclear Information System (INIS)

    Loriaux, E.F.; Jehee, J.N.T.

    1995-01-01

    Report on CRP-OSS Task 4.1.1. ''Survey of existing documentation relevant to this programme's goals'' and report on CRP-OSS Task 4.1.2. ''Survey of existing Operator Support Systems and the experience with them'' are presented. 2 tabs

  19. Strategic Adaptation to Task Characteristics, Incentives, and Individual Differences in Dual-Tasking.

    Directory of Open Access Journals (Sweden)

    Christian P Janssen

    Full Text Available We investigate how good people are at multitasking by comparing behavior to a prediction of the optimal strategy for dividing attention between two concurrent tasks. In our experiment, 24 participants had to interleave entering digits on a keyboard with controlling a randomly moving cursor with a joystick. The difficulty of the tracking task was systematically varied as a within-subjects factor. Participants were also exposed to different explicit reward functions that varied the relative importance of the tracking task relative to the typing task (between-subjects). Results demonstrate that these changes in task characteristics and monetary incentives, together with individual differences in typing ability, influenced how participants chose to interleave tasks. This change in strategy then affected their performance on each task. A computational cognitive model was used to predict performance for a wide set of alternative strategies for how participants might have possibly interleaved tasks. This allowed for predictions of optimal performance to be derived, given the constraints placed on performance by the task and cognition. A comparison of human behavior with the predicted optimal strategy shows that participants behaved near optimally. Our findings have implications for the design and evaluation of technology for multitasking situations, as consideration should be given to the characteristics of the task, but also to how different users might use technology depending on their individual characteristics and their priorities.
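
    The cognitive model itself is not reproduced; a toy comparison of interleaving strategies, trading the time cost of glancing at the tracker against random-walk growth of tracking error while typing (all parameters hypothetical), could be sketched as:

```python
import numpy as np

def expected_payoff(k, n_digits=20, type_time=0.4, glance_time=0.5,
                    drift_per_sqrt_s=15.0, time_cost=0.3, error_penalty=0.25,
                    digit_reward=1.0):
    """Toy payoff for typing k digits between corrective glances at the tracker."""
    n_glances = int(np.ceil(n_digits / k))
    total_time = n_digits * type_time + n_glances * glance_time
    # Cursor error grows like a random walk while attention is on the typing task.
    mean_abs_error = drift_per_sqrt_s * np.sqrt(k * type_time)
    return digit_reward * n_digits - time_cost * total_time - error_penalty * mean_abs_error

strategies = (1, 2, 4, 5, 10, 20)
for k in strategies:
    print(f"type {k:2d} digits per glance: expected payoff {expected_payoff(k):6.2f}")
print("best strategy under these toy parameters:",
      max(strategies, key=expected_payoff), "digits per glance")
```

    Comparing observed behaviour with the strategy that maximises such a payoff is, in spirit, how near-optimality is assessed in the study.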

  20. Pointing Device Performance in Steering Tasks.

    Science.gov (United States)

    Senanayake, Ransalu; Goonetilleke, Ravindra S

    2016-06-01

    Use of touch-screen-based interactions is growing rapidly. Hence, knowing the maneuvering efficacy of touch screens relative to other pointing devices is of great importance in the context of graphical user interfaces. Movement time, accuracy, and user preferences of four pointing device settings were evaluated on a computer with 14 participants aged 20.1 ± 3.13 years. It was found that, depending on the difficulty of the task, the optimal settings differ for ballistic and visual control tasks. With a touch screen, resting the arm increased movement time for steering tasks. When both performance and comfort are considered, whether to use a mouse or a touch screen for person-computer interaction depends on the steering difficulty. Hence, an input device should be chosen based on the application, and should be optimized to match the graphical user interface. © The Author(s) 2016.

  1. Quantifying Sentiment and Influence in Blogspaces

    Energy Technology Data Exchange (ETDEWEB)

    Hui, Peter SY; Gregory, Michelle L.

    2010-07-25

    The weblog, or blog, has become a popular form of social media, through which authors can write posts, which can in turn generate feedback in the form of user comments. When considered in totality, a collection of blogs can thus be viewed as a sort of informal collection of mass sentiment and opinion. An obvious topic of interest might be to mine this collection to obtain some gauge of public sentiment over the wide variety of topics contained therein. However, the sheer size of the so-called blogosphere, combined with the fact that the subjects of posts can vary over a practically limitless number of topics, poses some serious challenges when any meaningful analysis is attempted. Namely, the fact that virtually anyone with access to the Internet can author their own blog raises the serious issue of credibility: should some blogs be considered to be more influential than others, and consequently, when gauging sentiment with respect to a topic, should some blogs be weighted more heavily than others? In addition, as new posts and comments can be made on almost a constant basis, any blog analysis algorithm must be able to handle such updates efficiently. In this paper, we give a formalization of the blog model. We give formal methods of quantifying sentiment and influence with respect to a hierarchy of topics, with the specific aim of facilitating the computation of a per-topic, influence-weighted sentiment measure. Finally, as efficiency is a specific end goal, we give upper bounds on the time required to update these values with new posts, showing that our analysis and algorithms are scalable.
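
    The paper's formal definitions are not reproduced in this abstract; a toy illustration of an influence-weighted, per-topic sentiment aggregate (with in-link-style influence scores standing in for the paper's influence measure) is:

```python
from collections import defaultdict

# Toy posts: (blog, topic, sentiment in [-1, 1]).
posts = [
    ("blogA", "energy", +0.6), ("blogA", "energy", +0.2),
    ("blogB", "energy", -0.8), ("blogB", "policy", -0.3),
    ("blogC", "policy", +0.4),
]

# Stand-in influence scores, e.g. derived from how often each blog is linked or commented on.
influence = {"blogA": 10.0, "blogB": 2.0, "blogC": 5.0}

weighted_sum = defaultdict(float)
weight_total = defaultdict(float)
for blog, topic, sentiment in posts:
    w = influence[blog]
    weighted_sum[topic] += w * sentiment
    weight_total[topic] += w

for topic in weighted_sum:
    print(f"{topic:7s}: influence-weighted sentiment {weighted_sum[topic] / weight_total[topic]:+.3f}")
```

    Because the per-topic sums and totals are additive, a new post only touches its own topic's accumulators, which is the kind of cheap incremental update the paper's efficiency bounds are about.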

  2. Quantifying motion for pancreatic radiotherapy margin calculation

    International Nuclear Information System (INIS)

    Whitfield, Gillian; Jain, Pooja; Green, Melanie; Watkins, Gillian; Henry, Ann; Stratford, Julie; Amer, Ali; Marchant, Thomas; Moore, Christopher; Price, Patricia

    2012-01-01

    Background and purpose: Pancreatic radiotherapy (RT) is limited by uncertain target motion. We quantified 3D patient/organ motion during pancreatic RT and calculated required treatment margins. Materials and methods: Cone-beam computed tomography (CBCT) and orthogonal fluoroscopy images were acquired post-RT delivery from 13 patients with locally advanced pancreatic cancer. Bony setup errors were calculated from CBCT. Inter- and intra-fraction fiducial (clip/seed/stent) motion was determined from CBCT projections and orthogonal fluoroscopy. Results: Using an off-line CBCT correction protocol, systematic (random) setup errors were 2.4 (3.2), 2.0 (1.7) and 3.2 (3.6) mm laterally (left–right), vertically (anterior–posterior) and longitudinally (cranio-caudal), respectively. Fiducial motion varied substantially. Random inter-fractional changes in mean fiducial position were 2.0, 1.6 and 2.6 mm; 95% of intra-fractional peak-to-peak fiducial motion was up to 6.7, 10.1 and 20.6 mm, respectively. Calculated clinical to planning target volume (CTV–PTV) margins were 1.4 cm laterally, 1.4 cm vertically and 3.0 cm longitudinally for 3D conformal RT, reduced to 0.9, 1.0 and 1.8 cm, respectively, if using 4D planning and online setup correction. Conclusions: Commonly used CTV–PTV margins may inadequately account for target motion during pancreatic RT. Our results indicate better immobilisation, individualised allowance for respiratory motion, online setup error correction and 4D planning would improve targeting.
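
    The authors' exact margin calculation is not given in the abstract; a sketch of the widely used van Herk-style population recipe M = 2.5*Sigma + 0.7*sigma, with systematic and random components combined in quadrature (the component values below are illustrative, not the paper's full error budget), is:

```python
import numpy as np

def van_herk_margin(systematic_mm, random_mm):
    """Population-based CTV-PTV margin recipe M = 2.5*Sigma + 0.7*sigma (per axis, in mm)."""
    Sigma = np.sqrt(np.sum(np.square(systematic_mm)))   # systematic components in quadrature
    sigma = np.sqrt(np.sum(np.square(random_mm)))       # random components in quadrature
    return 2.5 * Sigma + 0.7 * sigma

# Illustrative lateral-axis inputs (mm): bony setup error plus inter-fraction fiducial motion.
systematic = [2.4, 2.0]     # hypothetical component breakdown
random_ = [3.2, 2.0]
print(f"lateral CTV-PTV margin ~ {van_herk_margin(systematic, random_):.1f} mm")
```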

  3. Information criteria for quantifying loss of reversibility in parallelized KMC

    Energy Technology Data Exchange (ETDEWEB)

    Gourgoulias, Konstantinos, E-mail: gourgoul@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Rey-Bellet, Luc, E-mail: luc@math.umass.edu

    2017-01-01

    Parallel Kinetic Monte Carlo (KMC) is a potent tool to simulate stochastic particle systems efficiently. However, despite literature on quantifying domain decomposition errors of the particle system for this class of algorithms in the short and in the long time regime, no study yet explores and quantifies the loss of time-reversibility in Parallel KMC. Inspired by concepts from non-equilibrium statistical mechanics, we propose the entropy production per unit time, or entropy production rate, given in terms of an observable and a corresponding estimator, as a metric that quantifies the loss of reversibility. Typically, this is a quantity that cannot be computed explicitly for Parallel KMC, which is why we develop a posteriori estimators that have good scaling properties with respect to the size of the system. Through these estimators, we can connect the different parameters of the scheme, such as the communication time step of the parallelization, the choice of the domain decomposition, and the computational schedule, with its performance in controlling the loss of reversibility. From this point of view, the entropy production rate can be seen both as an information criterion to compare the reversibility of different parallel schemes and as a tool to diagnose reversibility issues with a particular scheme. As a demonstration, we use Sandia Lab's SPPARKS software to compare different parallelization schemes and different domain (lattice) decompositions.

  4. Quantifying forecast quality of IT business value

    NARCIS (Netherlands)

    Eveleens, J.L.; van der Pas, M.; Verhoef, C.

    2012-01-01

    This article discusses how to quantify the forecasting quality of IT business value. We address a common economic indicator often used to determine the business value of project proposals, the Net Present Value (NPV). To quantify the forecasting quality of IT business value, we develop a generalized

  5. A Task-driven Grammar Refactoring Algorithm

    Directory of Open Access Journals (Sweden)

    Ivan Halupka

    2012-01-01

    Full Text Available This paper presents our proposal and the implementation of an algorithm for automated refactoring of context-free grammars. Rather than operating under some domain-specific task, in our approach refactoring is performed on the basis of a refactoring task defined by its user. The algorithm and the corresponding refactoring system are called mARTINICA. mARTINICA is able to refactor grammars of arbitrary size and structural complexity. However, the computation time needed to perform a refactoring task with the desired outcome is highly dependent on the size of the grammar. Until now, we have successfully performed refactoring tasks on small and medium-size grammars of Pascal-like languages and parts of the Algol-60 programming language grammar. This paper also briefly introduces the reader to processes occurring in grammar refactoring, a method for describing desired properties that a refactored grammar should fulfill, and there is a discussion of the overall significance of grammar refactoring.

  6. Generic cognitive adaptations to task interference in task switching

    NARCIS (Netherlands)

    Poljac, E.; Bekkering, H.

    2009-01-01

    The present study investigated how the activation of previous tasks interferes with the execution of future tasks as a result of temporal manipulations. Color and shape matching tasks were organized in runs of two trials each. The tasks were specified by a cue presented before a task run, cueing

  7. Energy Efficient Task Light

    DEFF Research Database (Denmark)

    Logadottir, Asta; Ardkapan, Siamak Rahimi; Johnsen, Kjeld

    2014-01-01

    The objective of this work is to develop a task light for office lighting that fulfils the minimum requirements of the European standard EN 12464-1: Light and lighting – Lighting of work places, Part 1: Indoor workplaces and the Danish standard DS 700: Lys og belysning i arbejdsrum, or more...... specifically the requirements that apply to the work area and the immediate surrounding area. By providing a task light that fulfils the requirements for task lighting and the immediate surrounding area, the general lighting only needs to provide the illuminance levels required for background lighting...... and thereby a reduction in installed power for general lighting of about 40 % compared to the way illuminance levels are designed in an office environment in Denmark today. This lighting strategy is useful when the placement of the task area is not defined in the space before the lighting is designed...

  8. Trunk sway analysis to quantify the warm-up phenomenon in myotonia congenita patients.

    NARCIS (Netherlands)

    Horlings, G.C.; Drost, G.; Bloem, B.R.; Trip, J.; Pieterse, A.J.; Engelen, B.G.M. van; Allum, J.H.J.

    2009-01-01

    OBJECTIVE: Patients with autosomal recessive myotonia congenita display myotonia and transient paresis that diminish with repetitive muscle contractions (warm-up phenomenon). A new approach is presented to quantify this warm-up phenomenon under clinically relevant gait and balance tasks. METHODS:

  9. Review on Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    P. Sumithra

    2017-03-01

    Full Text Available Computational electromagnetics (CEM) is applied to model the interaction of electromagnetic fields with objects such as antennas, waveguides, and aircraft, and their environment, using Maxwell's equations. In this paper the strengths and weaknesses of various computational electromagnetic techniques are discussed. The performance of various techniques in terms of accuracy, memory and computational time for application-specific tasks such as modelling RCS (radar cross section), space applications, thin wires, and antenna arrays is presented in this paper.

  10. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  11. Using frequency tagging to quantify attentional deployment in a visual divided attention task

    NARCIS (Netherlands)

    Toffanin, Paolo; de Jong, Ritske; Johnson, Addie; Martens, Sander

    Frequency tagging is an EEG method based on the quantification of the steady state visual evoked potential (SSVEP) elicited from stimuli which flicker with a distinctive frequency. Because the amplitude of the SSVEP is modulated by attention such that attended stimuli elicit higher SSVEP amplitudes

  12. Selecting Tasks for Evaluating Human Performance as a Function of Gravity

    Science.gov (United States)

    Norcross, Jason R.; Gernhardt, Michael L.

    2011-01-01

    A challenge in understanding human performance as a function of gravity is determining which tasks to research. Initial studies began with treadmill walking, which was easy to quantify and control. However, with the development of pressurized rovers, it is less important to optimize human performance for ambulation, as pressurized rovers will likely perform gross translation for them. Future crews are likely to spend much of their extravehicular activity (EVA) performing geology, construction, and maintenance-type tasks. With these types of tasks, people have different performance strategies, and it is often difficult to quantify the task and measure steady-state metabolic rates or perform biomechanical analysis. For many of these types of tasks, subjective feedback may be the only data that can be collected. However, subjective data may not fully support a rigorous scientific comparison of human performance across different gravity levels and suit factors. NASA would benefit from having a wide variety of quantifiable tasks that allow human performance comparison across different conditions. In order to determine which tasks will effectively support scientific studies, many different tasks and data analysis techniques will need to be employed. Many of these tasks and techniques will not be effective, but some will produce quantifiable results that are sensitive enough to show performance differences. One of the primary concerns related to EVA performance is metabolic rate. The higher the metabolic rate, the faster the astronaut will exhaust consumables. The focus of this poster will be on how different tasks affect metabolic rate across different gravity levels.

  13. A complex network approach to cloud computing

    International Nuclear Information System (INIS)

    Travieso, Gonzalo; Ruggiero, Carlos Antônio; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2016-01-01

    Cloud computing has become an important means to speed up computing. One problem that heavily influences the performance of such systems is the choice of nodes as servers responsible for executing the clients’ tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate the processing performance of cloud systems underlaid by Erdős–Rényi (ER) and Barabási–Albert (BA) topologies containing two servers. Cloud networks involving two communities not necessarily of the same size are also considered in our analysis. The performance of each configuration is quantified in terms of the cost of communication between the client and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter, the ER topology provides better performance than the BA for smaller average degrees and the opposite behaviour for larger average degrees. With respect to cost, smaller values are found in the BA topology irrespective of the average degree. In addition, we also verified that it is easier to find good servers in ER than in BA networks. Surprisingly, balance and cost are not greatly affected by the presence of communities. However, for a well-defined community network, we found that it is important to assign each server to a different community so as to achieve better performance.
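
    A minimal sketch of the two performance measures described above, under assumed parameters and using the networkx package: on ER and BA graphs with two randomly placed servers, each client is attached to its nearest server, cost is read as the mean client-to-server distance, and balance as the share of clients on the busier server. The sizes and degrees are invented for illustration.

      # Hedged illustration of the cost / balance measures on ER and BA topologies.
      import random
      import networkx as nx

      def cost_and_balance(G, n_servers=2, seed=0):
          rng = random.Random(seed)
          servers = rng.sample(list(G.nodes()), n_servers)
          loads = {s: 0 for s in servers}
          dists = []
          for node in G.nodes():
              if node in servers:
                  continue
              # distance from this client to each server; assign it to the nearest one
              d, s = min((nx.shortest_path_length(G, node, s), s) for s in servers)
              dists.append(d)
              loads[s] += 1
          cost = sum(dists) / len(dists)                        # mean client-server distance
          balance = max(loads.values()) / sum(loads.values())   # 0.5 = perfectly balanced
          return cost, balance

      n, k = 200, 4                                             # hypothetical network size / mean degree
      er = nx.erdos_renyi_graph(n, k / (n - 1), seed=1)
      ba = nx.barabasi_albert_graph(n, k // 2, seed=1)
      for name, G in [("ER", er), ("BA", ba)]:
          giant = G.subgraph(max(nx.connected_components(G), key=len)).copy()
          print(name, cost_and_balance(giant))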

  14. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are “universal,” in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics.

  15. Contaminated sediment research task: SHC Task 3.61.3

    Science.gov (United States)

    A poster presentation for the SHC BOSC review will summarize the research efforts under Sustainable and Healthy Communities Research Program (SHC) in the Contaminated Sediment Task within the Contaminated Sites Project. For the Task, Problem Summary & Decision Context; Task O...

  16. Bare quantifier fronting as contrastive topicalization

    Directory of Open Access Journals (Sweden)

    Ion Giurgea

    2015-11-01

    Full Text Available I argue that indefinites (in particular bare quantifiers such as ‘something’, ‘somebody’, etc.) which are neither existentially presupposed nor in the restriction of a quantifier over situations, can undergo topicalization in a number of Romance languages (Catalan, Italian, Romanian, Spanish), but only if the sentence contains “verum” focus, i.e. focus on a high degree of certainty of the sentence. I analyze these indefinites as contrastive topics, using Büring’s (1999) theory (where the term ‘S-topic’ is used for what I call ‘contrastive topic’). I propose that the topic is evaluated in relation to a scalar set including generalized quantifiers such as {λP ∃x P(x), λP MANYx P(x), λP MOSTx P(x), λP ∀x P(x)} or {λP ∃x P(x), λP P(a), λP P(b) …}, and that the contrastive topic is the weakest generalized quantifier in this set. The verum focus, which is part of the “comment” that co-occurs with the “Topic”, introduces a set of alternatives including degrees of certainty of the assertion. The speaker asserts that his claim is certainly true or highly probable, contrasting it with stronger claims for which the degree of probability is unknown. This explains the observation that in downward entailing contexts, the fronted quantified DPs are headed by ‘all’ or ‘many’, whereas ‘some’, small numbers or ‘at least n’ appear in upward entailing contexts. Unlike other cases of non-specific topics, which are property topics, these are quantifier topics: the topic part is a generalized quantifier, the comment is a property of generalized quantifiers. This explains the narrow scope of the fronted quantified DP.

  17. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  18. Task 7: ADPAC User's Manual

    Science.gov (United States)

    Hall, E. J.; Topp, D. A.; Delaney, R. A.

    1996-01-01

    The overall objective of this study was to develop a 3-D numerical analysis for compressor casing treatment flowfields. The current version of the computer code resulting from this study is referred to as ADPAC (Advanced Ducted Propfan Analysis Codes-Version 7). This report is intended to serve as a computer program user's manual for the ADPAC code developed under Tasks 6 and 7 of the NASA Contract. The ADPAC program is based on a flexible multiple-block grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. An iterative implicit algorithm is available for rapid time-dependent flow calculations, and an advanced two-equation turbulence model is incorporated to predict complex turbulent flows. The consolidated code generated during this study is capable of executing in either a serial or parallel computing mode from a single source code. Numerous examples are given in the form of test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations.

  19. Brain-computer interfaces increase whole-brain signal to noise.

    Science.gov (United States)

    Papageorgiou, T Dorina; Lisinski, Jonathan M; McHenry, Monica A; White, Jason P; LaConte, Stephen M

    2013-08-13

    Brain-computer interfaces (BCIs) can convert mental states into signals to drive real-world devices, but it is not known if a given covert task is the same when performed with and without BCI-based control. Using a BCI likely involves additional cognitive processes, such as multitasking, attention, and conflict monitoring. In addition, it is challenging to measure the quality of covert task performance. We used whole-brain classifier-based real-time functional MRI to address these issues, because the method provides both classifier-based maps to examine the neural requirements of BCI and classification accuracy to quantify the quality of task performance. Subjects performed a covert counting task at fast and slow rates to control a visual interface. Compared with the same task when viewing but not controlling the interface, we observed that being in control of a BCI improved task classification of fast and slow counting states. Additional BCI control increased subjects' whole-brain signal-to-noise ratio compared with the absence of control. The neural pattern for control consisted of a positive network comprised of dorsal parietal and frontal regions and the anterior insula of the right hemisphere as well as an expansive negative network of regions. These findings suggest that real-time functional MRI can serve as a platform for exploring information processing and frontoparietal and insula network-based regulation of whole-brain task signal-to-noise ratio.

  20. A Reverse Stroop Task with Mouse Tracking

    Science.gov (United States)

    Yamamoto, Naohide; Incera, Sara; McLennan, Conor T.

    2016-01-01

    In a reverse Stroop task, observers respond to the meaning of a color word irrespective of the color in which the word is printed—for example, the word red may be printed in the congruent color (red), an incongruent color (e.g., blue), or a neutral color (e.g., white). Although reading of color words in this task is often thought to be neither facilitated by congruent print colors nor interfered with by incongruent print colors, this interference has been detected by using a response method that does not give any bias in favor of processing of word meanings or processing of print colors. On the other hand, evidence for the presence of facilitation in this task has been scarce, even though this facilitation is theoretically possible. By modifying the task such that participants respond to a stimulus color word by pointing to a corresponding response word on a computer screen with a mouse, the present study investigated the possibility that not only interference but also facilitation would take place in a reverse Stroop task. Importantly, in this study, participants’ responses were dynamically tracked by recording the entire trajectories of the mouse. Arguably, this method provided richer information about participants’ performance than traditional measures such as reaction time and accuracy, allowing for more detailed (and thus potentially more sensitive) investigation of facilitation and interference in the reverse Stroop task. These trajectories showed that the mouse’s approach toward correct response words was significantly delayed by incongruent print colors but not affected by congruent print colors, demonstrating that only interference, not facilitation, was present in the current task. Implications of these findings are discussed within a theoretical framework in which the strength of association between a task and its response method plays a critical role in determining how word meanings and print colors interact in reverse Stroop tasks. PMID:27199881

  1. Comparison of multi-objective evolutionary approaches for task ...

    Indian Academy of Sciences (India)

    evaluated using standard metrics. Experimental results and performance measures infer that NSGA-II produces quality schedules compared to NSPSO. ...

  2. Quantify Risk to Manage Cost and Schedule

    National Research Council Canada - National Science Library

    Raymond, Fred

    1999-01-01

    Too many projects suffer from unachievable budget and schedule goals, caused by unrealistic estimates and the failure to quantify and communicate the uncertainty of these estimates to managers and sponsoring executives...

  3. Quantifying drug-protein binding in vivo

    International Nuclear Information System (INIS)

    Buchholz, B; Bench, G; Keating III, G; Palmblad, M; Vogel, J; Grant, P G; Hillegonds, D

    2004-01-01

    Accelerator mass spectrometry (AMS) provides precise quantitation of isotope-labeled compounds that are bound to biological macromolecules such as DNA or proteins. The sensitivity is high enough to allow for sub-pharmacological (“micro-”) dosing to determine macromolecular targets without inducing toxicities or altering the system under study, whether it is healthy or diseased. We demonstrated an application of AMS in quantifying the physiologic effects of one dosed chemical compound upon the binding level of another compound in vivo at sub-toxic doses [4]. We are using tissues left from this study to develop protocols for quantifying specific binding to isolated and identified proteins. We also developed a new technique to quantify nanogram to milligram amounts of isolated protein at precisions that are comparable to those for quantifying the bound compound by AMS.

  4. New frontiers of quantified self 3

    DEFF Research Database (Denmark)

    Rapp, Amon; Cena, Federica; Kay, Judy

    2017-01-01

    The Quantified Self (QS) field needs to start thinking about how situated needs may affect the use of self-tracking technologies. In this workshop we will focus on the idiosyncrasies of specific categories of users....

  5. Task Switching in a Hierarchical Task Structure: Evidence for the Fragility of the Task Repetition Benefit

    Science.gov (United States)

    Lien, Mei-Ching; Ruthruff, Eric

    2004-01-01

    This study examined how task switching is affected by hierarchical task organization. Traditional task-switching studies, which use a constant temporal and spatial distance between each task element (defined as a stimulus requiring a response), promote a flat task structure. Using this approach, Experiment 1 revealed a large switch cost of 238 ms.…

  6. Task uncertainty can account for mixing and switch costs in task-switching.

    Directory of Open Access Journals (Sweden)

    Patrick S Cooper

    Full Text Available Cognitive control is required in situations that involve uncertainty or change, such as when resolving conflict, selecting responses and switching tasks. Recently, it has been suggested that cognitive control can be conceptualised as a mechanism which prioritises goal-relevant information to deal with uncertainty. This hypothesis has been supported using a paradigm that requires conflict resolution. In this study, we examine whether cognitive control during task switching is also consistent with this notion. We used information theory to quantify the level of uncertainty in different trial types during a cued task-switching paradigm. We test the hypothesis that differences in uncertainty between task repeat and task switch trials can account for typical behavioural effects in task-switching. Increasing uncertainty was associated with less efficient performance (i.e., slower and less accurate), particularly on switch trials and trials that afford little opportunity for advance preparation. Interestingly, both mixing and switch costs were associated with a common episodic control process. These results support the notion that cognitive control may be conceptualised as an information processor that serves to resolve uncertainty in the environment.

  7. Task Uncertainty Can Account for Mixing and Switch Costs in Task-Switching

    Science.gov (United States)

    Rennie, Jaime L.

    2015-01-01

    Cognitive control is required in situations that involve uncertainty or change, such as when resolving conflict, selecting responses and switching tasks. Recently, it has been suggested that cognitive control can be conceptualised as a mechanism which prioritises goal-relevant information to deal with uncertainty. This hypothesis has been supported using a paradigm that requires conflict resolution. In this study, we examine whether cognitive control during task switching is also consistent with this notion. We used information theory to quantify the level of uncertainty in different trial types during a cued task-switching paradigm. We test the hypothesis that differences in uncertainty between task repeat and task switch trials can account for typical behavioural effects in task-switching. Increasing uncertainty was associated with less efficient performance (i.e., slower and less accurate), particularly on switch trials and trials that afford little opportunity for advance preparation. Interestingly, both mixing and switch costs were associated with a common episodic control process. These results support the notion that cognitive control may be conceptualised as an information processor that serves to resolve uncertainty in the environment. PMID:26107646
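
    Neither record spells out the information-theoretic computation; a minimal sketch, assuming Shannon entropy over the possible upcoming trial types, is shown below. The probability assignments are invented purely to contrast single-task blocks with mixed blocks, not values from the study.

      # Shannon entropy H = -sum(p * log2 p) as a simple stand-in for "task uncertainty".
      # The probability assignments below are illustrative assumptions, not the paper's values.
      import math

      def entropy(probs):
          return -sum(p * math.log2(p) for p in probs if p > 0)

      single_task_block = [1.0]          # only one task can occur: no uncertainty
      mixed_block_unbiased = [0.5, 0.5]  # repeat or switch equally likely before the cue
      mixed_block_biased = [0.75, 0.25]  # e.g. repeats more frequent than switches

      for label, p in [("single-task", single_task_block),
                       ("mixed 50/50", mixed_block_unbiased),
                       ("mixed 75/25", mixed_block_biased)]:
          print(f"{label:12s} H = {entropy(p):.2f} bits")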

  8. Validating the appropriateness of TACOM measure: Comparing TACOM scores with subjective workload scores quantified by NASA-TLX technique

    International Nuclear Information System (INIS)

    Park, J.; Jung, W.

    2006-01-01

    In this study, the appropriateness of the task complexity (TACOM) measure that can quantify the complexity of emergency tasks was investigated by comparing subjective workload scores with the associated TACOM scores. To this end, based on the NASA-TLX (task load index) technique, 18 operators were asked to subjectively estimate perceived workload for 23 emergency tasks that were specified in the emergency operating procedures of the reference nuclear power plants. As a result of these comparisons, it was observed that subjective workload scores increase in proportion to the increase of TACOM scores. Therefore, it is expected that the TACOM measure can be used as a serviceable method to quantify the complexity of emergency tasks. (authors)

  9. Validating the appropriateness of TACOM measure: Comparing TACOM scores with subjective workload scores quantified by NASA-TLX technique

    Energy Technology Data Exchange (ETDEWEB)

    Park, J.; Jung, W. [Integrated Safety Assessment Div., Korea Atomic Energy Research Inst., P.O.Box 105, Duckjin-Dong, Yusong-Ku, Taejon, 305-600 (Korea, Republic of)

    2006-07-01

    In this study, the appropriateness of the task complexity (TACOM) measure that can quantify the complexity of emergency tasks was investigated by comparing subjective workload scores with the associated TACOM scores. To this end, based on the NASA-TLX (task load index) technique, 18 operators were asked to subjectively estimate perceived workload for 23 emergency tasks that were specified in the emergency operating procedures of the reference nuclear power plants. As a result of these comparisons, it was observed that subjective workload scores increase in proportion to the increase of TACOM scores. Therefore, it is expected that the TACOM measure can be used as a serviceable method to quantify the complexity of emergency tasks. (authors)
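
    Neither record reproduces the NASA-TLX arithmetic. For orientation, the sketch below implements the standard weighted NASA-TLX score (six subscales rated 0-100, weighted by how often each dimension wins the 15 pairwise comparisons); the ratings and weights are invented example values, not data from the study.

      # Standard weighted NASA-TLX score: sum(rating * weight) / 15, weights from the
      # 15 pairwise comparisons. The ratings/weights below are made-up example values.
      SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

      def nasa_tlx(ratings: dict, weights: dict) -> float:
          assert sum(weights.values()) == 15, "weights must come from the 15 pairwise comparisons"
          return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

      ratings = {"mental": 70, "physical": 20, "temporal": 55,
                 "performance": 40, "effort": 65, "frustration": 35}
      weights = {"mental": 5, "physical": 0, "temporal": 3,
                 "performance": 2, "effort": 4, "frustration": 1}
      print(f"weighted workload = {nasa_tlx(ratings, weights):.1f}")   # on a 0-100 scale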

  10. Collaborative drawing with interactive table in physics: Groups’ regulation and task interpretation

    NARCIS (Netherlands)

    Mykkanen, A.; Gijlers, Aaltje H.; Jarvenoja, H.; Jarvela, S.; Bollen, Lars

    2015-01-01

    This study explores the relationship between secondary school students’ (N=36, nine groups) group members’ task interpretation and individual and group-level regulation during a collaborative computer-supported drawing task. Furthermore, it investigates how these factors are related to students

  11. Adaptive Cost-Based Task Scheduling in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Mohammed A. S. Mosleh

    2016-01-01

    Full Text Available Task execution in cloud computing requires obtaining stored data from remote data centers. Though this storage process reduces the memory constraints of the user’s computer, the time deadline is a serious concern. In this paper, Adaptive Cost-based Task Scheduling (ACTS) is proposed to provide data access to the virtual machines (VMs) within the deadline without increasing the cost. ACTS considers the data access completion time for selecting the cost-effective path to access the data. To allocate data access paths, the data access completion time is computed by considering the mean and variance of the network service time and the arrival rate of network input/output requests. Task priorities are then assigned to the tasks based on data access time. Finally, the costs of the data paths are analyzed and paths are allocated based on task priority. The minimum-cost path is allocated to low-priority tasks and fast-access paths are allocated to high-priority tasks so as to meet the time deadline. Thus efficient task scheduling can be achieved by using ACTS. The experimental results conducted in terms of execution time, computation cost, communication cost, bandwidth, and CPU utilization prove that the proposed algorithm provides better performance than the state-of-the-art methods.
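
    The ACTS formulas for completion time and path cost are not given in full above, so the sketch below is only a rough analogue under stated assumptions: data-access completion time is estimated with an M/G/1 Pollaczek-Khinchine waiting-time formula (an assumed queueing model), tasks are prioritised by deadline, and each task gets the cheapest path whose estimated access time still meets its deadline. All path parameters and deadlines are invented.

      # Rough analogue of cost/deadline-aware path selection; the queueing model and
      # path parameters are assumptions, not the ACTS formulas.
      from dataclasses import dataclass

      @dataclass
      class Path:
          name: str
          mean_service: float   # mean network service time per request (s)
          var_service: float    # variance of the service time (s^2)
          cost: float           # relative monetary cost

      def access_time(path: Path, arrival_rate: float) -> float:
          """M/G/1 estimate: Pollaczek-Khinchine waiting time plus one service time."""
          rho = arrival_rate * path.mean_service
          assert rho < 1.0, "queue must be stable"
          wait = arrival_rate * (path.var_service + path.mean_service ** 2) / (2 * (1 - rho))
          return wait + path.mean_service

      paths = [Path("fast", 0.010, 0.00002, cost=5.0), Path("cheap", 0.040, 0.00030, cost=1.0)]
      tasks = [("t1", 0.05), ("t2", 0.30), ("t3", 0.08)]   # (task id, deadline in seconds)
      arrival_rate = 10.0                                  # requests per second (assumed)

      # Tighter deadlines get higher priority; pick the cheapest path that meets the deadline.
      for task_id, deadline in sorted(tasks, key=lambda t: t[1]):
          feasible = [p for p in paths if access_time(p, arrival_rate) <= deadline]
          chosen = min(feasible, key=lambda p: p.cost) if feasible else min(
              paths, key=lambda p: access_time(p, arrival_rate))
          print(task_id, "->", chosen.name, f"(est. access {access_time(chosen, arrival_rate):.3f}s)")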

  12. 49 CFR 236.1043 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... classroom, simulator, computer-based, hands-on, or other formally structured training designed to impart the...

  13. The data acquisition tasks for TASSO. User manual

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1981-07-01

    The TASSO Data Acquisition System is designed to accept events from various trigger sources at rates of several Hz and to transfer these events to the central DESY IBM Triplex for later offline analysis. A NORD 10S computer is used for this purpose with several data acquisition tasks running under control of a Supervisor task DASSO that is described elsewhere. (author)

  14. Robot task space analyzer

    International Nuclear Information System (INIS)

    Hamel, W.R.; Osborn, J.

    1997-01-01

    Many nuclear projects, such as environmental restoration and waste management challenges, involve radiation or other hazards that will necessitate the use of remote operations that protect human workers from dangerous exposures. Remote work is far more costly to execute than what workers could accomplish directly with conventional tools and practices because task operations are slow and tedious due to difficulties of remote manipulation and viewing. Decades of experience within the nuclear remote operations community show that remote tasks may take hundreds of times longer than hands-on work; even with state-of-the-art force-reflecting manipulators and television viewing, remote task execution is five to ten times slower than equivalent direct contact work. Thus the requirement to work remotely is a major cost driver in many projects. Modest improvements in the work efficiency of remote systems can have high payoffs by reducing the completion time of projects. Additional benefits will accrue from improved work quality and enhanced safety.

  15. Performing Task Integration

    DEFF Research Database (Denmark)

    Elkjaer, Bente; Nickelsen, Niels Christian Mossfeldt

    by shared goals and knowledge as well as mutual respect and frequent, timely, accurate and problem-solving ways of communication with the purpose of dealing with the tasks at hand in an integrated way. We introduce and discuss relational coordination theory through a case-study within public healthcare....... Here cross-professional coordination of work was done by scheduled communication twice a day. When we proposed a way for further integration of tasks through an all-inclusive team organization, we were met with resistance. We use the study to discuss whether relational coordination theory is able to do...... away with differences regarding task definitions and working conditions as well as professional knowledge hierarchies and responsibilities for parts and wholes....

  16. Quantifying the impacts of global disasters

    Science.gov (United States)

    Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the

  17. Impact of task design on task performance and injury risk: case study of a simulated drilling task.

    Science.gov (United States)

    Alabdulkarim, Saad; Nussbaum, Maury A; Rashedi, Ehsan; Kim, Sunwook; Agnew, Michael; Gardner, Richard

    2017-06-01

    Existing evidence is limited regarding the influence of task design on performance and ergonomic risk, or the association between these two outcomes. In a controlled experiment, we constructed a mock fuselage to simulate a drilling task common in aircraft manufacturing, and examined the effect of three levels of workstation adjustability on performance as measured by productivity (e.g. fuselage completion time) and quality (e.g. fuselage defective holes), and ergonomic risk as quantified using two common methods (rapid upper limb assessment and the strain index). The primary finding was that both productivity and quality significantly improved with increased adjustability, yet this occurred only when that adjustability succeeded in reducing ergonomic risk. Supporting the inverse association between ergonomic risk and performance, the condition with highest adjustability created the lowest ergonomic risk and the best performance while there was not a substantial difference in ergonomic risk between the other two conditions, in which performance was also comparable. Practitioner Summary: Findings of this study supported a causal relationship between task design and both ergonomic risk and performance, and that ergonomic risk and performance are inversely associated. While future work is needed under more realistic conditions and a broader population, these results may be useful for task (re)design and to help cost-justify some ergonomic interventions.

  18. Organizing Core Tasks

    DEFF Research Database (Denmark)

    Boll, Karen

    has remained much the same within the last 10 years. However, how the core task has been organized has changed considerable under the influence of various “organizing devices”. The paper focusses on how organizing devices such as risk assessment, output-focus, effect orientation, and treatment...... projects influence the organization of core tasks within the tax administration. The paper shows that the organizational transformations based on the use of these devices have had consequences both for the overall collection of revenue and for the employees’ feeling of “making a difference”. All in all...

  19. Real-time scheduling of software tasks

    International Nuclear Information System (INIS)

    Hoff, L.T.

    1995-01-01

    When designing real-time systems, it is often desirable to schedule execution of software tasks based on the occurrence of events. The events may be clock ticks, interrupts from a hardware device, or software signals from other software tasks. If the nature of the events is well understood, this scheduling is normally a static part of the system design. If the nature of the events is not completely understood, or is expected to change over time, it may be necessary to provide a mechanism for adjusting the scheduling of the software tasks. RHIC front-end computers (FECs) provide such a mechanism. The goals in designing this mechanism were to be as independent as possible of the underlying operating system, to allow for future expansion of the mechanism to handle new types of events, and to allow easy configuration. Some considerations which steered the design were programming paradigm (object oriented vs. procedural), programming language, and whether events are merely interesting moments in time, or whether they intrinsically have data associated with them. The design also needed to address performance and robustness tradeoffs involving shared task contexts, task priorities, and use of interrupt service routine (ISR) contexts vs. task contexts. This paper will explore these considerations and tradeoffs.

  20. The influences of task repetition, napping, time of day, and instruction on the Sustained Attention to Response Task

    NARCIS (Netherlands)

    Schie, M.K.M. van; Alblas, E.E.; Thijs, R.D.; Fronczek, R.; Lammers, G.J.; Dijk, J.G. van

    2014-01-01

    Introduction: The Sustained Attention to Response Task (SART) helps to quantify vigilance impairments. Previous studies, in which five SART sessions on one day were administered, demonstrated worse performance during the first session than during the others. The present study comprises two

  1. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective

    Directory of Open Access Journals (Sweden)

    Arthur M. Jacobs

    2017-12-01

    Full Text Available In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA) and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.

  2. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective.

    Science.gov (United States)

    Jacobs, Arthur M

    2017-01-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA) and a novel metric for quantifying word beauty, the aesthetic potential is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.
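
    The QNA features and training data are not reproduced in either record, so the following sketch only illustrates the kind of pipeline described: a decision-tree classifier (scikit-learn) trained on a few invented surface features to separate "beautiful" from "ugly" words. The feature definitions, words and labels are placeholders, not the paper's materials.

      # Toy decision-tree classifier over invented word features; illustrates the
      # machine-learning step described in the abstract, not the actual QNA features.
      from sklearn.tree import DecisionTreeClassifier

      def features(word: str):
          vowels = sum(c in "aeiou" for c in word)
          return [len(word),                        # word length
                  vowels / max(len(word), 1),       # vowel ratio (a crude sonority proxy)
                  sum(c in "kqxzt" for c in word)]  # "harsh" consonant count (an assumption)

      train_words = ["melody", "luminous", "serene", "velvet", "grudge", "sludge", "crust", "axle"]
      train_labels = [1, 1, 1, 1, 0, 0, 0, 0]       # 1 = rated beautiful, 0 = ugly (made up)

      clf = DecisionTreeClassifier(max_depth=3, random_state=0)
      clf.fit([features(w) for w in train_words], train_labels)

      for w in ["silken", "grit"]:
          print(w, "->", "beautiful" if clf.predict([features(w)])[0] else "ugly")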

  3. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, as well as differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
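
    As a hedged sketch of the formulation described above (Nash equilibria as global minima of a nonnegative real-valued function), the code below builds a standard regret-based objective for a two-player bimatrix game and minimises it with SciPy's differential evolution. The game (Matching Pennies) and all settings are illustrative choices, not the paper's experimental setup.

      # Nash equilibrium of a bimatrix game as the global minimum (value 0) of a
      # nonnegative regret function, minimised here with differential evolution.
      import numpy as np
      from scipy.optimize import differential_evolution

      A = np.array([[1, -1], [-1, 1]])   # row player's payoffs (Matching Pennies)
      B = -A                             # column player's payoffs (zero-sum)

      def to_simplex(z):
          z = np.abs(z) + 1e-12
          return z / z.sum()

      def regret(params):
          x, y = to_simplex(params[:2]), to_simplex(params[2:])
          # At a Nash equilibrium no pure-strategy deviation yields a positive gain.
          gains_row = A @ y - x @ A @ y
          gains_col = x @ B - x @ B @ y
          return np.sum(np.maximum(gains_row, 0) ** 2) + np.sum(np.maximum(gains_col, 0) ** 2)

      result = differential_evolution(regret, bounds=[(0, 1)] * 4, seed=0, tol=1e-10)
      x, y = to_simplex(result.x[:2]), to_simplex(result.x[2:])
      print("row strategy", x, "column strategy", y, "objective", result.fun)
      # Expected: both strategies close to (0.5, 0.5), objective close to 0.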

  4. The Influence of Parkinson’s Disease Motor Symptom Asymmetry on Hand Performance: An Examination of the Grooved Pegboard Task

    Directory of Open Access Journals (Sweden)

    Sara M. Scharoun

    2015-01-01

    Full Text Available This study examined the influence of motor symptom asymmetry in Parkinson’s disease (PD) on Grooved Pegboard (GP) performance in right-handed participants. The Unified Parkinson’s Disease Rating Scale was used to assess motor symptoms and separate participants with PD into two groups (right-arm affected, left-arm affected) for comparison with a group of healthy older adults. Participants completed the place and replace GP tasks two times with both hands. Laterality quotients were computed to quantify performance differences between the two hands. Comparisons among the three groups indicated that when the nonpreferred hand is affected by PD motor symptoms, superior preferred hand performance (as seen in healthy older adults) is further exaggerated in tasks that require precision (i.e., the place task). Regardless of the task, when the preferred hand is affected, there is an evident shift to superior left-hand performance, which may inevitably manifest as a switch in hand preference. Results add to the discussion of the relationship between handedness and motor symptom asymmetry in PD.
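
    The record does not state which laterality-quotient formula was used; the snippet below shows one conventional form for timed pegboard scores, purely for orientation, with invented example times.

      # One conventional laterality quotient for timed pegboard scores (an assumption;
      # the study's exact formula is not given in the abstract). Positive values indicate
      # a right-hand (preferred-hand) advantage, because lower completion times are better.
      def laterality_quotient(right_time_s: float, left_time_s: float) -> float:
          return 100.0 * (left_time_s - right_time_s) / (left_time_s + right_time_s)

      print(laterality_quotient(right_time_s=62.0, left_time_s=71.5))  # example times only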

  5. Data Center Tasking.

    Science.gov (United States)

    Temares, M. Lewis; Lutheran, Joseph A.

    Operations tasking for data center management is discussed. The original and revised organizational structures of the data center at the University of Miami are also described. The organizational strategy addresses the functions that should be performed by the data center, anticipates the specialized skills required, and addresses personnel…

  6. Biomedical applications engineering tasks

    Science.gov (United States)

    Laenger, C. J., Sr.

    1976-01-01

    The engineering tasks performed in response to needs articulated by clinicians are described. Initial contacts were made with these clinician-technology requestors by the Southwest Research Institute NASA Biomedical Applications Team. The basic purpose of the program was to effectively transfer aerospace technology into functional hardware to solve real biomedical problems.

  7. India's Unfinished Telecom Tasks

    Indian Academy of Sciences (India)

    India's Telecom Story is now well known · Indian Operators become an enviable force · At the same time · India Amongst the Leaders · Unfinished Tasks as Operators · LightGSM ON: Innovation for Rural Area from Midas · Broadband Access Options for India · Broadband driven by DSL: still too slow · Is Wireless the answer?

  8. A study on quantification of the information flow and effectiveness of information aids for diagnosis tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    2004-02-01

    terms of time-to-task completion. As a study on human performance modelling, this paper evaluates the effectiveness of information aiding types based on the operator strategies for diagnosis tasks at NPPs, especially for fault identification. To evaluate the effectiveness, four aiding types which are available for supporting MCR operator's diagnosis were selected for the experiment: no aid, alarm, hypothesis with certainty factor, and hypothesis with certainty factor + expected symptom patterns. The main features of the aiding types were elicited from typical direct operator support systems for MCR operators. An experiment was conducted with 24 graduate students and subject performances were analyzed according to the strategies the subjects used in problem-solving. The experimental results show that the effect of the information aiding types on subject performance can be changed according to subject strategy. Finally, as an analysis of operator support systems for MCR operators, this paper suggests a method for the quantitative evaluation of NPP decision support systems (DSSs). In this approach, the dynamic aspects of DSSs are first defined. Then, the hierarchical structure of the evaluation criteria for dynamic aspects of DSS is provided. For quantitative evaluation, the relative weights of the criteria are computed using the analytic hierarchy process (AHP) to gain and aggregate the priority of the components. The criteria at the lowest level are quantified by simple numerical expressions and questionnaires which are developed to describe the characteristics of the criteria. Finally, in order to demonstrate the feasibility of this proposition, one case study is performed for the fault diagnosis module of OASYS™ (On-Line Operator Aid SYStem for Nuclear Power Plant), which is an operator support system developed at KAIST. In conclusion, this paper describes the practical implications in the design of MCR operator support systems.
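
    The AHP weighting step mentioned above can be made concrete with a small sketch: a pairwise-comparison matrix for three hypothetical criteria, the principal eigenvector as the weight vector, and Saaty's consistency ratio as a sanity check. The criteria names and comparison values are invented; the thesis's actual hierarchy is not reproduced.

      # Minimal AHP weighting: principal eigenvector of a pairwise comparison matrix
      # plus Saaty's consistency ratio. Matrix entries are illustrative assumptions.
      import numpy as np

      criteria = ["timeliness", "accuracy", "usability"]
      # A[i, j] = how much more important criterion i is than criterion j (Saaty 1-9 scale)
      A = np.array([[1.0,   3.0, 5.0],
                    [1/3.0, 1.0, 2.0],
                    [1/5.0, 1/2.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                 # Perron (principal) eigenvalue
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
      cr = ci / 0.58                              # Saaty's random index for n = 3 is 0.58
      print(dict(zip(criteria, np.round(weights, 3))), "CR =", round(cr, 3))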

  9. From "rest" to language task: Task activation selects and prunes from broader resting-state network.

    Science.gov (United States)

    Doucet, Gaelle E; He, Xiaosong; Sperling, Michael R; Sharan, Ashwini; Tracy, Joseph I

    2017-05-01

    Resting-state networks (RSNs) show spatial patterns generally consistent with networks revealed during cognitive tasks. However, the exact degree of overlap between these networks has not been clearly quantified. Such an investigation shows promise for decoding altered functional connectivity (FC) related to abnormal language functioning in clinical populations such as temporal lobe epilepsy (TLE). In this context, we investigated the network configurations during a language task and during resting state using FC. Twenty-four healthy controls, 24 right and 24 left TLE patients completed a verb generation (VG) task and a resting-state fMRI scan. We compared the language network revealed by the VG task with three FC-based networks (seeding the left inferior frontal cortex (IFC)/Broca): two from the task (ON, OFF blocks) and one from the resting state. We found that, for both left TLE patients and controls, the RSN recruited regions bilaterally, whereas both VG-on and VG-off conditions produced more left-lateralized FC networks, matching more closely with the activated language network. TLE brings with it variability in both task-dependent and task-independent networks, reflective of atypical language organization. Overall, our findings suggest that our RSN captured bilateral activity, reflecting a set of prepotent language regions. We propose that this relationship can be best understood by the notion of pruning or winnowing down of the larger language-ready RSN to carry out specific task demands. Our data suggest that multiple types of network analyses may be needed to decode the association between language deficits and the underlying functional mechanisms altered by disease. Hum Brain Mapp 38:2540-2552, 2017. © 2017 Wiley Periodicals, Inc.

  10. Quantifying the Financial Benefits of Multifamily Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Philbrick, D. [The Partnership for Advanced Residential Retrofit, Chicago, IL (United States); Scheu, R. [The Partnership for Advanced Residential Retrofit, Chicago, IL (United States); Brand, L. [The Partnership for Advanced Residential Retrofit, Chicago, IL (United States)

    2016-01-29

    Increasing the adoption of energy-efficient building practices will require the energy sector to increase its understanding of the way that retrofits affect multifamily financial performance as well as how those indicators are interpreted by the lending and appraisal industries. This project analyzed building, energy, and financial program data as well as other public and private data to examine the relationship between energy efficiency retrofits and financial performance on three levels: building, city, and community. The project goals were to increase the data and analysis in the growing body of multifamily financial benefits work as well as to provide a framework for other geographies to produce similar characterizations. The goals are accomplished through three tasks. Task one: A pre- and post-retrofit analysis of thirteen Chicago multifamily buildings. Task two: A comparison of Chicago income and expenses to two national datasets. Task three: An in-depth look at multifamily market sales data and the subsequent impact of buildings that undergo retrofits.

  11. Development of an algorithm for quantifying extremity biological tissue

    International Nuclear Information System (INIS)

    Pavan, Ana L.M.; Miranda, Jose R.A.; Pina, Diana R. de

    2013-01-01

    Computed radiography (CR) has become the most widely used modality for image acquisition and production since its introduction in the 1980s. The detection and early diagnosis obtained via CR are important for the successful treatment of diseases such as arthritis, metabolic bone diseases, tumors, infections and fractures. However, the standards used to optimize these images are based on international protocols. Therefore, it is necessary to compose radiographic techniques for the CR system that provide a reliable medical diagnosis with doses as low as reasonably achievable. To this end, the aim of this work is to develop a tissue-quantifying algorithm, allowing the construction of a homogeneous phantom used to compose such techniques. A database of computed tomography images of the hand and wrist of adult patients was developed. Using Matlab® software, a computational algorithm was developed that quantifies the average thickness of the soft tissue and bone present in the anatomical region under study, as well as the corresponding thicknesses in simulator materials (aluminium and Lucite). This was achieved by applying masks and a Gaussian-removal technique to the histograms. As a result, an average soft-tissue thickness of 18.97 mm and a bone-tissue thickness of 6.15 mm were obtained, with equivalents in simulator materials of 23.87 mm of acrylic and 1.07 mm of aluminium. The results agreed with the mean thicknesses of the biological tissues of a standard patient's hand, enabling the construction of a homogeneous phantom.

  12. Project Scheduling Heuristics-Based Standard PSO for Task-Resource Assignment in Heterogeneous Grid

    OpenAIRE

    Chen, Ruey-Maw; Wang, Chuin-Mu

    2011-01-01

    The task scheduling problem has been widely studied for assigning resources to tasks in heterogeneous grid environments. Effective task scheduling is an important issue for the performance of grid computing. Meanwhile, the task scheduling problem is an NP-complete problem. Hence, this investigation introduces a so-called “standard” particle swarm optimization (PSO) metaheuristic approach to efficiently solve the task scheduling problems in grids. Meanwhile, two promising heuristics based on multimo...

  13. Microprocessor multi-task monitor

    International Nuclear Information System (INIS)

    Ludemann, C.A.

    1983-01-01

    This paper describes a multi-task monitor program for microprocessors. Although written for the Intel 8085, it incorporates features that would be beneficial for implementation in other microprocessors used in controlling and monitoring experiments and accelerators. The monitor places permanent programs (tasks) arbitrarily located throughout ROM in a priority-ordered queue. The programmer is provided with the flexibility to add new tasks or modified versions of existing tasks, without having to comply with previously defined task boundaries or having to reprogram all of ROM. Scheduling of tasks is triggered by timers, outside stimuli (interrupts), or inter-task communications. Context switching time is of the order of tenths of a millisecond.

  14. Are women better than men at multi-tasking?

    OpenAIRE

    Stoet, Gijsbert; O’Connor, Daryl B.; Conner, Mark; Laws, Keith R.

    2013-01-01

    Background: There seems to be a common belief that women are better at multi-tasking than men, but there is practically no scientific research on this topic. Here, we tested whether women have better multi-tasking skills than men. Methods: In Experiment 1, we compared performance of 120 women and 120 men in a computer-based task-switching paradigm. In Experiment 2, we compared a different group of 47 women and 47 men on "paper-and-pencil" multi-tasking tests. ...

  15. Experiment with expert system guidance of an engineering analysis task

    International Nuclear Information System (INIS)

    Ransom, V.H.; Fink, R.K.; Callow, R.A.

    1986-01-01

    An experiment is being conducted in which expert systems are used to guide the performance of an engineering analysis task. The task chosen for experimentation is the application of a large thermal hydraulic transient simulation computer code. The expectation from this work is that the expert system will result in an improved analytical result with a reduction in the amount of human labor and expertise required. The code-associated functions of model formulation, data input, code execution, and analysis of the computed output have all been identified as candidate tasks that could benefit from the use of expert systems. Expert system modules have been built for the model building and data input task. Initial results include the observation that human expectations of an intelligent environment rapidly escalate, and that structured or stylized tasks that are tolerated in the unaided system become frustrating within the intelligent environment.

  16. Characterizing and Mitigating Work Time Inflation in Task Parallel Programs

    Directory of Open Access Journals (Sweden)

    Stephen L. Olivier

    2013-01-01

    Full Text Available Task parallelism raises the level of abstraction in shared memory parallel programming to simplify the development of complex applications. However, task parallel applications can exhibit poor performance due to thread idleness, scheduling overheads, and work time inflation – additional time spent by threads in a multithreaded computation beyond the time required to perform the same work in a sequential computation. We identify the contributions of each factor to lost efficiency in various task parallel OpenMP applications and diagnose the causes of work time inflation in those applications. Increased data access latency can cause significant work time inflation in NUMA systems. Our locality framework for task parallel OpenMP programs mitigates this cause of work time inflation. Our extensions to the Qthreads library demonstrate that locality-aware scheduling can improve performance up to 3X compared to the Intel OpenMP task scheduler.
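
    Work time inflation as defined above has a direct arithmetic reading; the sketch below computes it from per-thread busy times and the sequential work time, using invented numbers rather than measurements from the paper.

      # Work time inflation: extra time spent by threads beyond the sequential work.
      # The timings below are made-up example values, not measurements from the paper.
      def work_time_inflation(thread_busy_times_s, sequential_time_s):
          parallel_work = sum(thread_busy_times_s)     # total busy time across all threads
          return parallel_work / sequential_time_s     # 1.0 means no inflation

      threads = [14.2, 13.8, 15.1, 14.9]   # seconds of busy time on 4 threads (assumed)
      sequential = 48.0                    # seconds for the same work done sequentially (assumed)
      print(f"inflation factor = {work_time_inflation(threads, sequential):.2f}x")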

  17. Task Force report

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    The International Task Force on Prevention of Nuclear Terrorism was formed in 1985 under the auspices of the Nuclear Control Institute. This report is a consensus report of the 26 task force members - all members not necessarily agreeing on every point and all wordings, but in each case a substantial majority did agree. First, the report defines the threat, then establishes the priorities. Short-term recommendations are presented on: (1) protecting nuclear weapons; (2) protecting nuclear materials; (3) protecting nuclear facilities; (4) intelligence programs; (5) civil liberties concerns; (6) controlling nuclear transfers; (7) US - Soviet cooperation; (8) arms control initiatives; (9) convention of physical protection of nuclear material; (10) role of emergency management programs; and (11) role of the media. Brief long-term recommendations are included on (1) international measures, and (2) emerging nuclear technologies. An Appendix, Production of Nuclear Materials Usable in Weapons is presented for further consideration (without recommendations)

  18. Rostering and Task Scheduling

    DEFF Research Database (Denmark)

    Dohn, Anders Høeg

    . The rostering process is non-trivial and especially when service is required around the clock, rostering may involve considerable effort from a designated planner. Therefore, in order to minimize costs and overstaffing, to maximize the utilization of available staff, and to ensure a high level of satisfaction...... as possible to the available staff, while respecting various requirements and rules and while including possible transportation time between tasks. This thesis presents a number of industrial applications in rostering and task scheduling. The applications exist within various contexts in health care....... Mathematical and logic-based models are presented for the problems considered. Novel components are added to existing models and the modeling decisions are justified. In one case, the model is solved by a simple, but efficient greedy construction heuristic. In the remaining cases, column generation is applied...

  19. Adaptive Incremental Genetic Algorithm for Task Scheduling in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kairong Duan

    2018-05-01

    Full Text Available Cloud computing is a new commercial model that enables customers to acquire large amounts of virtual resources on demand. Resources including hardware and software can be delivered as services and measured by specific usage of storage, processing, bandwidth, etc. In Cloud computing, task scheduling is a process of mapping cloud tasks to Virtual Machines (VMs). When binding the tasks to VMs, the scheduling strategy has an important influence on the efficiency of the datacenter and related energy consumption. Although many traditional scheduling algorithms have been applied in various platforms, they may not work efficiently due to the large number of user requests, the variety of computation resources and the complexity of the Cloud environment. In this paper, we tackle the task scheduling problem, which aims to minimize makespan, by Genetic Algorithm (GA). We propose an incremental GA which has adaptive probabilities of crossover and mutation. The mutation and crossover rates change according to generations and also vary between individuals. Large numbers of tasks are randomly generated to simulate various scales of the task scheduling problem in the Cloud environment. Based on the instance types of Amazon EC2, we implemented virtual machines with different computing capacity on CloudSim. We compared the performance of the adaptive incremental GA with that of Standard GA, Min-Min, Max-Min, Simulated Annealing and Artificial Bee Colony Algorithm in finding the optimal scheme. Experimental results show that the proposed algorithm can achieve feasible solutions which have acceptable makespan with less computation time.
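
    As an illustration of the general approach (a GA searching task-to-VM assignments for minimal makespan, with crossover and mutation rates that change over generations), the sketch below uses a simple linear rate schedule, elitism and truncation selection. The schedule, operators and data are illustrative assumptions, not the authors' algorithm or their CloudSim setup.

```python
# Illustrative GA for task-to-VM scheduling (minimize makespan). The adaptive
# rate schedule below is a simple assumption, not the authors' exact formula.
import random

def makespan(assign, task_len, vm_speed):
    load = [0.0] * len(vm_speed)
    for t, vm in enumerate(assign):
        load[vm] += task_len[t] / vm_speed[vm]
    return max(load)

def evolve(task_len, vm_speed, pop_size=40, gens=200):
    n, m = len(task_len), len(vm_speed)
    pop = [[random.randrange(m) for _ in range(n)] for _ in range(pop_size)]
    for g in range(gens):
        # Adaptive rates: more exploration early, more exploitation late.
        p_cross = 0.9 - 0.4 * g / gens
        p_mut = 0.2 * (1 - g / gens) + 0.02
        pop.sort(key=lambda ind: makespan(ind, task_len, vm_speed))
        nxt = pop[:2]                                  # elitism
        while len(nxt) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)   # parents from fitter half
            child = a[:]
            if random.random() < p_cross:
                cut = random.randrange(1, n)               # one-point crossover
                child = a[:cut] + b[cut:]
            if random.random() < p_mut:
                child[random.randrange(n)] = random.randrange(m)
            nxt.append(child)
        pop = nxt
    best = min(pop, key=lambda ind: makespan(ind, task_len, vm_speed))
    return best, makespan(best, task_len, vm_speed)

if __name__ == "__main__":
    random.seed(1)
    tasks = [random.uniform(10, 100) for _ in range(50)]   # task lengths (arbitrary units)
    vms = [1.0, 1.5, 2.0, 2.5]                             # relative VM speeds
    _, best_ms = evolve(tasks, vms)
    print(f"best makespan found: {best_ms:.1f}")
```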

  20. Computing With Quantum Mechanical Oscillators

    National Research Council Canada - National Science Library

    Parks, A

    1991-01-01

    Despite the obvious practical considerations (e.g., stability, controllability), certain quantum mechanical systems seem to naturally lend themselves in a theoretical sense to the task of performing computations...

  1. The task force process

    International Nuclear Information System (INIS)

    Applegate, J.S.

    1995-01-01

    This paper focuses on the unique aspects of the Fernald Citizens Task Force process that have contributed to a largely successful public participation effort at Fernald. The Fernald Citizens Task Force moved quickly past many procedural issues. Instead, the Task Force concentrated on (a) educating itself about the site, its problems, and possible solutions, and (b) choosing a directed way to approach its mandate: To make recommendations on several "big picture" issues, including future use of the site, cleanup levels, waste disposition, and cleanup priorities. This paper presents the approach used at Fernald for establishing and running a focused site-specific advisory board, the key issues that have been faced, and how these issues were resolved. The success of Fernald in establishing a strong and functioning site-specific advisory board serves as a useful model for other DOE facilities, although the Fernald model is just one of many approaches that can be taken. However, the approach presented here has worked extremely well for Fernald.

  2. Gap Task Force

    CERN Multimedia

    Lissuaer, D

    One of the more congested areas in the ATLAS detector is the GAP region (the area between the Barrel Calorimeter and the End Cap calorimeter) where Inner Detector services, LAr services and some Tile services must all cohabit in a very limited area. It has been clear for some time that the space in the GAP region is not sufficient to accommodate all that is needed. In the last few months, additional problems of routing all the services to Z=0 have been encountered due to the very limited space between the Tile Calorimeter and the first layer of Muon chambers. The Technical Management Board (TMB) and the Executive Board (EB) decided in the middle of March to establish a Task Force to look at this problem and come up with a solution within well-specified guidelines. The task force consisted of experts from the ID, Muon, Liquid Argon and Tile systems in addition to experts from the Technical Coordination team and the Physics coordinator. The task force held many meetings and in general there were some very l...

  3. Task exposures in an office environment: a comparison of methods.

    Science.gov (United States)

    Van Eerd, Dwayne; Hogg-Johnson, Sheilah; Mazumder, Anjali; Cole, Donald; Wells, Richard; Moore, Anne

    2009-10-01

    Task-related factors such as frequency and duration are associated with musculoskeletal disorders in office settings. The primary objective was to compare various task recording methods as measures of exposure in an office workplace. A total of 41 workers from different jobs were recruited from a large urban newspaper (71% female, mean age 41 years, SD 9.6). Questionnaire, task diaries, direct observation and video methods were used to record tasks. A common set of task codes was used across methods. Different estimates of task duration, number of tasks and task transitions arose from the different methods. Self-report methods did not consistently result in longer task duration estimates. Methodological issues could explain some of the differences in estimates observed between methods. It was concluded that different task recording methods result in different estimates of exposure, likely due to different exposure constructs. This work addresses issues of exposure measurement in office environments. It is of relevance to ergonomists/researchers interested in how best to assess the risk of injury among office workers. The paper discusses the trade-offs between precision, accuracy and burden in the collection of computer task-based exposure measures and the different underlying constructs captured by each method.

  4. Task analysis and support for problem solving tasks

    International Nuclear Information System (INIS)

    Bainbridge, L.

    1987-01-01

    This paper is concerned with Task Analysis as the basis for ergonomic design to reduce human error rates, rather than for predicting human error rates. Task Analysis techniques usually provide a set of categories for describing sub-tasks, and a framework describing the relations between sub-tasks. Both the task type categories and their organisation have implications for optimum interface and training design. In this paper, the framework needed for considering the most complex tasks faced by operators in process industries, such as fault management in unexpected situations, is discussed, together with what is likely to minimise human error in these circumstances. (author)

  5. An Interaction of Screen Colour and Lesson Task in CAL

    Science.gov (United States)

    Clariana, Roy B.

    2004-01-01

    Colour is a common feature in computer-aided learning (CAL), though the instructional effects of screen colour are not well understood. This investigation considers the effects of different CAL study tasks with feedback on posttest performance and on posttest memory of the lesson colour scheme. Graduate students (n=68) completed a computer-based…

  6. Effects of noise and task loading on a communication task

    Science.gov (United States)

    Orrell, Dean H., II

    Previous research had shown the effect of noise on a single communication task. This research has been criticized as not being representative of a real world situation, since subjects allocated all of their attention to only one task. In the present study, the effect of adding a loading task to a standard noise-communication paradigm was investigated. Subjects performed both a communication task (Modified Rhyme Test; House et al. 1965) and a short term memory task (Sternberg, 1969) in simulated levels of aircraft noise (95, 105 and 115 dB overall sound pressure level (OASPL)). Task loading was varied with Sternberg's task by requiring subjects to memorize one, four, or six alphanumeric characters. Simulated aircraft noise was varied between levels of 95, 105 and 115 dB OASPL using a pink noise source. Results show that the addition of Sternberg's task had little effect on the intelligibility of the communication task, while response time for the communication task increased.

  7. Quantifying graininess of glossy food products

    DEFF Research Database (Denmark)

    Møller, Flemming; Carstensen, Jens Michael

    The sensory quality of yoghurt can be altered when changing the milk composition or processing conditions. Part of the sensory quality may be assessed visually. A non-contact method for quantifying surface gloss and grains in yoghurt is described. It was found that the standard...

  8. Quantifying antimicrobial resistance at veal calf farms

    NARCIS (Netherlands)

    Bosman, A.B.; Wagenaar, J.A.; Stegeman, A.; Vernooij, H.; Mevius, D.J.

    2012-01-01

    This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From

  9. QS Spiral: Visualizing Periodic Quantified Self Data

    DEFF Research Database (Denmark)

    Larsen, Jakob Eg; Cuttone, Andrea; Jørgensen, Sune Lehmann

    2013-01-01

    In this paper we propose an interactive visualization technique QS Spiral that aims to capture the periodic properties of quantified self data and let the user explore those recurring patterns. The approach is based on time-series data visualized as a spiral structure. The interactivity includes ...

  10. Quantifying recontamination through factory environments - a review

    NARCIS (Netherlands)

    Asselt-den Aantrekker, van E.D.; Boom, R.M.; Zwietering, M.H.; Schothorst, van M.

    2003-01-01

    Recontamination of food products can be the origin of foodborne illnesses and should therefore be included in quantitative microbial risk assessment (MRA) studies. In order to do this, recontamination should be quantified using predictive models. This paper gives an overview of the relevant

  11. Quantifying quantum coherence with quantum Fisher information.

    Science.gov (United States)

    Feng, X N; Wei, L F

    2017-11-14

    Quantum coherence is one of the old but always important concepts in quantum mechanics, and now it has been regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has received attention only recently (see, e.g., Baumgratz et al., Phys. Rev. Lett. 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under typical incoherent operations and convexity under mixing of quantum states. Differing from most of the purely axiomatic methods, quantifying quantum coherence by QFI could be experimentally testable, as the bound of the QFI is practically measurable. The validity of our proposal is specifically demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparing it with the other quantifying methods proposed previously.
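
    As a worked illustration of the idea (not the authors' code), the sketch below evaluates the QFI of a single-qubit state for the phase generator A = sigma_z/2 using the standard spectral formula, and shows that it decreases along a phase-damping trajectory, consistent with the monotonicity requirement; for this one-parameter family the printed values fall from 1 toward 0.

```python
# Sketch (not the paper's code): QFI of a qubit state for A = sigma_z / 2 via
# the spectral formula F_Q = 2 * sum_{k,l} (lam_k - lam_l)^2 / (lam_k + lam_l) * |<k|A|l>|^2.
import numpy as np

def qfi(rho, A, tol=1e-12):
    lam, vec = np.linalg.eigh(rho)
    F = 0.0
    for k in range(len(lam)):
        for l in range(len(lam)):
            s = lam[k] + lam[l]
            if s > tol:
                amp = vec[:, k].conj() @ A @ vec[:, l]
                F += 2 * (lam[k] - lam[l]) ** 2 / s * abs(amp) ** 2
    return F

sz = np.array([[1, 0], [0, -1]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)     # |+>, maximally coherent in the z basis
rho0 = np.outer(plus, plus.conj())

def phase_damp(rho, p):
    """Phase damping: off-diagonal elements shrink by sqrt(1 - p)."""
    out = rho.copy()
    out[0, 1] *= np.sqrt(1 - p)
    out[1, 0] *= np.sqrt(1 - p)
    return out

for p in (0.0, 0.3, 0.6, 0.9):
    print(f"p = {p:.1f}  F_Q = {qfi(phase_damp(rho0, p), sz / 2):.3f}")
```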

  12. Interbank exposures: quantifying the risk of contagion

    OpenAIRE

    C. H. Furfine

    1999-01-01

    This paper examines the likelihood that failure of one bank would cause the subsequent collapse of a large number of other banks. Using unique data on interbank payment flows, the magnitude of bilateral federal funds exposures is quantified. These exposures are used to simulate the impact of various failure scenarios, and the risk of contagion is found to be economically small.
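
    The kind of failure-cascade simulation described can be sketched as follows. The data here are synthetic, and the failure rule (a bank fails once its losses on exposures to already-failed counterparties exceed its capital) is a common simplification, not necessarily the author's exact specification.

```python
# Illustrative contagion simulation (synthetic data, not the paper's): bank j
# fails if its losses on exposures to already-failed banks exceed its capital.
import numpy as np

def cascade(exposure, capital, initial_failure, loss_given_default=1.0):
    n = len(capital)
    failed = {initial_failure}
    changed = True
    while changed:
        changed = False
        for j in range(n):
            if j in failed:
                continue
            loss = loss_given_default * sum(exposure[j, i] for i in failed)
            if loss > capital[j]:
                failed.add(j)
                changed = True
    return failed

rng = np.random.default_rng(0)
n = 20
exposure = rng.exponential(scale=5.0, size=(n, n)) * (rng.random((n, n)) < 0.2)
np.fill_diagonal(exposure, 0.0)           # exposure[j, i]: bank j's claim on bank i
capital = rng.uniform(10, 30, size=n)

for bank in range(n):
    extra = len(cascade(exposure, capital, bank)) - 1
    if extra:
        print(f"failure of bank {bank} drags down {extra} others")
```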

  13. Quantifying Productivity Gains from Foreign Investment

    NARCIS (Netherlands)

    C. Fons-Rosen (Christian); S. Kalemli-Ozcan (Sebnem); B.E. Sorensen (Bent); C. Villegas-Sanchez (Carolina)

    2013-01-01

    textabstractWe quantify the causal effect of foreign investment on total factor productivity (TFP) using a new global firm-level database. Our identification strategy relies on exploiting the difference in the amount of foreign investment by financial and industrial investors and simultaneously

  14. Power Curve Measurements, quantify the production increase

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

    The purpose of this report is to quantify the production increase on a given turbine with respect to another given turbine. The methodology used is the “side by side” comparison method provided by the client. This method involves the use of two neighboring turbines and it is based...

  15. Quantifying capital goods for waste landfilling

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Stentsøe, Steen; Willumsen, Hans Christian

    2013-01-01

    Materials and energy used for construction of a hill-type landfill of 4 million m3 were quantified in detail. The landfill is engineered with a liner and leachate collections system, as well as a gas collection and control system. Gravel and clay were the most common materials used, amounting...

  16. Quantifying interspecific coagulation efficiency of phytoplankton

    DEFF Research Database (Denmark)

    Hansen, J.L.S.; Kiørboe, Thomas

    1997-01-01

    . nordenskjoeldii. Mutual coagulation between Skeletonema costatum and the non-sticky cells of Ditylum brightwellii also proceeded with half the efficiency of S. costatum alone. The latex beads were suitable to be used as 'standard particles' to quantify the ability of phytoplankton to prime aggregation...

  17. New frontiers of quantified self 2

    DEFF Research Database (Denmark)

    Rapp, Amon; Cena, Federica; Kay, Judy

    2016-01-01

    While the Quantified Self (QS) community is described in terms of "self-knowledge through numbers" people are increasingly demanding value and meaning. In this workshop we aim at refocusing the QS debate on the value of data for providing new services....

  18. Quantifying temporal ventriloquism in audiovisual synchrony perception

    NARCIS (Netherlands)

    Kuling, I.A.; Kohlrausch, A.G.; Juola, J.F.

    2013-01-01

    The integration of visual and auditory inputs in the human brain works properly only if the components are perceived in close temporal proximity. In the present study, we quantified cross-modal interactions in the human brain for audiovisual stimuli with temporal asynchronies, using a paradigm from

  19. Reliability-How to Quantify and Improve?

    Indian Academy of Sciences (India)

    Reliability - How to Quantify and Improve? - Improving the Health of Products. N K Srinivasan. Resonance – Journal of Science Education, General Article, Volume 5, Issue 5, May 2000, pp 55-63.

  20. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new systems concepts made possible by the hardware development are explained. (orig./AK) [de

  1. Opening the Frey/Osborne Black Box: Which Tasks of a Job are Susceptible to Computerization?

    OpenAIRE

    Brandes, Philipp; Wattenhofer, Roger

    2016-01-01

    In their seminal paper, Frey and Osborne quantified the automation of jobs, by assigning each job in the O*NET database a probability to be automated. In this paper, we refine their results in the following way: Every O*NET job consists of a set of tasks, and these tasks can be related. We use a linear program to assign probabilities to tasks, such that related tasks have a similar probability and the tasks can explain the computerization probability of a job. Analyzing jobs on the level of t...
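
    A toy version of such a linear program can be written with scipy. The objective and constraints below are illustrative assumptions in the spirit of the description (task probabilities must average to each job's automation probability while related tasks stay close to each other); they are not the authors' formulation.

```python
# Toy LP: task probabilities average to each job's automation probability,
# while related tasks are kept as similar as possible (illustrative only).
import numpy as np
from scipy.optimize import linprog

tasks = ["t0", "t1", "t2", "t3"]
jobs = {"job A": ([0, 1], 0.8),          # (task indices, job automation probability)
        "job B": ([2, 3], 0.2)}
related = [(1, 2)]                       # pairs of related tasks

nt, nd = len(tasks), len(related)
c = np.concatenate([np.zeros(nt), np.ones(nd)])      # minimize sum of |differences|

# Equality: mean of a job's task probabilities equals the job probability.
A_eq, b_eq = [], []
for idx, p_job in jobs.values():
    row = np.zeros(nt + nd)
    row[idx] = 1.0 / len(idx)
    A_eq.append(row); b_eq.append(p_job)

# |p_a - p_b| <= d encoded as two <= constraints per related pair.
A_ub, b_ub = [], []
for k, (a, b) in enumerate(related):
    for sign in (+1, -1):
        row = np.zeros(nt + nd)
        row[a], row[b], row[nt + k] = sign, -sign, -1.0
        A_ub.append(row); b_ub.append(0.0)

bounds = [(0, 1)] * nt + [(0, None)] * nd
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(dict(zip(tasks, np.round(res.x[:nt], 3))))
```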

  2. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    classical information theory and, arguably, quantum from classical physics. Basic quantum information ideas are next outlined, including qubits and data compression, quantum gates, the 'no cloning' property and teleportation. Quantum cryptography is briefly sketched. The universal quantum computer (QC) is described, based on the Church-Turing principle and a network model of computation. Algorithms for such a computer are discussed, especially those for finding the period of a function, and searching a random list. Such algorithms prove that a QC of sufficiently precise construction is not only fundamentally different from any computer which can only manipulate classical information, but can compute a small class of functions with greater efficiency. This implies that some important computational tasks are impossible for any device apart from a QC. To build a universal QC is well beyond the abilities of current technology. However, the principles of quantum information physics can be tested on smaller devices. The current experimental situation is reviewed, with emphasis on the linear ion trap, high-Q optical cavities, and nuclear magnetic resonance methods. These allow coherent control in a Hilbert space of eight dimensions (three qubits) and should be extendable up to a thousand or more dimensions (10 qubits). Among other things, these systems will allow the feasibility of quantum computing to be assessed. In fact such experiments are so difficult that it seemed likely until recently that a practically useful QC (requiring, say, 1000 qubits) was actually ruled out by considerations of experimental imprecision and the unavoidable coupling between any system and its environment. However, a further fundamental part of quantum information physics provides a solution to this impasse. This is quantum error correction (QEC). An introduction to QEC is provided. The evolution of the QC is restricted to a carefully chosen subspace of its Hilbert space. Errors are almost certain to

  4. THE CASE STUDY TASKS AS A BASIS FOR THE FUND OF THE ASSESSMENT TOOLS AT THE MATHEMATICAL ANALYSIS FOR THE DIRECTION 01.03.02 APPLIED MATHEMATICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Dina Aleksandrovna Kirillova

    2015-12-01

    Full Text Available The modern reform of Russian higher education involves the implementation of a competence-based approach, the main idea of which is the practical orientation of education. Mathematics is a universal language for describing, modeling and studying phenomena and processes of different natures. Creating a fund of assessment tools for mathematical disciplines based on applied problems is therefore relevant. The case method is the most appropriate means of monitoring learning outcomes, as it is aimed at bridging the gap between theory and practice. The aim of the research is the development of methodical materials for creating a fund of assessment tools based on case studies for the mathematical analysis course in the direction «Applied Mathematics and Computer Science». This aim follows from the contradiction between the need to introduce the case method into the higher-education process and the insufficient study of the theoretical foundations of applying this method to mathematical disciplines, including the lack of a theoretical basis and of a description of the process of creating case problems for use in monitoring learning outcomes.

  5. Efficient task assignment in spatial crowdsourcing with worker and task privacy protection

    KAUST Repository

    Liu, An

    2017-08-01

    Spatial crowdsourcing (SC) outsources tasks to a set of workers who are required to physically move to specified locations and accomplish tasks. Recently, it has emerged as a promising tool for emergency management, as it enables efficient and cost-effective collection of critical information in emergencies such as earthquakes, when searching for and rescuing survivors in potentially affected areas is required. However, in current SC systems, task locations and worker locations are all exposed in public without any privacy protection. SC systems, if attacked, thus carry a potential risk of privacy leakage. In this paper, we propose a protocol for protecting the privacy of both workers and task requesters while maintaining the functionality of SC systems. The proposed protocol is built on partially homomorphic encryption schemes, and can efficiently realize complex operations required during task assignment over encrypted data through a well-designed computation strategy. We prove that the proposed protocol is privacy-preserving against semi-honest adversaries. Simulation on two real-world datasets shows that the proposed protocol is more effective than existing solutions and can achieve mutual privacy preservation with acceptable computation and communication cost.
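
    The building block behind such protocols is additive homomorphism: the platform can combine ciphertexts so that the result decrypts to the sum of the plaintexts, without ever seeing locations in the clear. The sketch below is a toy Paillier implementation with deliberately insecure parameters, meant only to illustrate that property; it is not the paper's protocol.

```python
# Toy Paillier (additively homomorphic) sketch with small, insecure parameters,
# only to illustrate how encrypted values can be combined without decryption.
import math
import random

p, q = 1019, 1151                       # toy primes (far too small for real use)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                    # valid because g = n + 1 is used below

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    u = pow(c, lam, n2)
    return (((u - 1) // n) * mu) % n

def add_encrypted(c1, c2):
    """E(m1) * E(m2) mod n^2 decrypts to m1 + m2 (mod n)."""
    return (c1 * c2) % n2

# A worker encrypts the squared components of its distance to a task location;
# the platform adds them without learning either party's coordinates.
dx2, dy2 = 9, 16
enc_sum = add_encrypted(encrypt(dx2), encrypt(dy2))
assert decrypt(enc_sum) == dx2 + dy2     # squared distance = 25
print("decrypted squared distance:", decrypt(enc_sum))
```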

  6. Building a cluster computer for the computing grid of tomorrow

    International Nuclear Information System (INIS)

    Wezel, J. van; Marten, H.

    2004-01-01

    The Grid Computing Centre Karlsruhe takes part in the development, test and deployment of hardware and cluster infrastructure, grid computing middleware, and applications for particle physics. The construction of a large cluster computer with thousands of nodes and several PB of data storage capacity is a major task and focus of research. CERN-based accelerator experiments will use GridKa, one of only eight worldwide Tier-1 computing centers, for their huge computing demands. Computing and storage are already provided for several other running physics experiments on the exponentially expanding cluster. (orig.)

  7. Children's Task Engagement during Challenging Puzzle Tasks

    Science.gov (United States)

    Wang, Feihong; Algina, James; Snyder, Patricia; Cox, Martha

    2017-01-01

    We examined children's task engagement during a challenging puzzle task in the presence of their primary caregivers by using a representative sample of rural children from six high-poverty counties across two states. Weighted longitudinal confirmatory factor analysis and structural equation modeling were used to identify a task engagement factor…

  8. Practice makes perfect: familiarity of task determines success in solvable tasks for free-ranging dogs (Canis lupus familiaris).

    Science.gov (United States)

    Bhattacharjee, Debottam; Dasgupta, Sandipan; Biswas, Arpita; Deheria, Jayshree; Gupta, Shreya; Nikhil Dev, N; Udell, Monique; Bhadra, Anindita

    2017-07-01

    Domestic dogs' (Canis lupus familiaris) socio-cognitive faculties have made them highly sensitive to human social cues. While dogs often excel at understanding human communicative gestures, they perform comparatively poorly in problem-solving and physical reasoning tasks. This difference in their behaviour could be due to the lifestyle and intense socialization, where problem solving and physical cognition are less important than social cognition. Free-ranging dogs live in human-dominated environments, not under human supervision and are less socialized. Being scavengers, they often encounter challenges where problem solving is required in order to get access to food. We tested Indian street dogs in familiar and unfamiliar independent solvable tasks and quantified their persistence and dependence on a novel human experimenter, in addition to their success in solving a task. Our results indicate that free-ranging dogs succeeded and persisted more in the familiar task as compared to the unfamiliar one. They showed negligible amount of human dependence in the familiar task, but showed prolonged gazing and considerable begging behaviour to the human experimenter in the context of the unfamiliar task. Cognitive abilities of free-ranging dogs thus play a pivotal role in determining task-associated behaviours based on familiarity. In addition to that, these dogs inherently tend to socialize with and depend on humans, even if they are strangers. Our results also illustrate free-ranging dogs' low competence at physical cognitive tasks.

  9. Computing in the Clouds

    Science.gov (United States)

    Johnson, Doug

    2010-01-01

    Web-based applications offer teachers, students, and school districts a convenient way to accomplish a wide range of tasks, from accounting to word processing, for free. Cloud computing has the potential to offer staff and students better services at a lower cost than the technology deployment models they're using now. Saving money and improving…

  10. Using heteroclinic orbits to quantify topological entropy in fluid flows

    International Nuclear Information System (INIS)

    Sattari, Sulimon; Chen, Qianting; Mitchell, Kevin A.

    2016-01-01

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or “ghost,” rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.
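
    The "direct stretching rate" route mentioned above can be sketched numerically: advect a material line, keep it resolved by inserting points as it stretches, and fit the exponential growth rate of its length. The example below uses the standard periodically driven double-gyre flow as a stand-in for the paper's double-vortex cavity flow, so the parameters and the resulting rate are purely illustrative.

```python
# Sketch of a stretching-rate estimate of topological entropy, applied to the
# periodically driven double-gyre flow (a stand-in, not the paper's flow).
import numpy as np

A, eps, om = 0.1, 0.25, 2 * np.pi / 10.0          # common double-gyre parameters

def velocity(x, y, t):
    a, b = eps * np.sin(om * t), 1 - 2 * eps * np.sin(om * t)
    f = a * x**2 + b * x
    dfdx = 2 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

def advect(pts, t, dt):
    u, v = velocity(pts[:, 0], pts[:, 1], t)       # simple Euler step
    return pts + dt * np.column_stack([u, v])

def refine(pts, max_seg=0.02):
    """Insert midpoints so the polyline stays resolved as it stretches."""
    out = [pts[0]]
    for a, b in zip(pts[:-1], pts[1:]):
        if np.linalg.norm(b - a) > max_seg:
            out.append((a + b) / 2)
        out.append(b)
    return np.array(out)

def length(pts):
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

# Initial material line: a short horizontal segment inside the [0,2] x [0,1] domain.
pts = np.column_stack([np.linspace(0.9, 1.1, 50), np.full(50, 0.5)])
dt, T = 0.05, 20.0
times, lengths, t = [], [], 0.0
while t < T:
    pts = refine(advect(pts, t, dt))
    t += dt
    times.append(t)
    lengths.append(length(pts))

half = len(times) // 2                             # skip the early transient
rate = np.polyfit(times[half:], np.log(lengths[half:]), 1)[0]
print(f"line-stretching estimate of topological entropy: {rate:.3f} per time unit")
```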

  11. Checkpointing for a hybrid computing node

    Science.gov (United States)

    Cher, Chen-Yong

    2016-03-08

    According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.
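
    A host-side analogue of the described flow, with the accelerator and its bus replaced by a thread and a slow copy, looks roughly like the sketch below. It only illustrates the overlap of checkpoint transfer with continued execution; it is not the patented mechanism.

```python
# Illustrative analogue (not the patented method): snapshot the task state into
# "accelerator-local" memory, resume immediately, and copy the snapshot to
# "host" memory in the background while the task keeps running.
import copy
import threading
import time

host_checkpoints = []                    # stands in for main-processor memory

def transfer_to_host(snapshot):
    time.sleep(0.05)                     # pretend the copy over the bus is slow
    host_checkpoints.append(snapshot)

def accelerator_task(steps=10, checkpoint_every=3):
    state = {"step": 0, "acc": 0.0}
    transfers = []
    for i in range(1, steps + 1):
        state["step"], state["acc"] = i, state["acc"] + i * i   # do some work
        if i % checkpoint_every == 0:
            local_ckpt = copy.deepcopy(state)       # checkpoint in "local memory"
            t = threading.Thread(target=transfer_to_host, args=(local_ckpt,))
            t.start()                               # copy overlaps with compute
            transfers.append(t)
    for t in transfers:
        t.join()
    return state

final = accelerator_task()
print("final state:", final)
print("checkpoints on host:", [c["step"] for c in host_checkpoints])
```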

  12. Blink activity and task difficulty.

    Science.gov (United States)

    Tanaka, Y; Yamaoka, K

    1993-08-01

    This study investigated the relationship between task difficulty and blink activity, which includes blink rate, blink amplitude, and blink duration. Two kinds of tasks established two levels of difficulty. In Exp. 1, a mental arithmetic task was used to examine the relationship. Analysis showed that blink rate for a difficult task was significantly higher than that for an easier one. In Exp. 2, a letter-search task (hiragana Japanese alphabet) was used while the other conditions were the same as those in Exp. 1; however, the results of this experiment were not influenced by the difficulty of the task. As the results indicate that blink rate is related not only to difficulty but also to the nature of the task, the effect of task type probably depends on a mechanism of information processing. The results for blink amplitude and blink duration showed no systematic change during either experiment.

  13. Modeling Network Interdiction Tasks

    Science.gov (United States)

    2015-09-17

    Computation times and accuracy measures are reported for weighted, 100-node random networks used in testing the GAND approach, implemented in Python. A common approach to modeling network interdiction is to formulate the problem in terms of a two-stage strategic game between two

  14. Masticatory muscle activity during deliberately performed oral tasks

    International Nuclear Information System (INIS)

    Farella, M; Palla, S; Erni, S; Gallo, L M; Michelotti, A

    2008-01-01

    The aim of this study was to investigate masticatory muscle activity during deliberately performed functional and non-functional oral tasks. Electromyographic (EMG) surface activity was recorded unilaterally from the masseter, anterior temporalis and suprahyoid muscles in 11 subjects (5 men, 6 women; age = 34.6 ± 10.8 years), who were carefully instructed to perform 30 different oral tasks under computer guidance using task markers. Data were analyzed by descriptive statistics, repeated measurements analysis of variance (ANOVA) and hierarchical cluster analysis. The maximum EMG amplitude of the masseter and anterior temporalis muscles was more often found during hard chewing tasks than during maximum clenching tasks. The relative contribution of the masseter and anterior temporalis changed across the tasks examined (F ≥ 5.2; p ≤ 0.001). The masseter muscle was significantly (p ≤ 0.05) more active than the anterior temporalis muscle during tasks involving incisal biting, jaw protrusion, laterotrusion and jaw cupping. The anterior temporalis muscle was significantly (p ≤ 0.01) more active than the masseter muscle during tasks performed in intercuspal position, during tooth grinding, and during hard chewing on the working side. Based upon the relative contribution of the masseter, anterior temporalis, and suprahyoid muscles, the investigated oral tasks could be grouped into six separate clusters. The findings provided further insight into muscle- and task-specific EMG patterns during functional and non-functional oral behaviors

  15. An opportunity cost model of subjective effort and task performance

    Science.gov (United States)

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775

  16. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
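
    A minimal software analogue of the paradigm (an echo state network rather than the optical single-node delay loop) makes the key point concrete: the reservoir is fixed and random, and only a linear readout is trained. The memory task and parameters below are illustrative assumptions, not the benchmarks used in the paper.

```python
# Minimal software reservoir (echo state network): only the readout is trained.
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 200, 2000, 200

# Input: random sequence; target: the input delayed by 3 steps (a memory task).
u = rng.uniform(-1, 1, T)
y_target = np.roll(u, 3)

W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])              # nonlinear reservoir update
    states[t] = x

# Ridge regression for the readout weights, skipping the washout period.
X, Y = states[washout:], y_target[washout:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)
mse = np.mean((X @ W_out - Y) ** 2)
print(f"readout training MSE on 3-step recall: {mse:.2e}")
```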

  17. Biology task group

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    The accomplishments of the task group studies over the past year are reviewed. The purposes of biological investigations, in the context of subseabed disposal, are: an evaluation of the dose to man; an estimation of effects on the ecosystem; and an estimation of the influence of organisms on and as barriers to radionuclide migration. To accomplish these ends, the task group adopted the following research goals: (1) acquire more data on biological accumulation of specific radionuclides, such as those of Tc, Np, Ra, and Sr; (2) acquire more data on transfer coefficients from sediment to organism; (3) Calculate mass transfer rates, construct simple models using them, and estimate collective dose commitment; (4) Identify specific pathways or transfer routes, determine the rates of transfer, and make dose limit calculations with simple models; (5) Calculate dose rates to and estimate irradiation effects on the biota as a result of waste emplacement, by reference to background irradiation calculations. (6) Examine the effect of the biota on altering sediment/water radionuclide exchange; (7) Consider the biological data required to address different accident scenarios; (8) Continue to provide the basic biological information for all of the above, and ensure that the system analysis model is based on the most realistic and up-to-date concepts of marine biologists; and (9) Ensure by way of free exchange of information that the data used in any model are the best currently available

  18. Quantifying Stock Return Distributions in Financial Markets.

    Science.gov (United States)

    Botta, Federico; Moat, Helen Susannah; Stanley, H Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second-by-second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power-law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales.
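
    On synthetic data, the tail analysis described above can be sketched as follows: compute logarithmic returns at a given time scale and estimate the power-law tail exponent with a Hill estimator. The series and parameters are invented for illustration; the paper's second-by-second Dow Jones data are not reproduced here.

```python
# Sketch with synthetic data (not the paper's dataset): log returns at several
# time scales and a simple Hill estimate of the tail exponent of |returns|.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic heavy-tailed "price" series: Student-t increments (df = 3).
increments = rng.standard_t(df=3, size=100_000) * 1e-4
prices = 100 * np.exp(np.cumsum(increments))

def log_returns(prices, scale):
    return np.log(prices[scale:] / prices[:-scale])

def hill_exponent(abs_returns, k=500):
    """Hill estimator: tail index from the k largest absolute returns."""
    s = np.sort(abs_returns)
    threshold = s[-(k + 1)]
    return 1.0 / np.mean(np.log(s[-k:] / threshold))

for scale in (60, 300, 3600):
    r = np.abs(log_returns(prices, scale))
    print(f"scale {scale:>5}: tail exponent ~ {hill_exponent(r):.2f}")
```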

  19. A masking index for quantifying hidden glitches

    OpenAIRE

    Berti-Equille, Laure; Loh, J. M.; Dasu, T.

    2015-01-01

    Data glitches are errors in a dataset. They are complex entities that often span multiple attributes and records. When they co-occur in data, the presence of one type of glitch can hinder the detection of another type of glitch. This phenomenon is called masking. In this paper, we define two important types of masking and propose a novel, statistically rigorous indicator called masking index for quantifying the hidden glitches. We outline four cases of masking: outliers masked by missing valu...

  20. How are the catastrophical risks quantifiable

    International Nuclear Information System (INIS)

    Chakraborty, S.

    1985-01-01

    For the assessment and evaluation of industrial risks, the question must be asked how catastrophic risks can be quantified. Typical real catastrophic risks and risk assessments based on modelling assumptions have been placed against each other in order to put the risks into proper perspective. However, society is risk-averse when there is a catastrophic potential of severe accidents in a large-scale industrial facility, even though the probability of occurrence is extremely low. (orig.) [de