WorldWideScience

Sample records for additional computational effort

  1. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation
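    The overhead-apportionment rule described in the abstract is straightforward ratio proration: each project's share of overhead equals its share of total direct hours. A minimal sketch of that arithmetic (the project names and hours below are hypothetical; the original CAN 2 program was written in FORTRAN IV):

```python
# Sketch of the CAN 2 overhead-apportionment rule: overhead hours are
# distributed to each project in proportion to that project's share of
# total direct work effort. Project names and hours are hypothetical.

def apportion_overhead(direct_hours: dict[str, float], overhead_hours: float) -> dict[str, float]:
    """Return total charged hours per project after prorating overhead."""
    total_direct = sum(direct_hours.values())
    return {
        project: hours + overhead_hours * (hours / total_direct)
        for project, hours in direct_hours.items()
    }

if __name__ == "__main__":
    direct = {"Surface Water": 320.0, "General Investigations": 480.0}
    charged = apportion_overhead(direct, overhead_hours=100.0)
    # Surface Water: 320 + 100 * 0.4 = 360; General Investigations: 480 + 100 * 0.6 = 540
    print(charged)
```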

  2. Enhancing school-based asthma education efforts using computer-based education for children.

    Science.gov (United States)

    Nabors, Laura A; Kockritz, Jennifer L; Ludke, Robert L; Bernstein, Jonathan A

    2012-03-01

    Schools are an important site for delivery of asthma education programs. Computer-based educational programs are a critical component of asthma education programs and may be a particularly important education method in busy school environments. The objective of this brief report is to review and critique computer-based education efforts in schools. The results of our literature review indicated that school-based computer education efforts are related to improved knowledge about asthma and its management. In some studies, improvements in clinical outcomes also occur. Data collection programs need to be built into games that improve knowledge. Many projects do not appear to last for periods greater than 1 year and little information is available about cultural relevance of these programs. Educational games and other programs are effective methods of delivering knowledge about asthma management and control. Research about the long-term effects of this increased knowledge, in regard to behavior change, is needed. Additionally, developing sustainable projects, which are culturally relevant, is a goal for future research.

  3. NASA OSMA NDE Program Additive Manufacturing Foundational Effort

    Science.gov (United States)

    Waller, Jess; Walker, James; Burke, Eric; Wells, Douglas; Nichols, Charles

    2016-01-01

    NASA is providing key leadership in an international effort linking NASA and non-NASA resources to speed adoption of additive manufacturing (AM) to meet NASA's mission goals. Participants include industry, NASA's space partners, other government agencies, standards organizations and academia. Nondestructive Evaluation (NDE) is identified as a universal need for all aspects of additive manufacturing.

  4. EU grid computing effort takes on malaria

    CERN Multimedia

    Lawrence, Stacy

    2006-01-01

    Malaria is the world's most common parasitic infection, affecting more than 500 million people annually and killing more than 1 million. In order to help combat malaria, CERN has launched a grid computing effort (1 page)

  5. Developing a framework for assessing muscle effort and postures during computer work in the field: The effect of computer activities on neck/shoulder muscle effort and postures

    NARCIS (Netherlands)

    Garza, J.L.B.; Eijckelhof, B.H.W.; Johnson, P.W.; Raina, S.M.; Rynell, P.; Huysmans, M.A.; Dieën, J.H. van; Beek, A.J. van der; Blatter, B.M.; Dennerlein, J.T.

    2012-01-01

    The present study, a part of the PROOF (PRedicting Occupational biomechanics in OFfice workers) study, aimed to determine whether trapezius muscle effort was different across computer activities in a field study of computer workers, and also investigated whether head and shoulder postures were different across these activities.

  6. Developing a framework for assessing muscle effort and postures during computer work in the field: the effect of computer activities on neck/shoulder muscle effort and postures

    NARCIS (Netherlands)

    Bruno-Garza, J.L.; Eijckelhof, B.H.W.; Johnson, P.W.; Raina, S.M.; Rynell, P.; Huijsmans, M.A.; van Dieen, J.H.; van der Beek, A.J.; Blatter, B.M.; Dennerlein, J.T.

    2012-01-01

    The present study, a part of the PROOF (PRedicting Occupational biomechanics in OFfice workers) study, aimed to determine whether trapezius muscle effort was different across computer activities in a field study of computer workers, and also investigated whether head and shoulder postures were different across these activities.

  7. Defense Additive Manufacturing: DOD Needs to Systematically Track Department-wide 3D Printing Efforts

    Science.gov (United States)

    2015-10-01

    [Garbled extraction; recoverable metadata only: report title "Defense Additive Manufacturing: DOD Needs to Systematically Track Department-wide 3D Printing Efforts," 2015.]

  8. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    Science.gov (United States)

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

    Motivational impairments are a core feature of schizophrenia, and although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials where they could choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less than controls for trials of high, but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.
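    For readers unfamiliar with the paradigm, the core of each trial is a cost-benefit choice between a low-effort/low-reward and a high-effort/high-reward option whose payout is probabilistic. A minimal sketch of the expected-value comparison implicit in such a trial; the reward magnitudes, effort cost and softmax decision rule are illustrative assumptions, not the task's published parameters:

```python
import math

# One EEfRT-style trial: choose easy (small reward) vs hard (larger reward),
# where reward is paid only with the stated probability (12%, 50% or 88%).
# Reward magnitudes, effort cost and softmax temperature are assumptions.

def choice_prob_hard(easy_reward: float, hard_reward: float,
                     win_prob: float, effort_cost: float,
                     temperature: float = 1.0) -> float:
    """Probability of choosing the hard option under a softmax over expected values."""
    ev_easy = win_prob * easy_reward
    ev_hard = win_prob * hard_reward - effort_cost
    return 1.0 / (1.0 + math.exp(-(ev_hard - ev_easy) / temperature))

for p in (0.12, 0.50, 0.88):
    print(p, round(choice_prob_hard(1.0, 4.0, p, effort_cost=1.5), 3))
# Higher win probability raises the expected value of the hard option,
# so a reward-sensitive chooser selects it more often at 50% and 88%.
```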

  9. Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort

    Directory of Open Access Journals (Sweden)

    Eliana Vassena

    2017-06-01

    In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.

  10. Motivational beliefs, student effort, and feedback behaviour in computer-based formative assessment

    NARCIS (Netherlands)

    Timmers, C.F.; Braber-van den Broek, J.; van den Berg, Stéphanie Martine

    2013-01-01

    Feedback can only be effective when students seek feedback and process it. This study examines the relations between students' motivational beliefs, effort invested in a computer-based formative assessment, and feedback behaviour. Feedback behaviour is represented by whether a student seeks feedback

  11. DARPA-funded efforts in the development of novel brain-computer interface technologies.

    Science.gov (United States)

    Miranda, Robbin A; Casebeer, William D; Hein, Amy M; Judy, Jack W; Krotkov, Eric P; Laabs, Tracy L; Manzo, Justin E; Pankratz, Kent G; Pratt, Gill A; Sanchez, Justin C; Weber, Douglas J; Wheeler, Tracey L; Ling, Geoffrey S F

    2015-04-15

    The Defense Advanced Research Projects Agency (DARPA) has funded innovative scientific research and technology developments in the field of brain-computer interfaces (BCI) since the 1970s. This review highlights some of DARPA's major advances in the field of BCI, particularly those made in recent years. Two broad categories of DARPA programs are presented with respect to the ultimate goals of supporting the nation's warfighters: (1) BCI efforts aimed at restoring neural and/or behavioral function, and (2) BCI efforts aimed at improving human training and performance. The programs discussed are synergistic and complementary to one another, and, moreover, promote interdisciplinary collaborations among researchers, engineers, and clinicians. Finally, this review includes a summary of some of the remaining challenges for the field of BCI, as well as the goals of new DARPA efforts in this domain. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  12. The Effort Paradox: Effort Is Both Costly and Valued.

    Science.gov (United States)

    Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y

    2018-04-01

    According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Technology and Sexuality--What's the Connection? Addressing Youth Sexualities in Efforts to Increase Girls' Participation in Computing

    Science.gov (United States)

    Ashcraft, Catherine

    2015-01-01

    To date, girls and women are significantly underrepresented in computer science and technology. Concerns about this underrepresentation have sparked a wealth of educational efforts to promote girls' participation in computing, but these programs have demonstrated limited impact on reversing current trends. This paper argues that this is, in part,…

  14. Efforts to transform computers reach milestone

    CERN Multimedia

    Johnson, G

    2001-01-01

    Scientists in San Jose, California, have performed the most complex calculation ever using a quantum computer - factoring the number 15. In contrast to the switches in conventional computers, which although tiny consist of billions of atoms, quantum computations are carried out by manipulating single atoms. The laws of quantum mechanics which govern these actions in fact mean that multiple computations can be done in parallel, which would drastically cut down the time needed to carry out very complex calculations.
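    The news item does not name the algorithm, but factoring on a quantum computer is standardly done with Shor's algorithm, in which the quantum hardware performs order finding and a classical computer finishes the factoring. A sketch of that classical reduction for 15, with the quantum step emulated by brute force:

```python
from math import gcd

# Classical post-processing at the heart of Shor's factoring algorithm.
# The order-finding step is the quantum computer's job in the real
# algorithm; here it is emulated by brute force for illustration.

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r congruent to 1 (mod n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int) -> tuple[int, int] | None:
    g = gcd(a, n)
    if g != 1:
        return g, n // g            # lucky guess: a shares a factor with n
    r = order(a, n)                 # quantum step in the real algorithm
    if r % 2:                       # need an even order; retry with new a
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                  # trivial square root of 1; retry
        return None
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(15, 7))  # order of 7 mod 15 is 4 -> factors (3, 5)
```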

  15. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    Science.gov (United States)

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO

  16. The effect of sleep loss on next day effort.

    Science.gov (United States)

    Engle-Friedman, Mindy; Riela, Suzanne; Golan, Rama; Ventuneac, Ana M; Davis, Christine M; Jefferson, Angela D; Major, Donna

    2003-06-01

    The study had two primary objectives. The first was to determine whether sleep loss results in a preference for tasks demanding minimal effort. The second was to evaluate the quality of performance when participants, under conditions of sleep loss, have control over task demands. In experiment 1, using a repeated-measures design, 50 undergraduate college students were evaluated, following one night of no sleep loss and one night of sleep loss. The Math Effort Task (MET) presented addition problems via computer. Participants were able to select additions at one of five levels of difficulty. Less-demanding problems were selected and more additions were solved correctly when the participants were subject to sleep loss. In experiment 2, 58 undergraduate college students were randomly assigned to a no sleep deprivation or a sleep deprivation condition. Sleep-deprived participants selected less-demanding problems on the MET. Percentage correct on the MET was equivalent for both the non-sleep-deprived and sleep-deprived groups. On a task selection question, the sleep-deprived participants also selected significantly less-demanding non-academic tasks. Increased sleepiness, fatigue, and reaction time were associated with the selection of less difficult tasks. Both groups of participants reported equivalent effort expenditures; sleep-deprived participants did not perceive a reduction in effort. These studies demonstrate that sleep loss results in the choice of low-effort behavior that helps maintain accurate responding.

  17. "But it doesn't come naturally": how effort expenditure shapes the benefit of growth mindset on women's sense of intellectual belonging in computing

    Science.gov (United States)

    Stout, Jane G.; Blaney, Jennifer M.

    2017-10-01

    Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.

  18. Supercomputer and cluster performance modeling and analysis efforts: 2004-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold (Hal) Edward; Stevenson, Joel O.; Benner, Robert E., Jr.; Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  19. Effort in Multitasking: Local and Global Assessment of Effort.

    Science.gov (United States)

    Kiesel, Andrea; Dignath, David

    2017-01-01

    When performing multiple tasks in succession, self-organization of task order might be superior to externally controlled task schedules, because self-organization allows optimizing processing modes and thus reduces switch costs, and it increases commitment to task goals. However, self-organization is an additional executive control process that is not required if task order is externally specified, and as such it is considered time-consuming and effortful. To compare self-organized and externally controlled task scheduling, we suggest assessing global subjective and objective measures of effort in addition to local performance measures. In our new experimental approach, we combined characteristics of dual-tasking and task-switching settings and compared local and global measures of effort in a condition with free choice of task sequence and a condition with cued task sequence. In a multitasking environment, participants chose the task order while the task requirement of the not-yet-performed task remained the same. This task preview allowed participants to work on the previously non-chosen items in parallel and resulted in faster responses and fewer errors in task-switch trials than in task-repetition trials. The free-choice group profited more from this task preview than the cued group when considering local performance measures. Nevertheless, the free-choice group invested more effort than the cued group when considering global measures. Thus, self-organization in task scheduling seems to be effortful even in conditions in which it is beneficial for task processing. In a second experiment, we reduced the possibility of task preview for the not-yet-performed tasks in order to hinder efficient self-organization. Here neither local nor global measures revealed substantial differences between the free-choice and the cued task sequence conditions. Based on the results of both experiments, we suggest that global assessment of effort in addition to

  20. DOD Financial Management: Additional Efforts Needed to Improve Audit Readiness of Navy Military Pay and Other Related Activities

    Science.gov (United States)

    2015-09-01

    [Garbled extraction; recoverable fragments only: figure titles "Major Systems Involved in Processing and Reporting Navy Military Payroll" and "Management Representation Letter Timeline for the April 2013 Military Payroll Examination," from the 2015 report "DOD Financial Management: Additional Efforts Needed to Improve Audit Readiness of Navy Military Pay and Other Related Activities."]

  1. The Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  2. [Effects of nurses' perception of servant leadership on leader effectiveness, satisfaction and additional effort: focused on the mediating effects of leader trust and value congruence].

    Science.gov (United States)

    Han, Sang Sook; Kim, Nam Eun

    2012-02-01

    This study was done to examine the effects of nurses' perception of servant leadership on leader effectiveness, satisfaction and additional effort, with a focus on the mediating effects of leader trust and value congruence. Data were collected from 361 RN-BSN students and nurses participating in nationally attended in-service training programs. Data were analyzed using descriptive statistics and structural analysis with the SPSS 17.0 Windows program and Amos 7.0. Direct effects of nurses' perception of servant leadership were negative, but the mediating effects of trust and value congruence were positively correlated with leader effectiveness, satisfaction and additional effort; that is, servant leadership appears to work through these mediating factors. The study results indicate that if middle managers of nurses can build leader trust and value congruence among nurses through servant leadership, leader effectiveness, satisfaction and additional effort on the part of the nurses could result in a positive change in the long term.

  3. Computational Fluid Dynamics and Additive Manufacturing to Diagnose and Treat Cardiovascular Disease.

    Science.gov (United States)

    Randles, Amanda; Frakes, David H; Leopold, Jane A

    2017-11-01

    Noninvasive engineering models are now being used for diagnosing and planning the treatment of cardiovascular disease. Techniques in computational modeling and additive manufacturing have matured concurrently, and results from simulations can inform and enable the design and optimization of therapeutic devices and treatment strategies. The emerging synergy between large-scale simulations and 3D printing is having a two-fold benefit: first, 3D printing can be used to validate the complex simulations, and second, the flow models can be used to improve treatment planning for cardiovascular disease. In this review, we summarize and discuss recent methods and findings for leveraging advances in both additive manufacturing and patient-specific computational modeling, with an emphasis on new directions in these fields and remaining open questions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Polylactides in additive biomanufacturing.

    Science.gov (United States)

    Poh, Patrina S P; Chhaya, Mohit P; Wunner, Felix M; De-Juan-Pardo, Elena M; Schilling, Arndt F; Schantz, Jan-Thorsten; van Griensven, Martijn; Hutmacher, Dietmar W

    2016-12-15

    New advanced manufacturing technologies under the alias of additive biomanufacturing allow the design and fabrication of a range of products from pre-operative models, cutting guides and medical devices to scaffolds. The process of printing in 3 dimensions of cells, extracellular matrix (ECM) and biomaterials (bioinks, powders, etc.) to generate in vitro and/or in vivo tissue analogue structures has been termed bioprinting. To further advance additive biomanufacturing, there are many aspects that we can learn from the wider additive manufacturing (AM) industry, which has progressed tremendously since its introduction into the manufacturing sector. First, this review gives an overview of additive manufacturing and of both industry and academia efforts in addressing specific challenges in AM technologies to drive toward an AM-enabled industrial revolution. After that, considerations of poly(lactides) as a biomaterial in additive biomanufacturing are discussed. Challenges in the wider additive biomanufacturing field are discussed in terms of (a) biomaterials; (b) computer-aided design, engineering and manufacturing; (c) AM and additive biomanufacturing printer hardware; and (d) system integration. Finally, the outlook for additive biomanufacturing is discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. A Methodology to Reduce the Computational Effort in the Evaluation of the Lightning Performance of Distribution Networks

    Directory of Open Access Journals (Sweden)

    Ilaria Bendato

    2016-11-01

    The estimation of the lightning performance of a power distribution network is of great importance for the design of its protection system against lightning. An accurate evaluation of the number of lightning events that can create dangerous overvoltages requires a huge computational effort, as it implies the adoption of a Monte Carlo procedure. Such a procedure consists of generating many different random lightning events and calculating the corresponding overvoltages. The paper proposes a methodology to deal with the problem in two computationally efficient ways: (i) finding out the minimum number of Monte Carlo runs that lead to reliable results; and (ii) setting up a procedure that bypasses the lightning field-to-line coupling problem for each Monte Carlo run. The proposed approach is shown to provide results consistent with existing approaches while exhibiting superior Central Processing Unit (CPU) time performance.
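    The first of the two speed-ups, stopping a Monte Carlo study once the estimate is statistically reliable, can be illustrated with a generic sequential stopping rule. The tolerance, confidence multiplier and toy overvoltage model below are illustrative assumptions, not the paper's values:

```python
import random
import statistics

# Generic sequential Monte Carlo stopping rule: keep adding random trials
# until the confidence-interval half-width on the estimated mean falls
# below a relative tolerance. All parameters here are assumptions.

def run_until_reliable(simulate_trial, rel_tol=0.05, z=1.96,
                       min_runs=100, max_runs=1_000_000):
    samples = []
    while len(samples) < max_runs:
        samples.append(simulate_trial())
        n = len(samples)
        if n >= min_runs:
            mean = statistics.fmean(samples)
            half_width = z * statistics.stdev(samples) / n ** 0.5
            if mean > 0 and half_width / mean < rel_tol:
                break
    return statistics.fmean(samples), len(samples)

# Toy stand-in for "this random lightning event produces a dangerous overvoltage":
trial = lambda: 1.0 if random.random() < 0.08 else 0.0
estimate, runs = run_until_reliable(trial)
print(f"estimated fraction {estimate:.3f} after {runs} runs")
```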

  6. ICRP new recommendations. Committee 2's efforts

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    The International Commission on Radiological Protection (ICRP) may release new primary radiation protection recommendations in 2007. Committee 2 has underway reviews of the dosimetric and biokinetic models and associated data used in calculating dose coefficients for intakes of radionuclides and exposures to external radiation fields. This paper outlines the work plans of Committee 2 during the current term, 2005-2009, in anticipation of the new primary recommendations. The two task groups of Committee 2 responsible for the computations of dose coefficients, INDOS and DOCAL, are reviewing the models and data used in the computations. INDOS is reviewing the lung model and the biokinetic models that describe the behavior of the radionuclides in the body. DOCAL is reviewing its computational formulations with the objective of harmonizing the formulation with those of nuclear medicine, and developing new computational phantoms representing the adult male and female reference individuals of ICRP Publication 89. In addition, DOCAL will issue a publication on nuclear decay data to replace ICRP Publication 38. While the current efforts are focused on updating the dose coefficients for occupational intakes of radionuclides, plans are being formulated to address dose coefficients for external radiation fields, which include consideration of high-energy fields associated with accelerators and space travel, and the updating of dose coefficients for members of the public. (author)

  7. Department of Energy Efforts to Promote Universal Adherence to the IAEA Additional Protocol

    International Nuclear Information System (INIS)

    Killinger, Mark H.; Hansen, Linda H.; Kovacic, Don N.; VanSickle, Matthew; Apt, Kenneth E.

    2009-01-01

    Entry-into-force of the U.S. Additional Protocol (AP) in January 2009 continues to demonstrate the ongoing commitment by the United States to promote universal adherence to the AP. The AP is a critical tool for improving the International Atomic Energy Agency's (IAEA) capabilities to detect undeclared activities that indicate a clandestine nuclear weapons program. This is because States Parties are required to provide information about, and access to, nuclear fuel cycle activities beyond their traditional safeguards reporting requirements. As part of the U.S. AP Implementation Act and Senate Resolution of Ratification, the Administration is required to report annually to Congress on measures taken to achieve the adoption of the AP in non-nuclear weapon states, as well as assistance to the IAEA to promote the effective implementation of APs in those states. A key U.S. effort in this area is being managed by the International Nuclear Safeguards and Engagement Program (INSEP) of the U.S. Department of Energy (DOE). Through new and existing bilateral cooperation agreements, INSEP has initiated technical assistance projects for AP implementation with selected non-weapon states. States with which INSEP is currently cooperating include Vietnam and Thailand, with Indonesia, Algeria, Morocco, and other countries as possible future collaborators in the area of AP implementation. The INSEP collaborative model begins with a joint assessment with our partners to identify specific needs they may have regarding entering the AP into force and any impediments to successful implementation. An action plan is then developed detailing and prioritizing the necessary joint activities. Such assistance may include: advice on developing legal frameworks and regulatory documents; workshops to promote understanding of AP requirements; training to determine possible declarable activities; assistance in developing a system to collect and submit declarations; performing industry outreach to

  8. An opportunity cost model of subjective effort and task performance

    Science.gov (United States)

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775
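    A toy formalization of the paper's central claim (felt effort tracks the value of the next-best use of executive resources) may help; the values and decision rule below are illustrative assumptions, not the authors' model:

```python
# Toy version of the opportunity-cost account: the felt effort of staying
# on the current task is modeled as the value of the best forgone
# alternative use of the same executive resources. Values are invented.

def felt_effort(alternative_values: list[float]) -> float:
    """Subjective effort ~ opportunity cost = value of the next-best option."""
    return max(alternative_values, default=0.0)

def keep_working(current_task_value: float, alternative_values: list[float]) -> bool:
    """Persist only while the current task's benefit exceeds its opportunity cost."""
    return current_task_value >= felt_effort(alternative_values)

print(keep_working(5.0, [2.0, 3.0]))  # True: little felt effort, stay on task
print(keep_working(5.0, [6.0, 3.0]))  # False: high felt effort, disengage
```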

  9. Competition for marine space: modelling the Baltic Sea fisheries and effort displacement under spatial restrictions

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Eigaard, Ole Ritzau

    2015-01-01

    …(DISPLACE model) to combine stochastic variations in spatial fishing activities with harvested resource dynamics in scenario projections. The assessment computes economic and stock status indicators by modelling the activity of Danish, Swedish, and German vessels (≥12 m) in the international western Baltic Sea commercial fishery, together with the underlying size-based distribution dynamics of the main fishery resources of sprat, herring, and cod. The outcomes of alternative scenarios for spatial effort displacement are exemplified by evaluating the fishers' abilities to adapt to spatial plans under various constraints. Interlinked spatial, technical, and biological dynamics of vessels and stocks in the scenarios result in stable profits, which compensate for the additional costs from effort displacement and release pressure on the fish stocks. The effort is further redirected away from sensitive...

  10. Training auscultatory skills: computer simulated heart sounds or additional bedside training? A randomized trial on third-year medical students

    Science.gov (United States)

    2010-01-01

    Background: The present study compares the value of additional use of computer-simulated heart sounds, relative to conventional bedside auscultation training, on the cardiac auscultation skills of 3rd-year medical students at Oslo University Medical School. Methods: In addition to their usual curriculum courses, groups of seven students each were randomized to receive four hours of additional auscultation training, either employing a computer simulator system or adding on more conventional bedside training. Cardiac auscultation skills were afterwards tested using live patients. Each student gave a written description of the auscultation findings in four selected patients, and was awarded 0-10 points for each patient. Differences between the two study groups were evaluated using Student's t-test. Results: At the auscultation test no significant difference in mean score was found between the students who had used additional computer-based sound simulation and those who had received additional bedside training. Conclusions: Students at an early stage of their cardiology training demonstrated equal performance of cardiac auscultation whether they had received an additional short auscultation course based on computer-simulated training, or had had additional bedside training. PMID:20082701
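    The group comparison described is an ordinary two-sample Student's t-test on per-student scores. A minimal sketch with made-up scores (the study's own data are not reproduced here):

```python
from scipy import stats

# Two-sample Student's t-test comparing auscultation test scores (0-40
# total across four patients) between the simulator-trained and the
# bedside-trained groups. The scores below are made up for illustration.

simulator_group = [24, 28, 19, 31, 22, 27, 25]
bedside_group   = [23, 26, 21, 30, 24, 25, 27]

t_stat, p_value = stats.ttest_ind(simulator_group, bedside_group)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value is consistent with the study's finding of no
# significant difference between the two training supplements.
```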

  11. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-schedule-lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  12. Benefits of Subliminal Feedback Loops in Human-Computer Interaction

    OpenAIRE

    Walter Ritter

    2011-01-01

    A lot of effort has been directed toward enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing, before we outline a new approach to add analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evo...

  13. Minimal-effort planning of active alignment processes for beam-shaping optics

    Science.gov (United States)

    Haag, Sebastian; Schranner, Matthias; Müller, Tobias; Zontar, Daniel; Schlette, Christian; Losch, Daniel; Brecher, Christian; Roßmann, Jürgen

    2015-03-01

    In science and industry, the alignment of beam-shaping optics is usually a manual procedure. Many industrial applications utilizing beam-shaping optical systems require more scalable production solutions, and therefore effort has been invested in research regarding the automation of optics assembly. In previous works, the authors and other researchers have proven the feasibility of automated alignment of beam-shaping optics such as collimation lenses or homogenization optics. Nevertheless, the planning effort, as well as the additional knowledge from the fields of automation and control required for such alignment processes, is immense. This paper presents a novel approach to planning active alignment processes for beam-shaping optics, with the focus on minimizing the planning effort for active alignment. The approach utilizes optical simulation and the genetic programming paradigm from computer science to automatically extract, from a simulated data set, features with a high correlation coefficient to the individual degrees of freedom of alignment. The strategy is capable of finding active alignment strategies that can be executed by an automated assembly system. The paper presents a tool making the algorithm available to end-users, and it discusses the results of planning the active alignment of the well-known assembly of a fast-axis collimator. The paper concludes with an outlook on the transferability to other use cases, such as application-specific intensity distributions, which will benefit from reduced planning effort.
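    As a rough illustration of searching for measurement-derived features that correlate with an alignment degree of freedom, the sketch below scores a handful of candidate features against a simulated decenter. The feature set, data model and plain enumeration (rather than full genetic programming) are simplifying assumptions, not the authors' implementation:

```python
import random
import statistics

# Simplified stand-in for the paper's idea: among candidate features
# computed from simulated beam measurements, keep the one that best
# correlates with a misalignment degree of freedom (decenter dx).

random.seed(1)

def simulate(dx):
    """Toy beam-profile measurements for a given decenter dx."""
    width = 1.0 + 0.1 * dx * dx + random.gauss(0, 0.01)
    skew = 0.8 * dx + random.gauss(0, 0.05)
    return {"width": width, "skew": skew, "peak": 1.0 / width}

data = [(dx, simulate(dx)) for dx in (random.uniform(-1, 1) for _ in range(200))]

# Candidate features: small arithmetic combinations of raw measurements.
features = {
    "width": lambda m: m["width"],
    "skew": lambda m: m["skew"],
    "peak": lambda m: m["peak"],
    "skew*peak": lambda m: m["skew"] * m["peak"],
    "width-peak": lambda m: m["width"] - m["peak"],
}

def correlation(name):
    xs = [dx for dx, _ in data]
    ys = [features[name](m) for _, m in data]
    return statistics.correlation(xs, ys)

best = max(features, key=lambda n: abs(correlation(n)))
print(best, round(correlation(best), 3))
# A skew-based feature wins: it varies linearly with dx, so it is a
# good observable for actively aligning that degree of freedom.
```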

  14. Design and fabrication of a sleep apnea device using computer-aided design/additive manufacture technologies.

    Science.gov (United States)

    Al Mortadi, Noor; Eggbeer, Dominic; Lewis, Jeffrey; Williams, Robert J

    2013-04-01

    The aim of this study was to analyze the latest innovations in additive manufacture techniques and uniquely apply them to dentistry, to build a sleep apnea device requiring rotating hinges. Laser scanning was used to capture the three-dimensional topography of an upper and lower dental cast. The data sets were imported into an appropriate computer-aided design software environment, which was used to design a sleep apnea device. This design was then exported as a stereolithography file and transferred for three-dimensional printing by an additive manufacture machine. The results not only revealed that the novel computer-based technique presented provides new design opportunities but also highlighted limitations that must be addressed before the techniques can become clinically viable.

  15. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  16. Pitfalls in Designing Zero-Effort Deauthentication: Opportunistic Human Observation Attacks

    OpenAIRE

    Huhta, O.; Shrestha, P.; Udar, S.; Juuti, M.; Saxena, N.; Asokan, N.

    2015-01-01

    Deauthentication is an important component of any authentication system. The widespread use of computing devices in daily life has underscored the need for zero-effort deauthentication schemes. However, the quest for eliminating user effort may lead to hidden security flaws in the authentication schemes. As a case in point, we investigate a prominent zero-effort deauthentication scheme, called ZEBRA, which provides an interesting and a useful solution to a difficult problem ...

  17. Additive manufacturing technology in reconstructive surgery.

    Science.gov (United States)

    Fuller, Scott C; Moore, Michael G

    2016-10-01

    Technological advances have been part and parcel of modern reconstructive surgery, in that practitioners of this discipline are continually looking for innovative ways to perfect their craft and improve patient outcomes. We are currently in a technological climate wherein advances in computers, imaging, and science have coalesced with resulting innovative breakthroughs that are not merely limited to improved outcomes and enhanced patient care, but may provide novel approaches to training the next generation of reconstructive surgeons. New developments in software and modeling platforms, imaging modalities, tissue engineering, additive manufacturing, and customization of implants are poised to revolutionize the field of reconstructive surgery. The interface between technological advances and reconstructive surgery continues to expand. Additive manufacturing techniques continue to evolve in an effort to improve patient outcomes, decrease operative time, and serve as instructional tools for the training of reconstructive surgeons.

  18. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    Science.gov (United States)

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  19. Effort Estimation in BPMS Migration

    Directory of Open Access Journals (Sweden)

    Christopher Drews

    2018-04-01

    Usually Business Process Management Systems (BPMS) are highly integrated in the IT of organizations and are at the core of their business. Thus, migrating from one BPMS solution to another is not a common task. However, there are forces that are pushing organizations to perform this step, e.g. maintenance costs of legacy BPMS or the need for additional functionality. Before the actual migration, the risk and the effort must be evaluated. This work provides a framework for effort estimation regarding the technical aspects of BPMS migration. The framework provides questions for BPMS comparison and an effort evaluation schema. The applicability of the framework is evaluated based on a simplified BPMS migration scenario.
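    An effort evaluation schema of this kind is often realized as a weighted scoring of migration criteria. A sketch of how such a schema might be computed; the criteria, weights and difficulty ratings are invented for illustration, the paper defines its own comparison questions:

```python
# Sketch of a weighted-score effort evaluation schema for a BPMS
# migration. Criteria, weights and ratings are invented for
# illustration and are not taken from the paper.

criteria = {
    # criterion: (weight, difficulty rating 0 = trivial .. 5 = rewrite)
    "process model portability (BPMN export/import)": (0.30, 2),
    "API and integration rework":                     (0.25, 4),
    "user/role model mapping":                        (0.15, 1),
    "historic process data migration":                (0.20, 3),
    "custom forms and UI":                            (0.10, 5),
}

effort_score = sum(weight * rating for weight, rating in criteria.values())
print(f"weighted effort score: {effort_score:.2f} / 5.00")
# 0.30*2 + 0.25*4 + 0.15*1 + 0.20*3 + 0.10*5 = 2.85 -> mid-range effort
```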

  20. Effort Estimation in BPMS Migration

    OpenAIRE

    Drews, Christopher; Lantow, Birger

    2018-01-01

    Usually Business Process Management Systems (BPMS) are highly integrated in the IT of organizations and are at the core of their business. Thus, migrating from one BPMS solution to another is not a common task. However, there are forces that are pushing organizations to perform this step, e.g. maintenance costs of legacy BPMS or the need for additional functionality. Before the actual migration, the risk and the effort must be evaluated. This work provides a framework for effort estimation re...

  1. UMRSFFS Additional Mapping Effort

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — Recent developments in digital terrain and geospatial database management technology make it possible to protect this investment for existing and future projects to...

  2. Building a Science Software Institute: Synthesizing the Lessons Learned from the ISEES and WSSI Software Institute Conceptualization Efforts

    Science.gov (United States)

    Idaszak, R.; Lenhardt, W. C.; Jones, M. B.; Ahalt, S.; Schildhauer, M.; Hampton, S. E.

    2014-12-01

    The NSF, in an effort to support the creation of sustainable science software, funded 16 science software institute conceptualization efforts. The goal of these conceptualization efforts is to explore approaches to creating the institutional, sociological, and physical infrastructures to support sustainable science software. This paper will present the lessons learned from two of these conceptualization efforts, the Institute for Sustainable Earth and Environmental Software (ISEES - http://isees.nceas.ucsb.edu) and the Water Science Software Institute (WSSI - http://waters2i2.org). ISEES is a multi-partner effort led by the National Center for Ecological Analysis and Synthesis (NCEAS). WSSI, also a multi-partner effort, is led by the Renaissance Computing Institute (RENCI). The two conceptualization efforts have been collaborating due to the complementarity of their approaches and given the potential synergies of their science focus. ISEES and WSSI have engaged in a number of activities to address the challenges of science software, such as workshops, hackathons, and coding efforts. More recently, the two institutes have also collaborated on joint activities including training, proposals, and papers. In addition to presenting lessons learned, this paper will synthesize across the two efforts to project a unified vision for a science software institute.

  3. Computational methods in metabolic engineering for strain design.

    Science.gov (United States)

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve runtime performance have also been developed, which allow more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms. Copyright © 2015 Elsevier Ltd. All rights reserved.
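    Many of the reviewed methods build on flux balance analysis: maximize a production flux subject to steady-state stoichiometry, then re-solve with a candidate reaction (gene) knocked out. A minimal sketch on a toy network; the network and bounds are invented, and the reviewed tools operate on genome-scale models:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis: maximize product secretion subject to the
# steady-state mass balance S @ v = 0, then simulate a gene deletion by
# forcing one reaction's flux to zero. The 4-reaction network and the
# bounds are invented for illustration.
#
# Reactions: v0: substrate uptake (-> A), v1: A -> B,
#            v2: A -> byproduct (drain), v3: B -> product (drain)
S = np.array([
    [1, -1, -1,  0],   # metabolite A balance
    [0,  1,  0, -1],   # metabolite B balance
])
c = [0, 0, 0, -1]                      # linprog minimizes, so -v3
bounds = [(0, 10), (0, None), (0, None), (0, None)]

def max_product(knockout=None):
    b = list(bounds)
    if knockout is not None:
        b[knockout] = (0, 0)           # deletion: no flux through the reaction
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=b, method="highs")
    return res.x[3]

print("wild type :", max_product())            # 10.0 (all carbon to product)
print("delete v1 :", max_product(knockout=1))  # 0.0  (pathway to B removed)
```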

  4. Stretch-sensitive paresis and effort perception in hemiparesis.

    Science.gov (United States)

    Vinti, Maria; Bayle, Nicolas; Hutin, Emilie; Burke, David; Gracies, Jean-Michel

    2015-08-01

    In spastic paresis, stretch applied to the antagonist increases its inappropriate recruitment during agonist command (spastic co-contraction). It is unknown whether antagonist stretch: (1) also affects agonist recruitment; (2) alters effort perception. We quantified voluntary activation of ankle dorsiflexors, effort perception, and plantar flexor co-contraction during graded dorsiflexion efforts at two gastrocnemius lengths. Eighteen healthy (age 41 ± 13) and 18 hemiparetic (age 54 ± 12) subjects performed light, medium and maximal isometric dorsiflexion efforts with the knee flexed or extended. We determined dorsiflexor torque, Root Mean Square EMG and Agonist Recruitment/Co-contraction Indices (ARI/CCI) from the 500 ms peak voluntary agonist recruitment in a 5-s maximal isometric effort in tibialis anterior, soleus and medial gastrocnemius. Subjects retrospectively reported effort perception on a 10-point visual analog scale. During gastrocnemius stretch in hemiparetic subjects, we observed: (1) a 25 ± 7% reduction of tibialis anterior voluntary activation (maximum reduction 98%; knee extended vs knee flexed; p = 0.007, ANOVA); (2) an increase in dorsiflexion effort perception (p = 0.03, ANCOVA). Such changes did not occur in healthy subjects. Effort perception depended on tibialis anterior recruitment only (βARI(TA) = 0.61, p < …). In hemiparesis, the voluntary ability to recruit agonist motoneurones is impaired, sometimes abolished, by antagonist stretch, a phenomenon defined here as stretch-sensitive paresis. In addition, spastic co-contraction increases effort perception, an additional incentive to evaluate and treat this phenomenon.

  5. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    Petric, Marko

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them from within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were developed...

  6. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Other major areas are the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids, and the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers.

  7. Trust and Reciprocity: Are Effort and Money Equivalent?

    Science.gov (United States)

    Vilares, Iris; Dam, Gregory; Kording, Konrad

    2011-01-01

    Trust and reciprocity facilitate cooperation and are relevant to virtually all human interactions. They are typically studied using trust games: one subject gives (entrusts) money to another subject, who may return some of the proceeds (reciprocate). Currently, however, it is unclear whether trust and reciprocity in monetary transactions are similar in other settings, such as physical effort. Trust and reciprocity of physical effort are important as many everyday decisions imply an exchange of physical effort, and such exchange is central to labor relations. Here we studied a trust game based on physical effort and compared the results with those of a computationally equivalent monetary trust game. We found no significant difference between effort and money conditions in either the amount trusted or the quantity reciprocated. Moreover, there was a high positive correlation in subjects' behavior across conditions. This suggests that trust and reciprocity may be character traits: subjects who are trustful/trustworthy in monetary settings behave similarly during exchanges of physical effort. Our results validate the use of trust games to study exchanges in physical effort and to characterize inter-subject differences in trust and reciprocity, and also suggest a new behavioral paradigm to study these differences. PMID:21364931

  8. Computational atomic and nuclear physics

    International Nuclear Information System (INIS)

    Bottcher, C.; Strayer, M.R.; McGrory, J.B.

    1990-01-01

    The evolution of parallel processor supercomputers in recent years provides opportunities to investigate in detail many complex problems, in many branches of physics, which were considered intractable only a few years ago. But to take advantage of these new machines, one must understand how the computers organize their work far better than was necessary with previous single-processor machines; equally important, the scientist must also have a good understanding of the structure of the physics problem under study. In brief, a new field of computational physics is evolving, which will be led by investigators who are highly literate both computationally and physically. A Center for Computationally Intensive Problems has been established with the collaboration of the University of Tennessee Science Alliance, Vanderbilt University, and the Oak Ridge National Laboratory. The objective of this Center is to carry out forefront research in computationally intensive areas of atomic, nuclear, particle, and condensed matter physics. An important part of this effort is the appropriate training of students. An early effort of this Center was to conduct a Summer School of Computational Atomic and Nuclear Physics. A distinguished faculty of scientists in atomic, nuclear, and particle physics gave lectures on the status of present understanding of a number of topics at the leading edge in these fields, and emphasized those areas where computational physics was in a position to make a major contribution. In addition, there were lectures on numerical techniques which are particularly appropriate for implementation on parallel processor computers and which are of wide applicability in many branches of science.

  9. Toward a Rational and Mechanistic Account of Mental Effort.

    Science.gov (United States)

    Shenhav, Amitai; Musslick, Sebastian; Lieder, Falk; Kool, Wouter; Griffiths, Thomas L; Cohen, Jonathan D; Botvinick, Matthew M

    2017-07-25

    In spite of its familiar phenomenology, the mechanistic basis for mental effort remains poorly understood. Although most researchers agree that mental effort is aversive and stems from limitations in our capacity to exercise cognitive control, it is unclear what gives rise to those limitations and why they result in an experience of control as costly. The presence of these control costs also raises further questions regarding how best to allocate mental effort to minimize those costs and maximize the attendant benefits. This review explores recent advances in computational modeling and empirical research aimed at addressing these questions at the level of psychological process and neural mechanism, examining both the limitations to mental effort exertion and how we manage those limited cognitive resources. We conclude by identifying remaining challenges for theoretical accounts of mental effort as well as possible applications of the available findings to understanding the causes of and potential solutions for apparent failures to exert the mental effort required of us.

  10. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory hosts the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two at Brookhaven, one for the Riken/BNL Research Center and the other supported by DOE for the US lattice gauge community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to...

  11. Overview of NASA/OAST efforts related to manufacturing technology

    Science.gov (United States)

    Saunders, N. T.

    1976-01-01

    An overview of some of NASA's current efforts related to manufacturing technology and some possible directions for the future are presented. The topics discussed are: computer-aided design, composite structures, and turbine engine components.

  12. Environmental Determinants of Lexical Processing Effort

    OpenAIRE

    McDonald, Scott

    2000-01-01

    Institute for Adaptive and Neural Computation. A central concern of psycholinguistic research is explaining the relative ease or difficulty involved in processing words. In this thesis, we explore the connection between lexical processing effort and measurable properties of the linguistic environment. Distributional information (information about a word's contexts of use) is easily extracted from large language corpora in the form of co-occurrence statistics. We claim that su...

  13. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    Science.gov (United States)

    Goodwin, Bruce

    2015-03-01

    This presentation discusses the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the iterative engineering design and prototype cycle, thereby dramatically reducing the cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the 'cloud', these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  14. X-ray computed tomography for additive manufacturing: a review

    International Nuclear Information System (INIS)

    Thompson, A; Maskery, I; Leach, R K

    2016-01-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM. (topical review)

  15. X-ray computed tomography for additive manufacturing: a review

    Science.gov (United States)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  16. Computer programs in BASIC language for graphite furnace atomic absorption using the method of additions. Part 1. Operating instructions

    International Nuclear Information System (INIS)

    Boyle, W.G. Jr.; Ryan, D.P.

    1979-01-01

    These instructions describe how to use BASIC language programs to process data from atomic absorption spectrophotometers using the graphite furnace and the method of additions calibration technique. The instructions cover loading the programs, responding to computer prompts, choosing among various options for processing the data, performing operations with an automatic sampler, and producing reports. How the programs interact with each other is also explained. Examples of computer/operator dialogue are presented for typical cases. In addition, a concise set of operating instructions is included as an appendix
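
    The core standard-additions calculation these programs automate can be sketched in a few lines; the absorbance readings below are invented, and the original BASIC programs add option handling, automatic-sampler support, and report generation on top of this.

```python
# Method of additions: fit absorbance vs. added standard concentration,
# then extrapolate to the x-intercept to recover the analyte concentration.
import numpy as np

added = np.array([0.0, 5.0, 10.0, 20.0])             # added standard, ng/mL
absorbance = np.array([0.120, 0.186, 0.250, 0.380])  # invented readings

slope, intercept = np.polyfit(added, absorbance, 1)  # linear calibration fit
concentration = intercept / slope                    # |x-intercept| of the line
print("estimated sample concentration: %.2f ng/mL" % concentration)
```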

  17. PAH growth initiated by propargyl addition: Mechanism development and computational kinetics

    KAUST Repository

    Raj, Abhijeet Dhayal

    2014-04-24

    Polycyclic aromatic hydrocarbon (PAH) growth is known to be the principal pathway to soot formation during fuel combustion, so a physical understanding of the PAH growth mechanism is needed to effectively assess, predict, and control soot formation in flames. Although the hydrogen-abstraction/C2H2-addition (HACA) mechanism is believed to be the main contributor to PAH growth, it has been shown to under-predict some of the experimental data on PAH and soot concentrations in flames. This article presents a submechanism of PAH growth that is initiated by propargyl (C3H3) addition onto naphthalene (A2) and the naphthyl radical. C3H3 has been chosen since it is known to be a precursor of benzene in combustion and has appreciable concentrations in flames. This mechanism has been developed up to the formation of pyrene (A4), and the temperature-dependent kinetics of each elementary reaction has been determined using density functional theory (DFT) computations at the B3LYP/6-311++G(d,p) level of theory and transition state theory (TST). H-abstraction, H-addition, H-migration, β-scission, and intramolecular addition reactions have been taken into account. The energy barriers of the two main pathways (H-abstraction and H-addition) were found to be relatively small if not negative, whereas the energy barriers of the other pathways were in the range of 6-89 kcal·mol⁻¹. The rates reported in this study may be extrapolated to larger PAH molecules that have a zigzag site similar to that in naphthalene, and the mechanism presented herein may be used as a complement to the HACA mechanism to improve prediction of PAH and soot formation. © 2014 American Chemical Society.
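
    For orientation, the TST step can be illustrated with the Eyring expression that converts a computed free-energy barrier into a temperature-dependent rate constant; the 30 kcal/mol barrier below is an arbitrary placeholder, not a value from this study.

```python
# Eyring/TST rate constant: k(T) = (kB*T/h) * exp(-dG/(R*T)).
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 1.987204e-3      # gas constant, kcal/(mol*K)

def eyring_rate(dg_kcal_per_mol, temp_k):
    """Unimolecular TST rate constant in 1/s for a free-energy barrier."""
    return (KB * temp_k / H) * math.exp(-dg_kcal_per_mol / (R * temp_k))

for temp in (1000.0, 1500.0, 2000.0):      # flame-relevant temperatures
    print("T = %4.0f K   k = %.3e 1/s" % (temp, eyring_rate(30.0, temp)))
```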

  18. Addition computed tomography with stable xenon; Special reference to ischemic cerebrovascular diseases

    Energy Technology Data Exchange (ETDEWEB)

    Touho, Hajime; Karasawa, Jun; Shishido, Hisashi; Yamada, Keisuke; Shibamoto, Keiji [Osaka Neurological Inst., Toyonaka (Japan)

    1990-09-01

    Stable xenon (Xe^s) is used as a contrast agent because it freely diffuses into cerebral tissue through the blood-brain barrier. In this study, 2 axial levels for Xe^s enhancement analysis were selected from a baseline series of computed tomographic (CT) scans, and 6 serial CT scans were obtained every 20 seconds at each scan level during the 240-second inhalation period of 30% Xe^s in 10 volunteer controls and in 52 patients with ischemic cerebrovascular diseases (ICVD). The serial CT scans were added and averaged in each pixel to make a new CT picture (the addition CT scan). The CT scan before Xe^s inhalation, the scan at the end of inhalation, and the addition CT scan were compared to see whether gray matter and ischemic areas could be differentiated from white matter. The addition CT scans differentiated the three structures very well in both the acute and chronic stages of ICVD. This technique is thought to be a very simple and useful method for detecting small infarcted areas and low-perfusion areas that cannot be visualized on precontrast CT scans. (author).
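
    The addition-CT computation itself is a per-pixel average of the serial frames, as in the following sketch on synthetic arrays (image size and noise level are invented); averaging N frames reduces uncorrelated noise by roughly a factor of sqrt(N), which is what makes subtle low-density areas visible.

```python
# "Addition CT": sum and average serial CT frames pixel by pixel to
# improve the contrast-to-noise ratio. Synthetic data stand in for scans.
import numpy as np

rng = np.random.default_rng(0)
base = np.full((64, 64), 40.0)                    # nominal attenuation (HU)
frames = [base + rng.normal(0.0, 5.0, base.shape) for _ in range(6)]

addition_ct = np.mean(frames, axis=0)             # per-pixel average
print("single-frame noise: %.2f HU" % frames[0].std())
print("addition-CT noise:  %.2f HU" % addition_ct.std())
```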

  19. POEM: Identifying joint additive effects on regulatory circuits

    Directory of Open Access Journals (Sweden)

    Maya Botzman

    2016-04-01

    Motivation: Expression Quantitative Trait Locus (eQTL) mapping tackles the problem of identifying variations in DNA sequence that have an effect on the transcriptional regulatory network. Major computational efforts are aimed at characterizing the joint effects of several eQTLs acting in concert to govern the expression of the same genes. Yet, progress towards a comprehensive prediction of such joint effects is limited. For example, existing eQTL methods commonly discover interacting loci affecting the expression levels of a module of co-regulated genes. Such 'modularization' approaches, however, are focused on epistatic relations and thus have limited utility for the case of additive (non-epistatic) effects. Results: Here we present POEM (Pairwise effect On Expression Modules), a methodology for identifying pairwise eQTL effects on gene modules. POEM is specifically designed to achieve high performance in the case of additive joint effects. We applied POEM to transcription profiles measured in bone marrow-derived dendritic cells across a population of genotyped mice. Our study reveals widespread additive, trans-acting pairwise effects on gene modules, characterizes their organizational principles, and highlights high-order interconnections between modules within the immune signaling network. These analyses elucidate the central role of additive pairwise effects in regulatory circuits, and provide computational tools for future investigations into the interplay between eQTLs. Availability: The software described in this article is available at csgi.tau.ac.il/POEM/.

  20. Literality and Cognitive Effort

    DEFF Research Database (Denmark)

    Lacruz, Isabel; Carl, Michael; Yamada, Masaru

    2018-01-01

    We introduce a notion of pause-word ratio computed using ranges of pause lengths rather than lower cutoffs for pause lengths. Standard pause-word ratios are indicators of cognitive effort during different translation modalities. The pause-range version allows for the study of how different types of pauses relate to different kinds of remoteness. We use data from the CRITT TPR database, comparing translation and post-editing from English to Japanese and from English to Spanish, and study the interaction of the pause-word ratio for short pauses ranging between 300 and 500 ms with syntactic remoteness, measured by the CrossS feature, semantic remoteness, measured by HTra, and syntactic and semantic remoteness, measured by Literality.
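
    A toy computation of the range-based measure (the pause data and word count are invented, not CRITT TPR values):

```python
# Range-based pause-word ratio: count only pauses whose length falls in a
# window (here 300-500 ms) and divide by the number of words produced,
# instead of counting every pause above a single cutoff.
def pause_word_ratio(pauses_ms, n_words, lo=300, hi=500):
    in_range = sum(1 for p in pauses_ms if lo <= p < hi)
    return in_range / n_words

session_pauses = [120, 340, 480, 650, 310, 2200, 450]   # invented, in ms
print("PWR(300-500 ms): %.3f" % pause_word_ratio(session_pauses, n_words=50))
```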

  1. Computer aids for plant operators

    International Nuclear Information System (INIS)

    Joly, J.P.

    1992-01-01

    For some time, particularly since the TMI accident, nuclear power plant operators have been aware of the difficulties involved in diagnosing accidents and returning plants to their stable, safe operating mode. There are various possible solutions to these problems: improve control organization during accident situations, rewrite control procedures, integrate safety engineers in shifts, improve control rooms, and implement additional computer aids. The purpose of this presentation is to describe the efforts undertaken by EDF over the last few years in this field

  2. A neuronal model of a global workspace in effortful cognitive tasks.

    Science.gov (United States)

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  3. Visual cues and listening effort: individual variability.

    Science.gov (United States)

    Picou, Erin M; Ricketts, Todd A; Hornsby, Benjamin W Y

    2011-10-01

    To investigate the effect of visual cues on listening effort as well as whether predictive variables such as working memory capacity (WMC) and lipreading ability affect the magnitude of listening effort. Twenty participants with normal hearing were tested using a paired-associates recall task in 2 conditions (quiet and noise) and 2 presentation modalities (audio only [AO] and auditory-visual [AV]). Signal-to-noise ratios were adjusted to provide matched speech recognition across audio-only and AV noise conditions. Also measured were subjective perceptions of listening effort and 2 predictive variables: (a) lipreading ability and (b) WMC. Objective and subjective results indicated that listening effort increased in the presence of noise, but on average the addition of visual cues did not significantly affect the magnitude of listening effort. Although there was substantial individual variability, on average participants who were better lipreaders or had larger WMCs demonstrated reduced listening effort in noise in AV conditions. Overall, the results support the hypothesis that integrating auditory and visual cues requires cognitive resources in some participants. The data indicate that low lipreading ability or low WMC is associated with relatively effortful integration of auditory and visual information in noise.

  4. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  6. Progress in computational toxicology.

    Science.gov (United States)

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across the pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications have been highlighted that use machine learning methods. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that have been increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology in a decade, both in model development and in the availability of larger-scale or 'big data' models. Future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
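
    A minimal sketch of this kind of Bayesian-versus-SVM comparison, using scikit-learn on a synthetic dataset; the descriptors and endpoint are stand-ins, not a published toxicology set.

```python
# Compare a Bayesian classifier (naive Bayes) with an SVM by 5-fold
# cross-validation on a synthetic binary "toxicity" endpoint.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

for name, model in [("naive Bayes", GaussianNB()), ("SVM (RBF)", SVC())]:
    scores = cross_val_score(model, X, y, cv=5)
    print("%-12s mean CV accuracy = %.3f" % (name, scores.mean()))
```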

  7. Comparative assessment of world research efforts on magnetic confinement fusion

    International Nuclear Information System (INIS)

    McKenney, B.L.; McGrain, M.; Rutherford, P.H.

    1990-02-01

    This report presents a comparative assessment of the world's four major research efforts on magnetic confinement fusion, comparing capabilities in the Soviet Union, the European Community (Western Europe), Japan, and the United States. A comparative evaluation is provided in several areas, including tokamak confinement, alternate confinement approaches, plasma technology and engineering, and fusion computations. The panel members are actively involved in fusion-related research, and have extensive experience in previous assessments and reviews of the world's four major fusion programs. Although the world's four major fusion efforts are roughly comparable in overall capabilities, two conclusions of this report are inescapable. First, the Soviet fusion effort is presently the weakest of the four programs in most areas of the assessment. Second, if present trends continue, the United States, once unambiguously the world leader in fusion research, will soon lose its position of leadership to the West European and Japanese fusion programs. Indeed, before the middle 1990s, the upgraded large-tokamak facilities, JT-60U (Japan) and JET (Western Europe), are likely to explore plasma conditions and operating regimes well beyond the capabilities of the TFTR tokamak (United States). In addition, if present trends continue in the areas of fusion nuclear technology and materials and plasma technology development, the capabilities of Japan and Western Europe in these areas (both with regard to test facilities and fusion-specific industrial capabilities) will surpass those of the United States by a substantial margin before the middle 1990s.

  8. Effort-reward imbalance and one-year change in neck-shoulder and upper extremity pain among call center computer operators.

    Science.gov (United States)

    Krause, Niklas; Burgel, Barbara; Rempel, David

    2010-01-01

    The literature on psychosocial job factors and musculoskeletal pain is inconclusive, in part due to insufficient control for confounding by biomechanical factors. The aim of this study was to investigate prospectively the independent effects of effort-reward imbalance (ERI) at work on regional musculoskeletal pain of the neck and upper extremities of call center operators after controlling for (i) duration of computer use both at work and at home, (ii) ergonomic workstation design, (iii) physical activities during leisure time, and (iv) other individual worker characteristics. This was a one-year prospective study among 165 call center operators who participated in a randomized ergonomic intervention trial that has been described previously. Over an approximately four-week period, we measured ERI and 28 potential confounders via a questionnaire at baseline. Regional upper-body pain and computer use were measured by weekly surveys for up to 12 months following the implementation of ergonomic interventions. Regional pain change scores were calculated as the difference between average weekly pain scores pre- and post-intervention. A significant relationship was found between high average ERI ratios and one-year increases in right upper-extremity pain after adjustment for pre-intervention regional mean pain score, current and past physical workload, ergonomic workstation design, and anthropometric, sociodemographic, and behavioral risk factors. No significant associations were found with change in neck-shoulder or left upper-extremity pain. This study suggests that ERI predicts regional upper-extremity pain in computer operators working ≥20 hours per week. Control for physical workload and ergonomic workstation design was essential for identifying ERI as a risk factor.

  9. Deconstructing Hub Drag. Part 2. Computational Development and Anaysis

    Science.gov (United States)

    2013-09-30

    leveraged a Vertical Lift Consortium (VLC)-funded hub drag scaling research effort. To confirm this objective, correlations are performed with the ... Technology™ Demonstrator aircraft using an unstructured computational solver. These simpler faired elliptical geometries can prove to be challenging ... possible. However, additional funding was obtained from the Vertical Lift Consortium (VLC) to perform this study. This analysis is documented in

  10. A future for computational fluid dynamics at CERN

    CERN Document Server

    Battistin, M

    2005-01-01

    Computational Fluid Dynamics (CFD) is the analysis of fluid flow, heat transfer and associated phenomena in physical systems using computers. CFD has been used at CERN since 1993 by the TS-CV group to solve thermo-fluid-related problems, particularly during the development, design and construction phases of the LHC experiments. Computer models based on CFD techniques can be employed to reduce the effort required for prototype testing, saving not only time and money but offering possibilities for additional investigations and design optimisation. The development of a more efficient support team at CERN depends on two important factors: available computing power and experienced engineers. Available computing power is the limiting resource of CFD; only the recent increase in computing power has allowed important high-tech and industrial applications. The computing Grid is already (with OpenLab at CERN), and will increasingly be, the natural environment for CFD science. At CERN, CFD activities have been developed by...

  11. Computer aided analysis of additional chromosome aberrations in Philadelphia chromosome positive acute lymphoblastic leukaemia using a simplified computer readable cytogenetic notation

    Directory of Open Access Journals (Sweden)

    Mohr Brigitte

    2003-01-01

    Background: The analysis of complex cytogenetic databases of distinct leukaemia entities may help to detect rare recurring chromosome aberrations, minimal common regions of gains and losses, and also hot spots of genomic rearrangements. The patterns of the karyotype alterations may provide insights into the genetic pathways of disease progression. Results: We developed a simplified computer-readable cytogenetic notation (SCCN) by which chromosome findings are normalised at a resolution of 400 bands. Lost or gained chromosomes or chromosome segments are specified in detail, and ranges of chromosome breakpoint assignments are recorded. Software modules were written to summarise the recorded chromosome changes with regard to the respective chromosome involvement. To assess the degree of karyotype alterations, the ploidy levels and numbers of numerical and structural changes were recorded separately, and summarised in a complex karyotype aberration score (CKAS). The SCCN and CKAS were used to analyse the extent and the spectrum of additional chromosome aberrations in 94 patients with Philadelphia chromosome-positive (Ph-positive) acute lymphoblastic leukaemia (ALL) and secondary chromosome anomalies. Dosage changes of chromosomal material represented 92.1% of all additional events. Recurring regions of chromosome losses were identified. Structural rearrangements affecting (peri)centromeric chromosome regions were recorded in 24.6% of the cases. Conclusions: SCCN and CKAS provide unifying elements between karyotypes and computer-processable data formats. They proved to be useful in the investigation of additional chromosome aberrations in Ph-positive ALL, and may represent a step towards full automation of the analysis of large and complex karyotype databases.

  12. Separate valuation subsystems for delay and effort decision costs.

    Science.gov (United States)

    Prévost, Charlotte; Pessiglione, Mathias; Météreau, Elise; Cléry-Melin, Marie-Laure; Dreher, Jean-Claude

    2010-10-20

    Decision making consists of choosing among available options on the basis of a valuation of their potential costs and benefits. Most theoretical models of decision making in behavioral economics, psychology, and computer science propose that the desirability of outcomes expected from alternative options can be quantified by utility functions. These utility functions allow a decision maker to assign subjective values to each option under consideration by weighting the likely benefits and costs resulting from an action, and to select the one with the highest subjective value. Here, we used model-based neuroimaging to test whether the human brain uses separate valuation systems for rewards (erotic stimuli) associated with different types of costs, namely delay and effort. We show that humans devalue rewards associated with physical effort in a strikingly similar fashion to those associated with delays, and that a single computational model derived from economics theory can account for the behavior observed in both delay discounting and effort discounting. However, our neuroimaging data reveal that the human brain uses distinct valuation subsystems for different types of costs, representing in opposite fashion delayed rewards and future energetic expenses. The ventral striatum and the ventromedial prefrontal cortex represent the increasing subjective value of delayed rewards, whereas a distinct network, composed of the anterior cingulate cortex and the anterior insula, represents the decreasing value of the effortful option, coding the expected expense of energy. Together, these data demonstrate that the valuation processes underlying different types of costs can be fractionated at the cerebral level.
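
    For illustration, a sketch of the single hyperbolic-discounting form that such studies fit to both cost types, SV = A/(1 + k*cost); the discount rates below are arbitrary assumptions, and the paper's exact parameterization may differ.

```python
# Hyperbolic discounting: the subjective value of a reward falls off
# hyperbolically with its cost, whether the cost is a delay or an effort.
def subjective_value(amount, cost, k):
    return amount / (1.0 + k * cost)

REWARD = 10.0
for delay_days in (0, 7, 30, 90):                 # delay as the cost
    print("delay %3d d  -> SV %.2f" %
          (delay_days, subjective_value(REWARD, delay_days, k=0.05)))
for effort_units in (0, 2, 5, 10):                # physical effort as the cost
    print("effort %2d    -> SV %.2f" %
          (effort_units, subjective_value(REWARD, effort_units, k=0.30)))
```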

  13. The RETRAN-03 computer code

    International Nuclear Information System (INIS)

    Paulsen, M.P.; McFadden, J.H.; Peterson, C.E.; McClure, J.A.; Gose, G.C.; Jensen, P.J.

    1991-01-01

    The RETRAN-03 code development effort is designed to overcome the major theoretical and practical limitations associated with the RETRAN-02 computer code. The major objectives of the development program are to extend the range of analyses that can be performed with RETRAN, to make the code more dependable and faster running, and to have a more transportable code. The first two objectives are accomplished by developing new models and adding other models to the RETRAN-02 base code. The major model additions for RETRAN-03 are as follows: implicit solution methods for the steady-state and transient forms of the field equations; additional options for the velocity difference equation; a new steady-state initialization option for computing low-power steam generator initial conditions; models for nonequilibrium thermodynamic conditions; and several special-purpose models. The source code and the environmental library for RETRAN-03 are written in standard FORTRAN 77, which allows the last objective to be fulfilled. Some models in RETRAN-02 have been deleted in RETRAN-03. In this paper the changes between RETRAN-02 and RETRAN-03 are reviewed.

  14. Discounting the value of safety: effects of perceived risk and effort.

    Science.gov (United States)

    Sigurdsson, Sigurdur O; Taylor, Matthew A; Wirth, Oliver

    2013-09-01

    Although falls from heights remain the most prevalent cause of fatalities in the construction industry, the factors impacting safety-related choices associated with work at heights are not completely understood. Better tools are needed to identify and study the factors influencing safety-related choices and decision making. Using a computer-based task within a behavioral economics paradigm, college students were presented a choice between two hypothetical scenarios that differed in working height and the effort associated with retrieving and donning a safety harness. Participants were instructed to choose the scenario in which they were more likely to wear the safety harness. Based on choice patterns, switch points were identified, indicating when the perceived risk in both scenarios was equivalent. Switch points were a systematic function of working height and effort, and the quantified relation between perceived risk and effort was described well by a hyperbolic equation. Choice patterns revealed that the perceived risk of working at heights decreased as the effort to retrieve and don a safety harness increased. Results contribute to the development of a computer-based procedure for assessing risk discounting within a behavioral economics framework. Such a procedure can be used as a research tool to study factors that influence safety-related decision making, with a goal of informing more effective prevention and intervention strategies. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
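
    A hedged sketch of how such a hyperbolic relation could be fit to switch-point data; the variables and numbers below are invented stand-ins for the study's task measures.

```python
# Fit a hyperbolic form V = V0 / (1 + k * effort) to switch points that
# pair harness-donning effort with an equivalent-risk working height.
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(effort, v0, k):
    return v0 / (1.0 + k * effort)

effort = np.array([0.0, 1.0, 2.0, 4.0, 8.0])            # effort units (invented)
switch_height = np.array([20.0, 14.0, 11.0, 7.5, 4.5])  # metres (invented)

(v0, k), _ = curve_fit(hyperbolic, effort, switch_height, p0=(20.0, 0.5))
print("fitted V0 = %.1f m, k = %.2f per effort unit" % (v0, k))
```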

  15. Large scale computing in the Energy Research Programs

    International Nuclear Information System (INIS)

    1991-05-01

    The Energy Research Supercomputer Users Group (ERSUG) comprises all investigators using resources of the Department of Energy Office of Energy Research supercomputers. At the December 1989 meeting held at Florida State University (FSU), the ERSUG executive committee determined that the continuing rapid advances in computational sciences and computer technology demanded a reassessment of the role computational science should play in meeting DOE's commitments. Initial studies were to be performed for four subdivisions: (1) Basic Energy Sciences (BES) and Applied Mathematical Sciences (AMS), (2) Fusion Energy, (3) High Energy and Nuclear Physics, and (4) Health and Environmental Research. The first two subgroups produced formal subreports that provided a basis for several sections of this report. Additional information from the AMS/BES subreport is included as Appendix C in an abridged form that eliminates most duplication. Additionally, each member of the executive committee was asked to contribute area-specific assessments; these assessments are included in the next section. In the following sections, brief assessments are given for specific areas, a conceptual model is proposed in which the entire computational effort for energy research is viewed as one giant nationwide computer, and specific recommendations are made for the appropriate evolution of the system.

  16. Private Speech Moderates the Effects of Effortful Control on Emotionality

    Science.gov (United States)

    Day, Kimberly L.; Smith, Cynthia L.; Neal, Amy; Dunsmore, Julie C.

    2018-01-01

    Research Findings: In addition to being a regulatory strategy, children's private speech may enhance or interfere with their effortful control used to regulate emotion. The goal of the current study was to investigate whether children's private speech during a selective attention task moderated the relations of their effortful control to their…

  17. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aimed at computer simulation of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that users can easily import the data into their specific models. In addition to providing simulation data, the report briefly describes the technology of every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: fixed bed, fluidised bed, and entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a better-informed choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analyses of various inputs can improve the simulations performed. There were fewer answers to the survey than hoped for, which limited the database; however, the use of online sources and other public information has to some extent counterbalanced the low response rate. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing ones.

  18. Affective medicine. A review of affective computing efforts in medical informatics.

    Science.gov (United States)

    Luneski, A; Konstantinidis, E; Bamidis, P D

    2010-01-01

    Affective computing (AC) is concerned with emotional interactions performed with and through computers. It is defined as "computing that relates to, arises from, or deliberately influences emotions". AC enables investigation and understanding of the relation between human emotions and health, as well as application of assistive and useful technologies in the medical domain. The aims are: 1) to review the general state of the art in AC and its applications in medicine, and 2) to establish synergies between the research communities of AC and medical informatics. Aspects related to the human affective state as a determinant of human health are discussed, coupled with an illustration of significant AC research and related literature output. Moreover, affective communication channels are described and their range of application fields is explored through illustrative examples. The presented conferences, European research projects and research publications illustrate the recent increase of interest in the AC area by the medical community. Tele-home healthcare, AmI, ubiquitous monitoring, e-learning and virtual communities with emotionally expressive characters for elderly or impaired people are a few areas where the potential of AC has been realized and applications have emerged. A number of gaps can potentially be overcome through the synergy of AC and medical informatics. The application of AC technologies parallels the advancement of the existing state of the art and the introduction of new methods. The amount of work and projects reviewed in this paper attests to an ambitious and optimistic future for the synergetic field of affective medicine.

  19. Distributed computing for global health

    CERN Multimedia

    CERN. Geneva; Schwede, Torsten; Moore, Celia; Smith, Thomas E; Williams, Brian; Grey, François

    2005-01-01

    Distributed computing harnesses the power of thousands of computers within organisations or over the Internet. In order to tackle global health problems, several groups of researchers have begun to use this approach to exceed by far the computing power of a single lab. This event illustrates how companies, research institutes and the general public are contributing their computing power to these efforts, and what impact this may have on a range of world health issues. Grids for neglected diseases Vincent Breton, CNRS/EGEE This talk introduces the topic of distributed computing, explaining the similarities and differences between Grid computing, volunteer computing and supercomputing, and outlines the potential of Grid computing for tackling neglected diseases where there is little economic incentive for private R&D efforts. Recent results on malaria drug design using the Grid infrastructure of the EU-funded EGEE project, which is coordinated by CERN and involves 70 partners in Europe, the US and Russi...

  20. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
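
    A toy illustration of the task-queue balancing strategy evaluated in this effort, with Python's multiprocessing standing in for the PVM message passing used by TEAM; block costs and worker counts are invented.

```python
# Task-queue load balancing: workers pull grid blocks from a shared queue,
# so faster (or less loaded) workers naturally take on more blocks.
import multiprocessing as mp
import time

def worker(tasks, results, wid):
    done = 0
    while True:
        block = tasks.get()
        if block is None:            # sentinel: no more grid blocks
            break
        time.sleep(0.01 * block)     # pretend cost scales with block size
        done += 1
    results.put((wid, done))

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    for size in [1, 5, 2, 8, 3, 1, 4, 6]:    # heterogeneous grid blocks
        tasks.put(size)
    n_workers = 3
    for _ in range(n_workers):
        tasks.put(None)                      # one sentinel per worker
    procs = [mp.Process(target=worker, args=(tasks, results, w))
             for w in range(n_workers)]
    for p in procs: p.start()
    for p in procs: p.join()
    for _ in range(n_workers):
        print("worker %d processed %d blocks" % results.get())
```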

  1. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  2. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Science.gov (United States)

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems and computer-aided or computer-controlled processing. This paper...

  3. Perception of effort in Exercise Science: Definition, measurement and perspectives.

    Science.gov (United States)

    Pageaux, Benjamin

    2016-11-01

    Perception of effort, also known as perceived exertion or sense of effort, can be described as a cognitive feeling of work associated with voluntary actions. The aim of the present review is to provide an overview of what is perception of effort in Exercise Science. Due to the addition of sensations other than effort in its definition, the neurophysiology of perceived exertion remains poorly understood. As humans have the ability to dissociate effort from other sensations related to physical exercise, the need to use a narrower definition is emphasised. Consequently, a definition and some brief guidelines for its measurement are provided. Finally, an overview of the models present in the literature aiming to explain its neurophysiology, and some perspectives for future research are offered.

  4. Robotic disaster recovery efforts with ad-hoc deployable cloud computing

    Science.gov (United States)

    Straub, Jeremy; Marsh, Ronald; Mohammad, Atif F.

    2013-06-01

    Autonomous operation of search and rescue (SaR) robots is an ill-posed problem, made more complex by the dynamic disaster-recovery environment. In a typical SaR response scenario, responder robots will require different levels of processing capability during various parts of the response effort and will need to utilize multiple algorithms. Placing all of these capabilities onboard the robot precludes algorithm-specific performance optimization and results in mediocre performance. An architecture for an ad-hoc, deployable cloud environment suitable for use in a disaster response scenario is presented. Under this model, each service provider is optimized for its task and maintains a database of situation-relevant information. This service-oriented architecture (SOA 3.0) compliant framework also serves as an example of the efficient use of SOA 3.0 in an actual cloud application.

  5. Computer Modeling of Radiation Portal Monitors for Homeland Security Applications

    International Nuclear Information System (INIS)

    Pagh, Richard T.; Kouzes, Richard T.; McConn, Ronald J.; Robinson, Sean M.; Schweppe, John E.; Siciliano, Edward R.

    2005-01-01

    Radiation Portal Monitors (RPMs) are currently being used at our nation's borders to detect potential nuclear threats. At the Pacific Northwest National Laboratory (PNNL), realistic computer models of RPMs are being developed to simulate the screening of vehicles and cargo. Detailed models of the detection equipment, vehicles, cargo containers, cargos, and radioactive sources are being used to determine the optimal configuration of detectors. These models can also be used to support work to optimize alarming algorithms so that they maximize sensitivity for items of interest while minimizing nuisance alarms triggered by legitimate radioactive material in the commerce stream. Proposed next-generation equipment is also being modeled to quantify performance and capability improvements to detect potential nuclear threats. A discussion of the methodology used to perform computer modeling for RPMs will be provided. In addition, the efforts to validate models used to perform these scenario analyses will be described. Finally, areas where improved modeling capability is needed will be discussed as a guide to future development efforts
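
    As a minimal, hedged example of the simplest alarm-threshold calculation such models can feed (the count rate, dwell time, and false-alarm target are invented):

```python
# Choose a gross-count alarm threshold from the expected Poisson background
# so that the nuisance-alarm probability stays below a target.
from scipy.stats import poisson

background_cps = 300.0       # mean background count rate, counts/s (invented)
dwell_s = 1.0                # integration time per measurement
false_alarm_target = 1e-4    # acceptable nuisance-alarm probability

mu = background_cps * dwell_s
threshold = int(poisson.ppf(1.0 - false_alarm_target, mu)) + 1  # alarm if >= this
print("alarm threshold: %d counts in %.0f s" % (threshold, dwell_s))
print("false-alarm probability: %.2e" % poisson.sf(threshold - 1, mu))
```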

  6. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL

    2017-08-01

    The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index (GCI) method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current-generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
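
    For reference, a minimal sketch of the three-grid GCI calculation; the solution values, refinement ratio, and safety factor below are invented for demonstration.

```python
# Grid Convergence Index from three systematically refined meshes:
# estimate the observed order of accuracy, then the fine-grid uncertainty.
import math

f1, f2, f3 = 2.050, 2.120, 2.280   # fine, medium, coarse solutions (invented)
r = 2.0                            # constant grid refinement ratio
FS = 1.25                          # safety factor for three-grid studies

p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)  # observed order
gci_fine = FS * abs((f2 - f1) / f1) / (r**p - 1.0)

print("observed order of accuracy p = %.2f" % p)
print("fine-grid GCI = %.2f%%" % (100.0 * gci_fine))
```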

  7. How hearing aids, background noise, and visual cues influence objective listening effort.

    Science.gov (United States)

    Picou, Erin M; Ricketts, Todd A; Hornsby, Benjamin W Y

    2013-09-01

    The purpose of this article was to evaluate factors that influence the listening effort people with hearing loss experience when processing speech. Specifically, the change in listening effort resulting from introducing hearing aids, visual cues, and background noise was evaluated. An additional exploratory aim was to investigate the possible relationships between the magnitude of listening effort change and individual listeners' working memory capacity, verbal processing speed, or lipreading skill. Twenty-seven participants with bilateral sensorineural hearing loss were fitted with linear behind-the-ear hearing aids and tested using a dual-task paradigm designed to evaluate listening effort. The primary task was monosyllable word recognition and the secondary task was a visual reaction-time task. The test conditions varied by hearing aids (unaided, aided), visual cues (auditory-only, auditory-visual), and background noise (present, absent). For all participants, the signal-to-noise ratio was set individually so that speech recognition performance in noise was approximately 60% in both the auditory-only and auditory-visual conditions. In addition to measures of listening effort, working memory capacity, verbal processing speed, and lipreading ability were measured using the Automated Operational Span Task, a Lexical Decision Task, and the Revised Shortened Utley Lipreading Test, respectively. In general, the effects measured using the objective measure of listening effort were small (~10 msec). Results indicated that background noise increased listening effort, and hearing aids reduced listening effort, while visual cues did not influence listening effort. With regard to the individual variables, verbal processing speed was negatively correlated with hearing aid benefit for listening effort; faster processors were less likely to derive benefit. Working memory capacity, verbal processing speed, and lipreading ability were related to benefit from visual cues. No

  8. Summary of computational support and general documentation for computer code (GENTREE) used in Office of Nuclear Waste Isolation Pilot Salt Site Selection Project

    International Nuclear Information System (INIS)

    Beatty, J.A.; Younker, J.L.; Rousseau, W.F.; Elayat, H.A.

    1983-01-01

    A Decision Tree Computer Model was adapted for the purposes of a Pilot Salt Site Selection Project conducted by the Office of Nuclear Waste Isolation (ONWI). A deterministic computer model was developed to structure the site selection problem, with submodels reflecting the five major outcome categories (Cost, Safety, Delay, Environment, Community Impact) to be evaluated in the decision process. Time-saving modifications were made to the tree code as part of the effort. In addition, format changes allowed retention of information items that are valuable in directing future research and in isolating key sources of variability in the Site Selection Decision Model. The deterministic code was linked to the modified tree code and the entire program was transferred to the ONWI-VAX computer for future use by the ONWI project.
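
    The record does not reproduce the model itself; as a loosely hedged illustration of a deterministic multi-attribute evaluation in the spirit of the five outcome categories above, the Python fragment below scores hypothetical candidate sites with invented weights (a generic sketch, not the GENTREE code).

        # Generic weighted-sum site scoring over the five outcome categories.
        # Weights and per-site scores are invented placeholders.
        weights = {"cost": 0.30, "safety": 0.30, "delay": 0.10,
                   "environment": 0.15, "community_impact": 0.15}

        sites = {
            "site_A": {"cost": 0.7, "safety": 0.9, "delay": 0.6,
                       "environment": 0.8, "community_impact": 0.5},
            "site_B": {"cost": 0.8, "safety": 0.7, "delay": 0.8,
                       "environment": 0.6, "community_impact": 0.7},
        }

        for name, scores in sites.items():
            composite = sum(weights[k] * scores[k] for k in weights)
            print(f"{name}: composite score = {composite:.3f}")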

  9. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  10. Using additive manufacturing in accuracy evaluation of reconstructions from computed tomography.

    Science.gov (United States)

    Smith, Erin J; Anstey, Joseph A; Venne, Gabriel; Ellis, Randy E

    2013-05-01

    Bone models derived from patient imaging and fabricated using additive manufacturing technology have many potential uses including surgical planning, training, and research. This study evaluated the accuracy of bone surface reconstruction of two diarthrodial joints, the hip and shoulder, from computed tomography. Image segmentation of the tomographic series was used to develop a three-dimensional virtual model, which was fabricated using fused deposition modelling. Laser scanning was used to compare cadaver bones, printed models, and intermediate segmentations. The overall bone reconstruction process had a reproducibility of 0.3 ± 0.4 mm. Production of the model had an accuracy of 0.1 ± 0.1 mm, while the segmentation had an accuracy of 0.3 ± 0.4 mm, indicating that segmentation accuracy was the key factor in reconstruction. Generally, the shape of the articular surfaces was reproduced accurately, with poorer accuracy near the periphery of the articular surfaces, particularly in regions with periosteum covering and where osteophytes were apparent.

  11. Global Data Grid Efforts for ATLAS

    CERN Multimedia

    Gardner, R.

    2001-01-01

    Over the past two years computational data grids have emerged as a promising new technology for large scale, data-intensive computing required by the LHC experiments, as outlined by the recent "Hoffman" review panel that addressed the LHC computing challenge. The problem essentially is to seamlessly link physicists to petabyte-scale data and computing resources, distributed worldwide, and connected by high-bandwidth research networks. Several new collaborative initiatives in Europe, the United States, and Asia have formed to address the problem. These projects are of great interest to ATLAS physicists and software developers since their objective is to offer tools that can be integrated into the core ATLAS application framework for distributed event reconstruction, Monte Carlo simulation, and data analysis, making it possible for individuals and groups of physicists to share information, data, and computing resources in new ways and at scales not previously attempted. In addition, much of the distributed IT...

  12. Effect of social influence on effort-allocation for monetary rewards.

    Science.gov (United States)

    Gilman, Jodi M; Treadway, Michael T; Curran, Max T; Calderon, Vanessa; Evins, A Eden

    2015-01-01

    Though decades of research have shown that people are highly influenced by peers, few studies have directly assessed how the value of social conformity is weighed against other types of costs and benefits. Using an effort-based decision-making paradigm with a novel social influence manipulation, we measured how social influence affected individuals' decisions to allocate effort for monetary rewards during trials with either high or low probability of receiving a reward. We found that information about the effort-allocation of peers modulated participant choices, specifically during conditions of low probability of obtaining a reward. This suggests that peer influence affects effort-based choices to obtain rewards especially under conditions of risk. This study provides evidence that people value social conformity in addition to other costs and benefits when allocating effort, and suggests that neuroeconomic studies that assess trade-offs between effort and reward should consider social environment as a factor that can influence decision-making.

  13. Effect of social influence on effort-allocation for monetary rewards.

    Directory of Open Access Journals (Sweden)

    Jodi M Gilman

    Full Text Available Though decades of research have shown that people are highly influenced by peers, few studies have directly assessed how the value of social conformity is weighed against other types of costs and benefits. Using an effort-based decision-making paradigm with a novel social influence manipulation, we measured how social influence affected individuals' decisions to allocate effort for monetary rewards during trials with either high or low probability of receiving a reward. We found that information about the effort-allocation of peers modulated participant choices, specifically during conditions of low probability of obtaining a reward. This suggests that peer influence affects effort-based choices to obtain rewards especially under conditions of risk. This study provides evidence that people value social conformity in addition to other costs and benefits when allocating effort, and suggests that neuroeconomic studies that assess trade-offs between effort and reward should consider social environment as a factor that can influence decision-making.

  14. Computational Modeling for Enhancing Soft Tissue Image Guided Surgery: An Application in Neurosurgery.

    Science.gov (United States)

    Miga, Michael I

    2016-01-01

    With the recent advances in computing, the opportunities to translate computational models to more integrated roles in patient treatment are expanding at an exciting rate. One area of considerable development has been directed towards correcting soft tissue deformation within image guided neurosurgery applications. This review captures the efforts that have been undertaken towards enhancing neuronavigation by the integration of soft tissue biomechanical models, imaging and sensing technologies, and algorithmic developments. In addition, the review speaks to the evolving role of modeling frameworks within surgery and concludes with some future directions beyond neurosurgical applications.

  15. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    Science.gov (United States)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called 'Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  16. Career Opportunities in Computer Graphics.

    Science.gov (United States)

    Langer, Victor

    1983-01-01

    Reviews the impact of computer graphics on industrial productivity. Details the computer graphics technician curriculum at Milwaukee Area Technical College and the cooperative efforts of business and industry to fund and equip the program. (SK)

  17. Computer programs in BASIC language for graphite furnace atomic absorption using the method of additions. Part 2. Documentation

    International Nuclear Information System (INIS)

    Boyle, W.G. Jr.; Ryan, D.P.

    1979-08-01

    Four computer programs, written in the BASIC language, are used for taking and processing data from an atomic absorption spectrophotometer using the graphite furnace and the method of additions for calibration. The programs chain to each other and are divided into logical sections that have been flow-charted. The chaining sequences, general features, structure, order of subroutines and functions, and the storage of data are discussed. In addition, variables are listed and defined, and a complete listing of each program with a symbol occurrence table is provided.
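
    For readers unfamiliar with the calibration scheme these programs implement, a hedged Python sketch of the core arithmetic of the method of additions: fit absorbance against added analyte concentration, then take the magnitude of the x-intercept as the sample concentration. The data points are invented.

        # Method of (standard) additions: the unknown concentration is the
        # x-intercept magnitude of the absorbance-vs-added-amount line.
        import numpy as np

        added_conc = np.array([0.0, 5.0, 10.0, 15.0])        # spikes, ng/mL
        absorbance = np.array([0.120, 0.182, 0.241, 0.305])  # measured signal

        slope, intercept = np.polyfit(added_conc, absorbance, 1)
        unknown_conc = intercept / slope
        print(f"estimated sample concentration: {unknown_conc:.2f} ng/mL")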

  18. [Earth Science Technology Office's Computational Technologies Project

    Science.gov (United States)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that the applicability of these technologies to the scientific community represented by the CT Project could be predicted and long-term strategies formulated to provide the computational resources necessary to attain the Project's anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify trends in scientific expectations, algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  19. The Effects of Hearing Aid Directional Microphone and Noise Reduction Processing on Listening Effort in Older Adults with Hearing Loss.

    Science.gov (United States)

    Desjardins, Jamie L

    2016-01-01

    Older listeners with hearing loss may exert more cognitive resources to maintain a level of listening performance similar to that of younger listeners with normal hearing. Unfortunately, this increase in cognitive load, which is often conceptualized as increased listening effort, may come at the cost of cognitive processing resources that might otherwise be available for other tasks. The purpose of this study was to evaluate the independent and combined effects of a hearing aid directional microphone and a noise reduction (NR) algorithm on reducing the listening effort older listeners with hearing loss expend on a speech-in-noise task. Participants were fitted with study-worn, commercially available behind-the-ear hearing aids. Listening effort on a sentence recognition in noise task was measured using an objective auditory-visual dual-task paradigm. The primary task required participants to repeat sentences presented in quiet and in a four-talker babble. The secondary task was a digital visual pursuit rotor-tracking test, for which participants were instructed to use a computer mouse to track a moving target around an ellipse that was displayed on a computer screen. Each of the two tasks was presented separately and concurrently at a fixed overall speech recognition performance level of 50% correct with and without the directional microphone and/or the NR algorithm activated in the hearing aids. In addition, participants reported how effortful it was to listen to the sentences in quiet and in background noise in the different hearing aid listening conditions. Fifteen older listeners with mild sloping to severe sensorineural hearing loss participated in this study. Listening effort in background noise was significantly reduced with the directional microphones activated in the hearing aids. However, there was no significant change in listening effort with the hearing aid NR algorithm compared to no noise processing. Correlation analysis between objective and self

  20. Zero Effort Technologies Considerations, Challenges, and Use in Health, Wellness, and Rehabilitation

    CERN Document Server

    Mihailidis, Alex; Hoey, Jesse

    2011-01-01

    This book introduces zero-effort technologies (ZETs), an emerging class of technology that requires little or no effort from the people who use it. ZETs use advanced techniques, such as computer vision, sensor fusion, decision-making and planning, and machine learning to autonomously operate through the collection, analysis, and application of data about the user and his/her context. This book gives an overview of ZETs, presents concepts in the development of pervasive intelligent technologies and environments for health and rehabilitation, along with an in-depth discussion of the design principles

  1. Computational geomechanics and applications at Sandia National Laboratories

    International Nuclear Information System (INIS)

    Arguello, Jose Guadalupe Jr.

    2010-01-01

    Sandia National Laboratories (SNL) is a multi-program national laboratory in the business of national security, whose primary mission is nuclear weapons (NW). It is a prime contractor to the USDOE, operating under the NNSA, and is one of the three NW national laboratories. It has a long history of involvement in the area of geomechanics, starting with some of the earliest weapons tests in Nevada. Projects at Sandia that rely on geomechanics support in general, and computational geomechanics support in particular, range from civilian programs to defense programs. SNL has had significant involvement and participation in the Waste Isolation Pilot Plant (low-level defense nuclear waste), the Yucca Mountain Project (formerly proposed for commercial spent fuel and high-level nuclear waste), and the Strategic Petroleum Reserve (the nation's emergency petroleum store). In addition, numerous industrial partners seek out our computational geomechanics expertise, and there are efforts in compressed air and natural gas storage, as well as in CO2 sequestration. Likewise, there have also been past collaborative efforts in the areas of compactable reservoir response, the response of salt structures associated with reservoirs, and basin modeling for the oil and gas industry. There are also efforts on the defense front, ranging from assessment of the vulnerability of infrastructure to defeat of hardened targets, which require an understanding and application of computational geomechanics. Several examples from some of these areas will be described and discussed to give the audience a flavor of the type of work currently being performed at Sandia in the general area of geomechanics.

  2. Characterization of Metal Powders Used for Additive Manufacturing.

    Science.gov (United States)

    Slotwinski, J A; Garboczi, E J; Stutzman, P E; Ferraris, C F; Watson, S S; Peltz, M A

    2014-01-01

    Additive manufacturing (AM) techniques can produce complex, high-value metal parts, with potential applications as critical parts, such as those found in aerospace components. The production of AM parts with consistent and predictable properties requires input materials (e.g., metal powders) with known and repeatable characteristics, which in turn requires standardized measurement methods for powder properties. First, based on our previous work, we assess the applicability of current standardized methods for powder characterization for metal AM powders. Then we present the results of systematic studies carried out on two different powder materials used for additive manufacturing: stainless steel and cobalt-chrome. The characterization of these powders is important in NIST efforts to develop appropriate measurements and standards for additive materials and to document the property of powders used in a NIST-led additive manufacturing material round robin. An extensive array of characterization techniques was applied to these two powders, in both virgin and recycled states. The physical techniques included laser diffraction particle size analysis, X-ray computed tomography for size and shape analysis, and optical and scanning electron microscopy. Techniques sensitive to structure and chemistry, including X-ray diffraction, energy dispersive analytical X-ray analysis using the X-rays generated during scanning electron microscopy, and X-Ray photoelectron spectroscopy were also employed. The results of these analyses show how virgin powder changes after being exposed to and recycled from one or more Direct Metal Laser Sintering (DMLS) additive manufacturing build cycles. In addition, these findings can give insight into the actual additive manufacturing process.
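
    As a hedged aside on one of the physical techniques listed, the sketch below computes the D10/D50/D90 percentile summary conventionally reported for particle size distributions; the diameters are synthetic stand-ins for instrument output, and note that laser diffraction instruments report volume-weighted percentiles, whereas this simplified sketch uses number-weighted ones.

        # D10/D50/D90 percentile summary of a synthetic particle size sample.
        import numpy as np

        rng = np.random.default_rng(0)
        # Lognormal diameters (micrometers), a common shape for AM powders.
        diameters_um = rng.lognormal(mean=np.log(30.0), sigma=0.35, size=10_000)

        d10, d50, d90 = np.percentile(diameters_um, [10, 50, 90])
        print(f"D10 = {d10:.1f} um, D50 = {d50:.1f} um, D90 = {d90:.1f} um")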

  3. Characterization of Metal Powders Used for Additive Manufacturing

    Science.gov (United States)

    Slotwinski, JA; Garboczi, EJ; Stutzman, PE; Ferraris, CF; Watson, SS; Peltz, MA

    2014-01-01

    Additive manufacturing (AM) techniques can produce complex, high-value metal parts, with potential applications as critical parts, such as those found in aerospace components. The production of AM parts with consistent and predictable properties requires input materials (e.g., metal powders) with known and repeatable characteristics, which in turn requires standardized measurement methods for powder properties. First, based on our previous work, we assess the applicability of current standardized methods for powder characterization for metal AM powders. Then we present the results of systematic studies carried out on two different powder materials used for additive manufacturing: stainless steel and cobalt-chrome. The characterization of these powders is important in NIST efforts to develop appropriate measurements and standards for additive materials and to document the property of powders used in a NIST-led additive manufacturing material round robin. An extensive array of characterization techniques was applied to these two powders, in both virgin and recycled states. The physical techniques included laser diffraction particle size analysis, X-ray computed tomography for size and shape analysis, and optical and scanning electron microscopy. Techniques sensitive to structure and chemistry, including X-ray diffraction, energy dispersive analytical X-ray analysis using the X-rays generated during scanning electron microscopy, and X-Ray photoelectron spectroscopy were also employed. The results of these analyses show how virgin powder changes after being exposed to and recycled from one or more Direct Metal Laser Sintering (DMLS) additive manufacturing build cycles. In addition, these findings can give insight into the actual additive manufacturing process. PMID:26601040

  4. Model Additional Protocol

    International Nuclear Information System (INIS)

    Rockwood, Laura

    2001-01-01

    Since the end of the cold war a series of events has changed the circumstances and requirements of the safeguards system. The discovery of a clandestine nuclear weapons program in Iraq, the continuing difficulty in verifying the initial report of the Democratic People's Republic of Korea upon entry into force of their safeguards agreement, and the decision of the South African Government to give up its nuclear weapons program and join the Treaty on the Non-Proliferation of Nuclear Weapons have all played a role in an ambitious effort by IAEA Member States and the Secretariat to strengthen the safeguards system. A major milestone in this effort was reached in May 1997 when the IAEA Board of Governors approved a Model Protocol Additional to Safeguards Agreements. The Model Additional Protocol was negotiated over a period of less than a year by an open-ended committee of the Board involving some 70 Member States and two regional inspectorates. The IAEA is now in the process of negotiating additional protocols, State by State, and implementing them. These additional protocols will provide the IAEA with rights of access to information about all activities related to the use of nuclear material in States with comprehensive safeguards agreements and greatly expanded physical access for IAEA inspectors to confirm or verify this information. In conjunction with this, the IAEA is working on the integration of these measures with those provided for in comprehensive safeguards agreements, with a view to maximizing, within available resources, the effectiveness and efficiency of safeguards implementation. Details concerning the Model Additional Protocol are given. (author)

  5. Prospective Algorithms for Quantum Evolutionary Computation

    OpenAIRE

    Sofge, Donald A.

    2008-01-01

    This effort examines the intersection of the emerging field of quantum computing and the more established field of evolutionary computation. The goal is to understand what benefits quantum computing might offer to computational intelligence and how computational intelligence paradigms might be implemented as quantum programs to be run on a future quantum computer. We critically examine proposed algorithms and methods for implementing computational intelligence paradigms, primarily focused on ...

  6. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Science.gov (United States)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  7. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Energy Technology Data Exchange (ETDEWEB)

    King, W. E., E-mail: weking@llnl.gov [Physical and Life Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A. [Engineering Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Kamath, C. [Computation Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Rubenchik, A. M. [NIF and Photon Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  8. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    International Nuclear Information System (INIS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-01-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process

  9. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety-relevant tasks. The experience gained with the control rod positioning processor confirms that computers are no less reliable than conventional instrumentation and control systems for comparable tasks. The examination and evaluation of computers for safety-relevant tasks can be done with programme analysis or with statistical evaluation of the operating experience. Programme analysis is recommended for seldom-used and well-structured programmes. For programmes with a long cumulated operating time, a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process-controlling computers or microprocessors can be qualified for safety-relevant tasks without undue effort. (orig./HP) [de]
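
    As a worked illustration of the statistical evaluation of operating experience mentioned above, here is a hedged sketch under the simplifying assumptions of a constant failure rate and zero observed failures over the quoted cumulated operating time:

        # Zero-failure upper confidence bound on a constant failure rate:
        # with no failures in total time T, the one-sided 95% bound is
        # -ln(0.05)/T, i.e. roughly the "rule of three" (3/T).
        import math

        T_hours = 82 * 8760.0                  # ~82 cumulated years of operation
        lam_upper = -math.log(0.05) / T_hours  # failures per hour, 95% upper bound
        print(f"95% upper bound on failure rate: {lam_upper:.2e} /h")
        print(f"=> demonstrated MTBF of at least {1.0/lam_upper:.2e} h")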

  10. Context-dependent memory decay is evidence of effort minimization in motor learning: a computational study.

    Science.gov (United States)

    Takiyama, Ken

    2015-01-01

    Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this transfer of the learning effect can be reproduced by certain theoretical frameworks. Although most theoretical frameworks have assumed that a motor memory trained with a certain movement decays at the same speed during performance of the trained movement as during non-trained movements, a recent study reported that the motor memory decays faster during performance of the trained movement than during non-trained movements; i.e., the decay rate of motor memory is movement or context dependent. Although motor learning has been successfully modeled based on an optimization framework, e.g., movement error minimization, the type of optimization that can lead to context-dependent memory decay is unclear. Thus, context-dependent memory decay raises the question of what is optimized in motor learning. To reproduce context-dependent memory decay, I extend a motor primitive framework. Specifically, I introduce motor effort optimization into the framework because some previous studies have reported the existence of effort optimization in motor learning processes and no conventional motor primitive model has yet considered the optimization. Here, I analytically and numerically revealed that context-dependent decay is a result of motor effort optimization. My analyses suggest that context-dependent decay is not merely memory decay but is evidence of motor effort optimization in motor learning.
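
    A minimal trial-by-trial sketch of the two processes this abstract builds on, error minimization plus memory decay, with illustrative parameter values rather than values fitted to data:

        # State-space sketch: on each trial the memory x is partially
        # forgotten (retention a < 1) and corrected toward the perturbation
        # p in proportion to the movement error e (learning rate b).
        a, b, p = 0.98, 0.2, 1.0
        x = 0.0
        for trial in range(50):
            e = p - x          # movement error on this trial
            x = a * x + b * e  # decay plus error-driven update

        # Steady state of the update equation: x* = b*p / (1 - a + b).
        print(f"memory after 50 trials: {x:.3f}; asymptote {b*p/(1 - a + b):.3f}")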

  11. Context-dependent memory decay is evidence of effort minimization in motor learning: A computational study

    Directory of Open Access Journals (Sweden)

    Ken eTakiyama

    2015-02-01

    Full Text Available Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this transfer of the learning effect can be reproduced by certain theoretical frameworks. Although most theoretical frameworks have assumed that a motor memory trained with a certain movement decays at the same speed during performance of the trained movement as during non-trained movements, a recent study reported that the motor memory decays faster during performance of the trained movement than during non-trained movements; i.e., the decay rate of motor memory is movement or context dependent. Although motor learning has been successfully modeled based on an optimization framework, e.g., movement error minimization, the type of optimization that can lead to context-dependent memory decay is unclear. Thus, context-dependent memory decay raises the question of what is optimized in motor learning. To reproduce context-dependent memory decay, I extend a motor primitive framework. Specifically, I introduce motor effort optimization into the framework because some previous studies have reported the existence of effort optimization in motor learning processes and no conventional motor primitive model has yet considered the optimization. Here, I analytically and numerically revealed that context-dependent decay is a result of motor effort optimization. My analyses suggest that context-dependent decay is not merely memory decay but is evidence of motor effort optimization in motor learning.

  12. Children’s Sleep and Academic Achievement: The Moderating Role of Effortful Control

    Science.gov (United States)

    Diaz, Anjolii; Berger, Rebecca; Valiente, Carlos; Eisenberg, Nancy; VanSchyndel, Sarah; Tao, Chun; Spinrad, Tracy L.; Doane, Leah D.; Thompson, Marilyn S.; Silva, Kassondra M.; Southworth, Jody

    2016-01-01

    Poor sleep is thought to interfere with children’s learning and academic achievement (AA). However, existing research and theory indicate there are factors that may mitigate the academic risk associated with poor sleep. The purpose of this study was to examine the moderating role of children’s effortful control (EC) on the relation between sleep and AA in young children. One hundred and three 4.5- to 7-year-olds (M = 5.98 years, SD = 0.61) wore a wrist-based actigraph for five continuous weekday nights. Teachers and coders reported on children’s EC. EC was also assessed with a computer-based task at school. Additionally, we obtained a standardized measure of children’s AA. There was a positive main effect of sleep efficiency on AA. Several relations between sleep and AA were moderated by EC, and examination of the simple slopes indicated that the negative relation between sleep and AA was significant only at low levels of EC. PMID:28255190
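
    A hedged sketch of the moderation analysis described here, with simulated data and the statsmodels formula interface; the variable names, effect sizes, and the way the interaction is constructed are all assumptions for illustration.

        # Moderated regression: achievement on sleep, effortful control (EC),
        # and their interaction; the 'sleep:ec' coefficient tests moderation.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 103
        ec = rng.normal(0.0, 1.0, n)
        sleep = rng.normal(0.0, 1.0, n)
        # Simulated ground truth: sleep predicts achievement mainly at low EC.
        aa = 0.3 * ec + 0.4 * sleep * (ec < 0) + rng.normal(0.0, 1.0, n)

        df = pd.DataFrame({"aa": aa, "sleep": sleep, "ec": ec})
        fit = smf.ols("aa ~ sleep * ec", data=df).fit()
        print(fit.params)  # simple slope of sleep = sleep + (sleep:ec) * ec_level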

  13. Using a cloud to replenish parched groundwater modeling efforts

    Science.gov (United States)

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.
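
    A present-day, hedged sketch of the workflow described (create virtual machines on demand, use them, and terminate them when the job is done), written against the AWS boto3 API rather than whatever tooling the authors used in 2010; the region, AMI ID, and instance type are placeholders.

        # Launch a small fleet of on-demand instances for a calibration run,
        # then terminate them. Region, AMI, and instance type are placeholders.
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")

        resp = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",   # placeholder machine image
            InstanceType="c5.xlarge",          # placeholder instance type
            MinCount=1,
            MaxCount=8,                        # one worker per parameter set
        )
        instance_ids = [inst["InstanceId"] for inst in resp["Instances"]]
        print("launched:", instance_ids)

        # ... dispatch model calibration / uncertainty-analysis jobs here ...

        # Stop paying by the hour as soon as the runs complete.
        ec2.terminate_instances(InstanceIds=instance_ids)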

  14. Using a cloud to replenish parched groundwater modeling efforts.

    Science.gov (United States)

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  15. Teachable Agents and the Protege Effect: Increasing the Effort towards Learning

    Science.gov (United States)

    Chase, Catherine C.; Chin, Doris B.; Oppezzo, Marily A.; Schwartz, Daniel L.

    2009-01-01

    Betty's Brain is a computer-based learning environment that capitalizes on the social aspects of learning. In Betty's Brain, students instruct a character called a Teachable Agent (TA) which can reason based on how it is taught. Two studies demonstrate the "protege effect": students make greater effort to learn for their TAs than they do…

  16. Hydrogen economy: a little bit more effort

    International Nuclear Information System (INIS)

    Pauron, M.

    2008-01-01

    In a few years, the use of hydrogen in the economy has become a credible possibility. Today, billions of euros are invested in the hydrogen industry, which is strengthened by technological advances in fuel cell development and by increasing optimism. However, additional research efforts and more financing will be necessary to make the dream of a hydrogen-based economy a reality.

  17. Computations on the primary photoreaction of Br2 with CO2: stepwise vs concerted addition of Br atoms.

    Science.gov (United States)

    Xu, Kewei; Korter, Timothy M; Braiman, Mark S

    2015-04-09

    It was proposed previously that Br2-sensitized photolysis of liquid CO2 proceeds through a metastable primary photoproduct, CO2Br2. Possible mechanisms for such a photoreaction are explored here computationally. First, it is shown that the CO2Br radical is not stable in any geometry. This rules out a free-radical mechanism, for example, photochemical splitting of Br2 followed by stepwise addition of Br atoms to CO2, which in turn accounts for the lack of previously observed Br2 + CO2 photochemistry in the gas phase. A possible alternative mechanism in the liquid phase is formation of a weakly bound CO2:Br2 complex, followed by concerted photoaddition of Br2. This hypothesis is suggested by the previously published spectroscopic detection of a binary CO2:Br2 complex in the supersonically cooled gas phase. We compute a global binding-energy minimum of -6.2 kJ/mol for such complexes, in a linear geometry. Two additional local minima were computed for perpendicular (C2v) and nearly parallel asymmetric planar geometries, both with binding energies near -5.4 kJ/mol. In these two latter geometries, C-Br and O-Br bond distances are simultaneously in the range of 3.5-3.8 Å, that is, perhaps suitable for a concerted photoaddition under the temperature and pressure conditions where Br2 + CO2 photochemistry has been observed.
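
    For clarity, the binding energies quoted above follow the standard supermolecular definition, restated here in generic notation (an editorial gloss, not necessarily the authors' own formula):

        \[
          E_{\mathrm{bind}} = E(\mathrm{CO_2{:}Br_2}) - E(\mathrm{CO_2}) - E(\mathrm{Br_2})
        \]

    A bound complex therefore has a negative binding energy, as with the -6.2 kJ/mol global minimum; in practice such values are often reported with a counterpoise correction for basis set superposition error.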

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  19. Computer Simulation Western

    International Nuclear Information System (INIS)

    Rasmussen, H.

    1992-01-01

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  20. Preparing Future Secondary Computer Science Educators

    Science.gov (United States)

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  1. Control and Effort Costs Influence the Motivational Consequences of Choice

    Directory of Open Access Journals (Sweden)

    Holly Sullivan-Toole

    2017-05-01

    Full Text Available The act of making a choice, apart from any outcomes the choice may yield, has, paradoxically, been linked to both the enhancement and the detriment of intrinsic motivation. Research has implicated two factors in potentially mediating these contradictory effects: the personal control conferred by a choice and the costs associated with a choice. Across four experiments, utilizing a physical effort task disguised as a simple video game, we systematically varied costs across two levels of physical effort requirements (Low-Requirement, High-Requirement) and control over effort costs across three levels of choice (Free-Choice, Restricted-Choice, and No-Choice) to disambiguate how these factors affect the motivational consequences of choosing within an effortful task. Together, our results indicated that, in the face of effort requirements, illusory control alone may not sufficiently enhance perceptions of personal control to boost intrinsic motivation; rather, the experience of actual control may be necessary to overcome effort costs and elevate performance. Additionally, we demonstrated that conditions of illusory control, while otherwise unmotivating, can, through association with the experience of free-choice, be transformed to have a positive effect on motivation.

  2. Green Computing in Local Governments and Information Technology Companies

    Directory of Open Access Journals (Sweden)

    Badar Agung Nugroho

    2013-06-01

    Full Text Available Green computing is the study and practice of designing, manufacturing, using, and disposing of information and communication devices efficiently and effectively with minimum impact on the environment. If the green computing concept is implemented, it will help agencies and companies reduce the energy and capital costs of their IT infrastructure. The goal of this research is to explore the current state of efforts by local governments and IT companies in West Java to implement the green computing concept in their working environments. The primary data were collected through focus group discussions with representatives of local governments and IT companies who are responsible for managing their IT infrastructure. Secondary data were then collected through brief observations of the actual green computing efforts at each institution. The results show that perspectives on, and efforts toward, green computing implementation differ considerably between local governments and IT companies.

  3. Development of computer code SIMPSEX for simulation of FBR fuel reprocessing flowsheets: II. additional benchmarking results

    International Nuclear Information System (INIS)

    Shekhar Kumar; Koganti, S.B.

    2003-07-01

    Benchmarking and application of the computer code SIMPSEX for high-plutonium FBR flowsheets were reported in an earlier report (IGC-234). Improvements and recompilation of the code (Version 4.01, March 2003) required re-validation with the existing benchmarks as well as additional benchmark flowsheets. Improvements in the high-Pu region (aqueous Pu >30 g/L) led to better results in the 75% Pu flowsheet benchmark. Below 30 g/L aqueous Pu concentration, results were identical to those from the earlier version (SIMPSEX Version 3, compiled in 1999). In addition, 13 published flowsheets were taken as additional benchmarks. Eleven of these flowsheets cover a wide range of feed concentrations, and a few of them are β-γ active runs with FBR fuels having a wide distribution of burnup and Pu ratios. A published total partitioning flowsheet using externally generated U(IV) was also simulated using SIMPSEX. SIMPSEX predictions were compared with published predictions from the conventional SEPHIS, PUMA, PUNE and PUBG codes, and were found to be comparable to, and in some cases better than, the results from those codes. In addition, recently reported UREX demo results along with AMUSE simulations are also compared with SIMPSEX predictions. Results of benchmarking SIMPSEX with these 14 flowsheets are discussed in this report. (author)

  4. Comparison of cardiovascular response to combined static-dynamic effort, postprandial dynamic effort and dynamic effort alone in patients with chronic ischemic heart disease

    International Nuclear Information System (INIS)

    Hung, J.; McKillip, J.; Savin, W.; Magder, S.; Kraus, R.; Houston, N.; Goris, M.; Haskell, W.; DeBusk, R.

    1982-01-01

    The cardiovascular responses to combined static-dynamic effort, postprandial dynamic effort and dynamic effort alone were evaluated by upright bicycle ergometry during equilibrium-gated blood pool scintigraphy in 24 men, mean age 59 +/- 8 years, with chronic ischemic heart disease. Combined static-dynamic effort and the postprandial state elicited a peak cardiovascular response similar to that of dynamic effort alone. Heart rate, intraarterial systolic and diastolic pressures, rate-pressure product and ejection fraction were similar for the three test conditions at the onset of ischemia and at peak effort. The prevalence and extent of exercise-induced ischemic left ventricular dysfunction, ST-segment depression, angina pectoris and ventricular ectopic activity were also similar during the three test conditions. Direct and indirect measurements of systolic and diastolic blood pressure were highly correlated. The onset of ischemic ST-segment depression and angina pectoris correlated as strongly with heart rate alone as with the rate-pressure product during all three test conditions. The cardiovascular response to combined static-dynamic effort and to postprandial dynamic effort becomes more similar to that of dynamic effort alone as dynamic effort reaches a symptom limit. If significant ischemic and arrhythmic abnormalities are absent during symptom-limited dynamic exercise testing, they are unlikely to appear during combined static-dynamic or postprandial dynamic effort

  5. Computers in Schools: White Boys Only?

    Science.gov (United States)

    Hammett, Roberta F.

    1997-01-01

    Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)

  6. SU-F-J-219: Predicting Ventilation Change Due to Radiation Therapy: Dependency On Pre-RT Ventilation and Effort Correction

    Energy Technology Data Exchange (ETDEWEB)

    Patton, T; Du, K; Bayouth, J [University of Wisconsin, Madison, WI (United States); Christensen, G; Reinhardt, J [University of Iowa, Iowa City, IA (United States)

    2016-06-15

    Purpose: Ventilation change caused by radiation therapy (RT) can be predicted using four-dimensional computed tomography (4DCT) and image registration. This study tested the dependency of predicted post-RT ventilation on effort correction and pre-RT lung function. Methods: Pre-RT and 3-month post-RT 4DCT images were obtained for 13 patients. The 4DCT images were used to create ventilation maps using a deformable image registration based Jacobian expansion calculation. The post-RT ventilation maps were predicted in four different ways using the dose delivered, pre-RT ventilation, and effort correction. The pre-RT ventilation and effort correction were toggled to determine dependency. The four different predicted ventilation maps were compared to the post-RT ventilation map calculated from image registration to establish the best prediction method. Gamma pass rates were used to compare the different maps with criteria of 2 mm distance-to-agreement and 6% ventilation difference. Paired t-tests of gamma pass rates were used to determine significant differences between the maps. Additional gamma pass rates were calculated using only voxels receiving over 20 Gy. Results: The predicted post-RT ventilation maps were in agreement with the actual post-RT maps in the following percentages of voxels, averaged over all subjects: 71% with pre-RT ventilation and effort correction, 69% with no pre-RT ventilation and effort correction, 60% with pre-RT ventilation and no effort correction, and 58% with no pre-RT ventilation and no effort correction. When analyzing only voxels receiving over 20 Gy, the gamma pass rates were, respectively, 74%, 69%, 65%, and 55%. The prediction including both pre-RT ventilation and effort correction was the only prediction with significant improvement over using no prediction (p<0.02). Conclusion: Post-RT ventilation is best predicted using both pre-RT ventilation and effort correction. This is the only prediction that provided a significant
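
    A hedged numpy sketch of the registration-based ventilation measure referenced above: local volume change estimated from the Jacobian determinant of the deformation phi(x) = x + u(x). The displacement field here is random stand-in data, not an actual registration result.

        # Voxelwise Jacobian-determinant ventilation map from a displacement
        # field u with components ordered (z, y, x); unit voxel spacing assumed.
        import numpy as np

        rng = np.random.default_rng(2)
        u = rng.normal(0.0, 0.05, size=(3, 32, 32, 32))  # stand-in displacements

        grads = [np.gradient(u[c], axis=(0, 1, 2)) for c in range(3)]
        J = np.zeros((3, 3, 32, 32, 32))
        for i in range(3):
            for j in range(3):
                J[i, j] = grads[i][j]     # du_i/dx_j
            J[i, i] += 1.0                # Jacobian of phi is I + grad(u)

        # Determinant of the 3x3 matrix at every voxel.
        jac_det = np.linalg.det(np.moveaxis(J, (0, 1), (-2, -1)))
        ventilation = jac_det - 1.0       # >0 expansion, <0 contraction
        print(f"mean |fractional volume change|: {np.abs(ventilation).mean():.4f}")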

  7. Computer networks and advanced communications

    International Nuclear Information System (INIS)

    Koederitz, W.L.; Macon, B.S.

    1992-01-01

    One of the major methods for getting the most productivity and benefits from computer usage is networking. However, for those who are contemplating a change from stand-alone computers to a network system, the investigation of actual networks in use presents a paradox: network systems can be highly productive and beneficial; at the same time, these networks can create many complex, frustrating problems. The issue becomes a question of whether the benefits of networking are worth the extra effort and cost. In response to this issue, the authors review in this paper the implementation and management of an actual network in the LSU Petroleum Engineering Department. The network, which has been in operation for four years, is large and diverse (50 computers, 2 sites, PC's, UNIX RISC workstations, etc.). The benefits, costs, and method of operation of this network will be described, and an effort will be made to objectively weigh these elements from the point of view of the average computer user

  8. Large Scale Computing and Storage Requirements for High Energy Physics

    International Nuclear Information System (INIS)

    Gerber, Richard A.; Wasserman, Harvey

    2010-01-01

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes

  9. The role of cognitive effort in subjective reward devaluation and risky decision-making.

    Science.gov (United States)

    Apps, Matthew A J; Grima, Laura L; Manohar, Sanjay; Husain, Masud

    2015-11-20

    Motivation is underpinned by cost-benefit valuations where costs, such as physical effort or outcome risk, are subjectively weighed against available rewards. However, in many environments risks pertain not to the variance of outcomes, but to variance in the possible levels of effort required to obtain rewards (effort risks). Moreover, motivation is often guided by the extent to which cognitive, not physical, effort devalues rewards (effort discounting). Yet, very little is known about the mechanisms that underpin the influence of cognitive effort risks or discounting on motivation. We used two cost-benefit decision-making tasks to probe subjective sensitivity to cognitive effort (number of shifts of spatial attention) and to effort risks. Our results show that shifts of spatial attention when monitoring rapidly presented visual stimuli are perceived as effortful and devalue rewards. Additionally, most people are risk-averse, preferring safe, known amounts of effort over risky offers. However, there was no correlation between their effort and risk sensitivity. We show for the first time that people are averse to variance in the possible amount of cognitive effort to be exerted. These results suggest that cognitive effort sensitivity and risk sensitivity are underpinned by distinct psychological and neurobiological mechanisms.
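
    A minimal sketch of one common way such effort discounting and effort-risk preferences are formalized; the parabolic cost form and the parameter values are assumptions, not the authors' model.

        # Parabolic effort discounting: subjective value falls with the square
        # of required effort. A convex cost alone makes a fixed effort level
        # preferable to an effort gamble with the same mean, i.e. risk aversion.
        def subjective_value(reward, effort, k=0.08):
            return reward - k * effort**2

        safe = subjective_value(10.0, 4.0)   # known effort of 4 units
        risky = 0.5 * subjective_value(10.0, 2.0) + 0.5 * subjective_value(10.0, 6.0)
        print(f"safe SV = {safe:.2f}, risky SV = {risky:.2f}")  # safe > risky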

  10. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product, and to introduce the successful application of soft computing techniques to solve many hard problems encountered during the design of embedded hardware. Reconfigurable em...

  11. Dopamine, Effort-Based Choice, and Behavioral Economics: Basic and Translational Research.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Yang, Jen-Hau; Rotolo, Renee; Presby, Rose

    2018-01-01

    Operant behavior is not only regulated by factors related to the quality or quantity of reinforcement, but also by the work requirements inherent in performing instrumental actions. Moreover, organisms often make effort-related decisions involving economic choices such as cost/benefit analyses. Effort-based decision making is studied using behavioral procedures that offer choices between high-effort options leading to relatively preferred reinforcers vs. low effort/low reward choices. Several neural systems, including the mesolimbic dopamine (DA) system and other brain circuits, are involved in regulating effort-related aspects of motivation. Considerable evidence indicates that mesolimbic DA transmission exerts a bi-directional control over exertion of effort on instrumental behavior tasks. Interference with DA transmission produces a low-effort bias in animals tested on effort-based choice tasks, while increasing DA transmission with drugs such as DA transport blockers tends to enhance selection of high-effort options. The results from these pharmacology studies are corroborated by the findings from recent articles using optogenetic, chemogenetic and physiological techniques. In addition to providing important information about the neural regulation of motivated behavior, effort-based choice tasks are useful for developing animal models of some of the motivational symptoms that are seen in people with various psychiatric and neurological disorders (e.g., depression, schizophrenia, Parkinson's disease). Studies of effort-based decision making may ultimately contribute to the development of novel drug treatments for motivational dysfunction.

  12. Dopamine, Effort-Based Choice, and Behavioral Economics: Basic and Translational Research

    Directory of Open Access Journals (Sweden)

    John D. Salamone

    2018-03-01

    Full Text Available Operant behavior is not only regulated by factors related to the quality or quantity of reinforcement, but also by the work requirements inherent in performing instrumental actions. Moreover, organisms often make effort-related decisions involving economic choices such as cost/benefit analyses. Effort-based decision making is studied using behavioral procedures that offer choices between high-effort options leading to relatively preferred reinforcers vs. low effort/low reward choices. Several neural systems, including the mesolimbic dopamine (DA) system and other brain circuits, are involved in regulating effort-related aspects of motivation. Considerable evidence indicates that mesolimbic DA transmission exerts a bi-directional control over exertion of effort on instrumental behavior tasks. Interference with DA transmission produces a low-effort bias in animals tested on effort-based choice tasks, while increasing DA transmission with drugs such as DA transport blockers tends to enhance selection of high-effort options. The results from these pharmacology studies are corroborated by the findings from recent articles using optogenetic, chemogenetic and physiological techniques. In addition to providing important information about the neural regulation of motivated behavior, effort-based choice tasks are useful for developing animal models of some of the motivational symptoms that are seen in people with various psychiatric and neurological disorders (e.g., depression, schizophrenia, Parkinson’s disease). Studies of effort-based decision making may ultimately contribute to the development of novel drug treatments for motivational dysfunction.

  13. Quantum Computation--The Ultimate Frontier

    OpenAIRE

    Adami, Chris; Dowling, Jonathan P.

    2002-01-01

    The discovery of an algorithm for factoring which runs in polynomial time on a quantum computer has given rise to a concerted effort to understand the principles, advantages, and limitations of quantum computing. At the same time, many different quantum systems are being explored for their suitability to serve as a physical substrate for the quantum computer of the future. I discuss some of the theoretical foundations of quantum computer science, including algorithms and error correction, and...

  14. Parallel quantum computing in a single ensemble quantum computer

    International Nuclear Information System (INIS)

    Long Guilu; Xiao, L.

    2004-01-01

    We propose a parallel quantum computing mode for ensemble quantum computers. In this mode, some qubits are in pure states while other qubits are in mixed states. It enables a single ensemble quantum computer to perform 'single-instruction, multiple-data' (SIMD) parallel computation. Parallel quantum computing can provide additional speedup in Grover's algorithm and Shor's algorithm. In addition, it also makes fuller use of qubit resources in an ensemble quantum computer. As a result, some qubits discarded in the preparation of an effective pure state in the Schulman-Vazirani and the Cleve-DiVincenzo algorithms can be reutilized.

  15. Analysis Efforts Supporting NSTX Upgrades

    International Nuclear Information System (INIS)

    Zhang, H.; Titus, P.; Rogoff, P.; Zolfaghari, A.; Mangra, D.; Smith, M.

    2010-01-01

    The National Spherical Torus Experiment (NSTX) is a low-aspect-ratio, spherical torus (ST) configuration device located at Princeton Plasma Physics Laboratory (PPPL). This device is presently being upgraded to enhance its physics capabilities by doubling the TF field to 1 tesla and increasing the plasma current to 2 mega-amperes. The upgrades include a replacement of the centerstack and the addition of a second neutral beam. The upgrade analyses have two missions. The first is to support the design of new components, principally the centerstack; the second is to qualify existing NSTX components for the higher loads, which will increase by a factor of four. Cost efficiency was a design goal both for new equipment qualification and for reanalysis of the existing components. Showing that older components can sustain the increased loads has been a challenging effort, in which designs had to be developed that would limit loading on weaker components and minimize the extent of modifications needed. Two areas representing this effort have been chosen to describe in more detail: analysis of the current distribution in the new TF inner legs, and analysis of the out-of-plane support of the existing TF outer legs.

  16. Estimation of inspection effort

    International Nuclear Information System (INIS)

    Mullen, M.F.; Wincek, M.A.

    1979-06-01

    An overview of IAEA inspection activities is presented, and the problem of evaluating the effectiveness of an inspection is discussed. Two models are described: an effort model and an effectiveness model. The effort model breaks the IAEA's inspection effort into components; the amount of effort required for each component is estimated; and the total effort is determined by summing the effort for each component. The effectiveness model quantifies the effectiveness of inspections in terms of probabilities of detection and quantities of material to be detected, if diverted over a specific period. The method is applied to a 200-metric-ton-per-year low-enriched uranium fuel fabrication facility. A description of the model plant is presented, a safeguards approach is outlined, and sampling plans are calculated. The required inspection effort is estimated and the results are compared to IAEA estimates. Some other applications of the method are discussed briefly. Examples are presented which demonstrate how the method might be useful in formulating guidelines for inspection planning and in establishing technical criteria for safeguards implementation.
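
    The detection-probability side of such an effectiveness model can be illustrated with the standard attribute-sampling formula (a generic textbook expression; the report's own sampling plans are not reproduced here): if d of N items have been diverted and n are inspected at random, the probability of catching at least one is hypergeometric.

```python
from math import comb

def detection_probability(N, d, n):
    """P(at least one of d diverted items appears in a random sample of n
    out of N items) - the standard attribute-sampling formula."""
    if n + d > N:
        return 1.0  # the sample is large enough that detection is certain
    return 1 - comb(N - d, n) / comb(N, n)

# Hypothetical numbers: 200 items, 5 diverted, increasing inspection effort
for n in (10, 20, 50):
    print(f"sample size {n}: P(detect) = {detection_probability(200, 5, n):.3f}")
```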

  17. Multiple Embedded Processors for Fault-Tolerant Computing

    Science.gov (United States)

    Bolotin, Gary; Watson, Robert; Katanyoutanant, Sunant; Burke, Gary; Wang, Mandy

    2005-01-01

    A fault-tolerant computer architecture has been conceived in an effort to reduce vulnerability to single-event upsets (spurious bit flips caused by impingement of energetic ionizing particles or photons). As in some prior fault-tolerant architectures, the redundancy needed for fault tolerance is obtained by use of multiple processors in one computer. Unlike prior architectures, the multiple processors are embedded in a single field-programmable gate array (FPGA). What makes this new approach practical is the recent commercial availability of FPGAs that are capable of having multiple embedded processors. A working prototype (see figure) consists of two embedded IBM PowerPC 405 processor cores and a comparator built on a Xilinx Virtex-II Pro FPGA. This relatively simple instantiation of the architecture implements an error-detection scheme. A planned future version, incorporating four processors and two comparators, would correct some errors in addition to detecting them.
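
    The detection scheme amounts to lockstep duplicate-and-compare. A software analogy of the idea (purely illustrative; the actual design compares the two embedded processor cores in FPGA hardware):

```python
import random

def compute(x):
    """Stand-in for one processor core's computation."""
    return x * x + 1

def maybe_upset(value, p=0.001):
    """Inject a single-event upset: flip one random bit with probability p."""
    if random.random() < p:
        value ^= 1 << random.randrange(32)
    return value

def lockstep(x):
    """Run the computation on two 'cores' and compare the results.
    A mismatch detects (but cannot correct) an upset; the planned
    four-processor, two-comparator version could also out-vote a
    single faulty result and correct it."""
    a = maybe_upset(compute(x))
    b = maybe_upset(compute(x))
    if a != b:
        raise RuntimeError("single-event upset detected; recompute")
    return a

print(lockstep(12))
```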

  18. Additional extensions to the NASCAP computer code, volume 3

    Science.gov (United States)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  19. Additively Manufactured Ceramic Rocket Engine Components

    Data.gov (United States)

    National Aeronautics and Space Administration — HRL Laboratories, LLC, with Vector Space Systems (VSS) as subcontractor, has a 24-month effort to develop additive manufacturing technology for reinforced ceramic...

  20. Integrated Computational Material Engineering Technologies for Additive Manufacturing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — QuesTek Innovations, a pioneer in Integrated Computational Materials Engineering (ICME) and a Tibbetts Award recipient, is teaming with University of Pittsburgh,...

  1. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important; this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Limits on fundamental limits to computation.

    Science.gov (United States)

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.

  3. Human Capital: Additional Actions Needed to Enhance DOD’s Efforts to Address Mental Health Care Stigma

    Science.gov (United States)

    2016-04-01

    [Table fragment: servicemember (n=186) and civilian (n=21-22) survey responses on item (h), substance abuse (alcohol or drugs); source: GAO.] Why GAO Did This Study: A 2010 Department of Defense (DOD) Task Force on the Prevention of Suicide by Members of the Armed Forces concluded that stigma—the negative...

  4. Computational Fluid Dynamics (CFD)-Based Droplet Size Estimates in Emulsification Equipment

    Directory of Open Access Journals (Sweden)

    Jo Janssen

    2016-12-01

    Full Text Available While academic literature shows steady progress in combining multi-phase computational fluid dynamics (CFD) and population balance modelling (PBM) of emulsification processes, the computational burden of this approach is still too large for routine use in industry. The challenge, thus, is to link a sufficiently detailed flow analysis to the droplet behavior in a way that is both physically relevant and computationally manageable. In this research article we propose the use of single-phase CFD to map out the local maximum stable droplet diameter within a given device, based on well-known academic droplet break-up studies in quasi-steady 2D linear flows. The results of the latter are represented by analytical correlations for the critical capillary number, which are valid across a wide viscosity ratio range. Additionally, we suggest a parameter to assess how good the assumption of quasi-steady 2D flow is locally. The approach is demonstrated for a common lab-scale rotor-stator device (Ultra-Turrax, IKA-Werke GmbH, Staufen, Germany). It is found to provide useful insights with minimal additional user coding and little increase in computational effort compared to the single-phase CFD simulations of the flow field, as such. Some suggestions for further development are briefly discussed.
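
    The mapping from a CFD-computed local shear rate to a maximum stable droplet size follows from the capillary number Ca = μc·γ̇·(d/2)/σ: break-up occurs where Ca exceeds the critical value Ca_cr from the break-up correlations. A minimal sketch of that final step (generic formula; the article's specific Ca_cr correlations over the viscosity-ratio range are not reproduced):

```python
def d_max(shear_rate, mu_c, sigma, ca_crit):
    """Maximum stable droplet diameter from the capillary number criterion
    Ca = mu_c * shear_rate * (d/2) / sigma, i.e.
    d_max = 2 * Ca_cr * sigma / (mu_c * shear_rate).

    shear_rate : local shear rate from the single-phase CFD solution [1/s]
    mu_c       : continuous-phase viscosity [Pa.s]
    sigma      : interfacial tension [N/m]
    ca_crit    : critical capillary number from a break-up correlation
                 (a function of the viscosity ratio; hypothetical value here)
    """
    return 2.0 * ca_crit * sigma / (mu_c * shear_rate)

# Example: water-like continuous phase in a rotor-stator gap (invented numbers)
print(d_max(shear_rate=5e4, mu_c=1e-3, sigma=0.01, ca_crit=0.5))  # 2e-4 m
```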

  5. Effortful echolalia.

    Science.gov (United States)

    Hadano, K; Nakamura, H; Hamanaka, T

    1998-02-01

    We report three cases of effortful echolalia in patients with cerebral infarction. The clinical picture of speech disturbance is associated with Type 1 Transcortical Motor Aphasia (TCMA; Goldstein, 1915). The patients always spoke nonfluently with loss of speech initiative, dysarthria, dysprosody, agrammatism, and increased effort, and were unable to repeat sentences longer than those containing four or six words. In conversation, they first repeated a few words spoken to them, and then produced self-initiated speech. The initial repetition as well as the subsequent self-initiated speech, which were realized equally laboriously, can be regarded as mitigated echolalia (Pick, 1924). They were always aware of their own echolalia and tried to control it, without effect. These cases demonstrate that neither the ability to repeat nor fluent speech is always necessary for echolalia. The possibility that a lesion in the left medial frontal lobe, including the supplementary motor area, plays an important role in effortful echolalia is discussed.

  6. Multiproduct Multiperiod Newsvendor Problem with Dynamic Market Efforts

    Directory of Open Access Journals (Sweden)

    Jianmai Shi

    2016-01-01

    Full Text Available We study a multiperiod multiproduct production planning problem where the production capacity and the marketing effort on demand are both considered. The accumulative impact of marketing effort on demand is captured by the Nerlove and Arrow (N-A) advertising model. The problem is formulated as a discrete-time, finite-horizon dynamic optimization problem, which can be viewed as an extension to the classic newsvendor problem by integrating with the N-A model. A Lagrangian relaxation based solution approach is developed to solve the problem, in which the subgradient algorithm is used to find an upper bound of the solution and a feasibility heuristic algorithm is proposed to search for a feasible lower bound. Twelve kinds of instances with different problem sizes involving up to 50 products and 15 planning periods are randomly generated and used to test the Lagrangian heuristic algorithm. Computational results show that the proposed approach can obtain near optimal solutions for all the instances in very short CPU time, which is less than 90 seconds even for the largest instance.
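
    The heart of such a Lagrangian heuristic is the subgradient update: relax the coupling constraints g(x) ≤ 0 with multipliers λ ≥ 0, solve the relaxed problem for a dual bound, then step λ along the constraint violations. A generic sketch under those assumptions (the paper's exact relaxation and feasibility heuristic are not reproduced):

```python
def subgradient_loop(solve_relaxed, violations, lam, steps=100, t0=1.0):
    """Generic subgradient method on the Lagrangian dual of a maximization
    problem. solve_relaxed(lam) returns (x, dual_bound); violations(x)
    returns the constraint violations g_i(x)."""
    best_bound = float("inf")
    for k in range(1, steps + 1):
        x, bound = solve_relaxed(lam)
        best_bound = min(best_bound, bound)  # tightest upper bound so far
        t = t0 / k                           # diminishing step size
        # step along the subgradient, then project back onto lam >= 0
        lam = [max(0.0, l + t * g) for l, g in zip(lam, violations(x))]
    return lam, best_bound

# Toy usage: maximize x1 + x2 subject to x1 + x2 <= 1, x binary (optimum = 1)
def solve_relaxed(lam):
    x = [1 if 1 - lam[0] > 0 else 0] * 2
    return x, sum((1 - lam[0]) * xi for xi in x) + lam[0]

lam, bound = subgradient_loop(solve_relaxed, lambda x: [sum(x) - 1], [0.0])
print(round(bound, 3))  # the dual bound closes in on the optimum, 1.0
```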

  7. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    Directory of Open Access Journals (Sweden)

    Andreas Gansäuer

    2013-08-01

    Full Text Available The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically.
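
    The quoted accuracy of about 0.5 kcal mol−1 in the free activation barrier translates into rate constants through the Eyring equation, k = (kB·T/h)·exp(−ΔG‡/RT). A quick check of what that deviation means at room temperature (generic formula, not code from the study):

```python
from math import exp

KB = 1.380649e-23    # Boltzmann constant [J/K]
H  = 6.62607015e-34  # Planck constant [J*s]
R  = 8.314462        # gas constant [J/(mol*K)]

def eyring_k(dG_kcal, T=298.15):
    """Rate constant [1/s] from the free activation barrier via Eyring."""
    dG = dG_kcal * 4184.0  # kcal/mol -> J/mol
    return (KB * T / H) * exp(-dG / (R * T))

# A 0.5 kcal/mol shift in the barrier changes k by ~2.3x at 298 K:
print(eyring_k(10.0) / eyring_k(10.5))  # ~2.33
```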

  8. Present state of the SOURCES computer code

    International Nuclear Information System (INIS)

    Shores, Erik F.

    2002-01-01

    In various stages of development for over two decades, the SOURCES computer code continues to calculate neutron production rates and spectra from four types of problems: homogeneous media, two-region interfaces, three-region interfaces and that of a monoenergetic alpha particle beam incident on a slab of target material. Graduate work at the University of Missouri - Rolla, in addition to user feedback from a tutorial course, provided the impetus for a variety of code improvements. Recently upgraded to version 4B, initial modifications to SOURCES focused on updates to the 'tape5' decay data library. Shortly thereafter, efforts focused on development of a graphical user interface for the code. This paper documents the Los Alamos SOURCES Tape1 Creator and Library Link (LASTCALL) and describes additional library modifications in more detail. Minor improvements and planned enhancements are discussed.

  9. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  10. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
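
    The "fill in the body of pre-defined model routines" pattern is a framework-callback design: the framework owns the parallel simulation loop and calls back into user code for the model-specific biology. A hypothetical Python analogy (Biocellion itself is a C++ framework; the routine names below are invented for illustration):

```python
from abc import ABC, abstractmethod

class MultiCellModel(ABC):
    """Hypothetical framework base class. The framework runs the (parallel)
    time-stepping loop; the modeler only fills in model-specific routines."""

    @abstractmethod
    def init_cells(self):
        """Return the initial collection of cell states."""

    @abstractmethod
    def update_cell(self, cell, neighbors, env):
        """Advance one cell's state given its neighbors and local environment."""

class CellSorting(MultiCellModel):
    """Sketch of a cell-sorting model plugged into the framework."""

    def init_cells(self):
        return [{"type": t, "adhesion": 1.0 if t == "A" else 0.5}
                for t in ("A", "B") * 4]

    def update_cell(self, cell, neighbors, env):
        # model-specific rule, e.g. differential adhesion between cell types
        like = sum(1 for n in neighbors if n["type"] == cell["type"])
        cell["adhesion"] += 0.01 * like
        return cell
```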

  11. Analysis of dismantling possibility and unloading efforts of fuel assemblies from core of WWER

    International Nuclear Information System (INIS)

    Danilov, V.; Dobrov, V.; Semishkin, V.; Vasilchenko, I.

    2006-01-01

    Computational methods for determining the optimal sequence for dismantling fuel assemblies (FA) from WWER cores after different operating periods and accident conditions are considered. The fuel dismantling sequence algorithms are constructed both on an analysis of the mutual spacer-grid overlaps of adjacent fuel assemblies and on a numerical structural analysis of the efforts required for FA removal as the FA is heaved from the core. Computation results for the core dismantling sequence after a 3-year operating period and a LB LOCA are presented in the paper.

  12. Brain and effort: brain activation and effort-related working memory in healthy participants and patients with working memory deficits

    Directory of Open Access Journals (Sweden)

    Maria Engström

    2013-04-01

    Full Text Available Despite the interest in the neuroimaging of working memory, little is still known about the neurobiology of complex working memory in tasks that require simultaneous manipulation and storage of information. In addition to the central executive network, we assumed that the recently described salience network (involving the anterior insular cortex and the anterior cingulate cortex) might be of particular importance to working memory tasks that require complex, effortful processing. Method: Healthy participants (n=26) and participants suffering from working memory problems related to the Kleine-Levin syndrome (a specific form of periodic idiopathic hypersomnia; n=18) participated in the study. Participants were further divided into a high and low capacity group, according to performance on a working memory task (listening span). In a functional Magnetic Resonance Imaging (fMRI) study, participants were administered the reading span complex working memory task tapping cognitive effort. Principal findings: The fMRI-derived blood oxygen level dependent (BOLD) signal was modulated by (1) effort in both the central executive and the salience network and (2) capacity in the salience network, in that high performers evidenced a weaker BOLD signal than low performers. In the salience network there was a dichotomy between the left and the right hemisphere; the right hemisphere elicited a steeper increase of the BOLD signal as a function of increasing effort. There was also a stronger functional connectivity within the central executive network because of increased task difficulty. Conclusion: The ability to allocate cognitive effort in complex working memory is contingent upon focused resources in the executive and in particular the salience network. Individual capacity during the complex working memory task is related to activity in the salience (but not the executive) network so that high-capacity participants evidence a lower signal and possibly hence a larger

  13. Teaching Computational Thinking: Deciding to Take Small Steps in a Curriculum

    Science.gov (United States)

    Madoff, R. D.; Putkonen, J.

    2016-12-01

    While computational thinking and reasoning are not necessarily the same as computer programming, programs such as MATLAB can provide the medium through which the logical and computational thinking at the foundation of science can be taught, learned, and experienced. And while math and computer anxiety are often discussed as critical obstacles to students' progress in the geoscience curriculum, it is here suggested that unfamiliarity with computational and logical reasoning poses the first stumbling block, in addition to the hurdle of expending the effort to learn how to translate a computational problem into the appropriate computer syntax in order to achieve the intended results. Because computational thinking is so vital for all fields, there is a need to introduce many students to it and to build support for it in the curriculum. This presentation focuses on elements to bring into the teaching of computational thinking that are intended as additions to learning MATLAB programming as a basic tool. Such elements include: highlighting a key concept; discussing a basic geoscience problem where the concept would show up; having the student draw or outline a sketch of what they think an operation needs to do in order to produce a desired result; and then finding the relevant syntax to work with. This iterative pedagogy simulates what someone with more experience in programming does, disclosing the thinking process inside the black box of a result. Intended only as a very early stage introduction, it would need to be followed by advanced applications as students go through an academic program. The objective would be to expose majors and non-majors to computational thinking and to alleviate some of the math and computer anxiety, so that students choose to advance with programming or modeling, whether or not it is built into a 4-year curriculum.

  14. Standards for digital computers used in non-safety nuclear power plant applications: objectives and limitations

    International Nuclear Information System (INIS)

    Rorer, D.C.; Long, A.B.

    1977-01-01

    There are currently a number of efforts to develop standards which would apply to digital computers used in nuclear power plants for functions other than those directly involving plant protection (for example, the ANS 4.3.3 Subworking Group in the U.S. and the IEC 45A/WGA1 Subcommittee in Europe). The impetus for this activity is discussed and generally attributed to the realization that non-safety system computers may affect the assumptions used as the design bases for safety systems, to the sizable economic loss which can result from the failure of a critical computer application, and to the lingering concern about the misapplication of a still-new technology. At the same time, it is pointed out that these standards may create additional obstacles to the use of this new technology which are not present in the application of more conventional instrumentation and control equipment. Much of the U.S. effort has been directed toward the problem of validating computer systems in which the potential exists for unplanned interactions between various functions in a multiprogram environment, using common hardware in a time-sharing mode. The goal is to develop procedures for the specification, development, implementation, and documentation of testable, modular systems which, in the absence of proven quantitative techniques for assessing software reliability, are felt to provide reasonable assurance that the computer system will function as planned.

  15. A community effort to protect genomic data sharing, collaboration and outsourcing.

    Science.gov (United States)

    Wang, Shuang; Jiang, Xiaoqian; Tang, Haixu; Wang, Xiaofeng; Bu, Diyue; Carey, Knox; Dyke, Stephanie Om; Fox, Dov; Jiang, Chao; Lauter, Kristin; Malin, Bradley; Sofia, Heidi; Telenti, Amalio; Wang, Lei; Wang, Wenhao; Ohno-Machado, Lucila

    2017-01-01

    The human genome can reveal sensitive information and is potentially re-identifiable, which raises privacy and security concerns about sharing such data on wide scales. In 2016, we organized the third Critical Assessment of Data Privacy and Protection competition as a community effort to bring together biomedical informaticists, computer privacy and security researchers, and scholars in ethical, legal, and social implications (ELSI) to assess the latest advances on privacy-preserving techniques for protecting human genomic data. Teams were asked to develop novel protection methods for emerging genome privacy challenges in three scenarios: Track 1, data sharing through the Beacon service of the Global Alliance for Genomics and Health; Track 2, collaborative discovery of similar genomes between two institutions; and Track 3, data outsourcing to public cloud services. The latter two tracks represent continuing themes from our 2015 competition, while the former was new and a response to a recently established vulnerability. The winning strategy for Track 1 mitigated the privacy risk by hiding approximately 11% of the variation in the database while permitting around 160,000 queries, a significant improvement over the baseline. The winning strategies in Tracks 2 and 3 showed significant progress over the previous competition by achieving multiple orders of magnitude performance improvement in terms of computational runtime and memory requirements. The outcomes suggest that applying highly optimized privacy-preserving and secure computation techniques to safeguard genomic data sharing and analysis is useful. However, the results also indicate that further efforts are needed to refine these techniques into practical solutions.

  16. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs
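
    Reversibility here means the computation is a bijection on its states, so no information (and, by Landauer's argument, in principle no heat) need be discarded. The Toffoli gate is the standard example: universal for Boolean logic, yet its own inverse. A small check, purely illustrative:

```python
def toffoli(a, b, c):
    """Toffoli (controlled-controlled-NOT) gate: flips c iff a and b are 1.
    A reversible gate that is universal for Boolean logic."""
    return a, b, c ^ (a & b)

# Applying the gate twice returns every input state: the map is a bijection.
states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
assert all(toffoli(*toffoli(a, b, c)) == (a, b, c) for a, b, c in states)

# AND can be computed reversibly: with c = 0, the third output is a AND b.
print(toffoli(1, 1, 0))  # (1, 1, 1)
```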

  17. Maximum effort in the minimum-effort game

    Czech Academy of Sciences Publication Activity Database

    Engelmann, Dirk; Normann, H.-T.

    2010-01-01

    Roč. 13, č. 3 (2010), s. 249-259 ISSN 1386-4157 Institutional research plan: CEZ:AV0Z70850503 Keywords : minimum-effort game * coordination game * experiments * social capital Subject RIV: AH - Economics Impact factor: 1.868, year: 2010

  18. Adverse health effects of high-effort/low-reward conditions.

    Science.gov (United States)

    Siegrist, J

    1996-01-01

    In addition to the person-environment fit model (J. R. French, R. D. Caplan, & R. V. Harrison, 1982) and the demand-control model (R. A. Karasek & T. Theorell, 1990), a third theoretical concept is proposed to assess adverse health effects of stressful experience at work: the effort-reward imbalance model. The focus of this model is on reciprocity of exchange in occupational life where high-cost/low-gain conditions are considered particularly stressful. Variables measuring low reward in terms of low status control (e.g., lack of promotion prospects, job insecurity) in association with high extrinsic (e.g., work pressure) or intrinsic (personal coping pattern, e.g., high need for control) effort independently predict new cardiovascular events in a prospective study on blue-collar men. Furthermore, these variables partly explain prevalence of cardiovascular risk factors (hypertension, atherogenic lipids) in 2 independent studies. Studying adverse health effects of high-effort/low-reward conditions seems well justified, especially in view of recent developments of the labor market.

  19. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  20. Achieving high performance in numerical computations on RISC workstations and parallel systems

    Energy Technology Data Exchange (ETDEWEB)

    Goedecker, S. [Max-Planck Inst. for Solid State Research, Stuttgart (Germany); Hoisie, A. [Los Alamos National Lab., NM (United States)

    1997-08-20

    The nominal peak speeds of both serial and parallel computers are rising rapidly. At the same time, however, it is becoming increasingly difficult to extract a significant fraction of this peak speed from modern computer architectures. In this tutorial the authors give scientists and engineers involved in numerically demanding calculations and simulations the necessary basic knowledge to write reasonably efficient programs. The basic principles are rather simple and the possible rewards large. Writing a program that takes into account optimization techniques related to the computer architecture can significantly speed up the program, often by factors of 10--100. As such, optimizing a program can, for instance, be a much better solution than buying a faster computer. If a few basic optimization principles are applied during program development, the additional time needed to obtain an efficient program is practically negligible. In-depth optimization is usually only needed for a few subroutines or kernels, and the effort involved is therefore also acceptable.
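
    One of the simplest architecture-related principles of this kind is traversing memory in the order it is laid out, so the cache is used effectively. A small demonstration with NumPy (assuming it is available; arrays are row-major by default), where the arithmetic is identical and only the traversal order differs:

```python
import time
import numpy as np

a = np.random.rand(5000, 5000)  # row-major (C-order) array

def sum_rowwise(a):
    total = 0.0
    for i in range(a.shape[0]):
        total += a[i, :].sum()  # contiguous, cache-friendly accesses
    return total

def sum_colwise(a):
    total = 0.0
    for j in range(a.shape[1]):
        total += a[:, j].sum()  # strided accesses across rows
    return total

for f in (sum_rowwise, sum_colwise):
    t0 = time.perf_counter()
    f(a)
    print(f.__name__, round(time.perf_counter() - t0, 3), "s")
# Row-wise traversal is typically several times faster for identical work.
```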

  1. PERSPECTIVES FOR FOG COMPUTING IN MANUFACTURING

    Directory of Open Access Journals (Sweden)

    Jakub PIZOŃ

    2016-09-01

    Full Text Available This article discusses ongoing efforts to enable the fog computing vision in manufacturing. As a new computing paradigm, fog computing faces many implementation challenges, which open the perspective of new applications within the field of manufacturing. It is expected that fog computing will be one of the factors accelerating the fourth industrial revolution. In this article we discuss the perspectives of manufacturing companies surrounded by new solutions of CPS, CPPS and CM in relation to fog computing.

  2. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
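
    The ranking idea borrowed from Information Retrieval can be as simple as TF-IDF weighting with cosine similarity over each model's annotations and meta-information. A minimal sketch using scikit-learn (an assumption for illustration; the model descriptions and identifiers below are invented, and the actual BioModels Database implementation is not shown):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-ins for model annotations and meta-information
models = {
    "MODEL_A": "glycolysis kinetic model yeast metabolism",
    "MODEL_B": "cell cycle oscillator cyclin cdk",
    "MODEL_C": "calcium oscillation signalling hepatocyte",
}

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(models.values())

def rank(query):
    """Return model ids sorted by TF-IDF cosine relevance to the query."""
    q = vectorizer.transform([query])
    scores = cosine_similarity(q, doc_matrix).ravel()
    return sorted(zip(models, scores), key=lambda pair: -pair[1])

print(rank("calcium signalling model"))  # MODEL_C ranks first
```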

  3. Neurodynamic evaluation of hearing aid features using EEG correlates of listening effort.

    Science.gov (United States)

    Bernarding, Corinna; Strauss, Daniel J; Hannemann, Ronny; Seidler, Harald; Corona-Strauss, Farah I

    2017-06-01

    In this study, we propose a novel estimate of listening effort using electroencephalographic data. This method is a translation of our past findings, gained from the evoked electroencephalographic activity, to the oscillatory EEG activity. To test this technique, electroencephalographic data from experienced hearing aid users with moderate hearing loss were recorded, wearing hearing aids. The investigated hearing aid settings were: a directional microphone combined with a noise reduction algorithm in a medium and a strong setting, the noise reduction setting turned off, and a setting using omnidirectional microphones without any noise reduction. The results suggest that the electroencephalographic estimate of listening effort seems to be a useful tool to map the exerted effort of the participants. In addition, the results indicate that a directional processing mode can reduce the listening effort in multitalker listening situations.

  4. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization efforts.

  5. DC Control Effort Minimized for Magnetic-Bearing-Supported Shaft

    Science.gov (United States)

    Brown, Gerald V.

    2001-01-01

    A magnetic-bearing-supported shaft may have a number of concentricity and alignment problems. One of these involves the relationship of the position sensors, the centerline of the backup bearings, and the magnetic center of the magnetic bearings. For magnetic bearings with permanent magnet biasing, the average control current for a given control axis that is not bearing the shaft weight will be minimized if the shaft is centered, on average over a revolution, at the magnetic center of the bearings. That position may not yield zero sensor output or center the shaft in the backup bearing clearance. The desired shaft position that gives zero average current can be achieved if a simple additional term is added to the control law. Suppose that the instantaneous control currents from each bearing are available from measurements and can be input into the control computer. If each control current is integrated with a very small rate of accumulation and the result is added to the control output, the shaft will gradually move to a position where the control current averages to zero over many revolutions. This will occur regardless of any offsets of the position sensor inputs. At that position, the average control effort is minimized in comparison to other possible locations of the shaft. Nonlinearities of the magnetic bearing are minimized at that location as well.
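
    Structurally, the "simple additional term" is a very slow integrator on the measured control current, added to the PD control output. A toy one-dimensional loop showing the idea (all gains and plant constants below are invented; only the extra term follows the description above):

```python
# Toy 1-D magnetic-bearing loop illustrating the zero-average-current trim.
ki, ks = 100.0, 4.0e4   # actuator gain [N/A]; destabilizing PM-bias stiffness [N/m]
kp, kd = 1.0e3, 5.0     # PD gains of the position loop [A/m], [A*s/m]
eps, dt = 5.0, 1e-4     # very small trim accumulation rate; time step [s]
x0 = 1.0e-4             # sensor offset: sensor zero sits 0.1 mm from magnetic center

x, v, z = 0.0, 0.0, 0.0  # position (0 = magnetic center), velocity, trim term
for _ in range(20_000):  # 2 s of simulated time
    i = -(kp * (x - x0) + kd * v) + z  # PD control current plus the trim term
    z += eps * i * dt                  # integrate the current at a very small rate
    a = ki * i + ks * x                # force: actuator current + negative stiffness
    v += a * dt
    x += v * dt

print(f"x = {x:.2e} m, i = {i:.2e} A, z = {z:.3f} A")
# The trim settles where the control current averages zero, parking the shaft
# at the magnetic center even though the position sensor reads an offset there.
```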

  6. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  7. Cognitive capacity limitations and Need for Cognition differentially predict reward-induced cognitive effort expenditure.

    Science.gov (United States)

    Sandra, Dasha A; Otto, A Ross

    2018-03-01

    While psychological, economic, and neuroscientific accounts of behavior broadly maintain that people minimize expenditure of cognitive effort, empirical work reveals how reward incentives can mobilize increased cognitive effort expenditure. Recent theories posit that the decision to expend effort is governed, in part, by a cost-benefit tradeoff whereby the potential benefits of mental effort can offset the perceived costs of effort exertion. Taking an individual differences approach, the present study examined whether one's executive function capacity, as measured by Stroop interference, predicts the extent to which reward incentives reduce switch costs in a task-switching paradigm, which indexes additional expenditure of cognitive effort. In accordance with the predictions of a cost-benefit account of effort, we found that a low executive function capacity-and, relatedly, a low intrinsic motivation to expend effort (measured by Need for Cognition)-predicted larger increase in cognitive effort expenditure in response to monetary reward incentives, while individuals with greater executive function capacity-and greater intrinsic motivation to expend effort-were less responsive to reward incentives. These findings suggest that an individual's cost-benefit tradeoff is constrained by the perceived costs of exerting cognitive effort. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Large Scale Computing and Storage Requirements for High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years

  9. Computational error and complexity in science and engineering computational error and complexity

    CERN Document Server

    Lakshmikantham, Vangipuram; Chui, Charles K; Chui, Charles K

    2005-01-01

    The book "Computational Error and Complexity in Science and Engineering” pervades all the science and engineering disciplines where computation occurs. Scientific and engineering computation happens to be the interface between the mathematical model/problem and the real world application. One needs to obtain good quality numerical values for any real-world implementation. Just mathematical quantities symbols are of no use to engineers/technologists. Computational complexity of the numerical method to solve the mathematical model, also computed along with the solution, on the other hand, will tell us how much computation/computational effort has been spent to achieve that quality of result. Anyone who wants the specified physical problem to be solved has every right to know the quality of the solution as well as the resources spent for the solution. The computed error as well as the complexity provide the scientific convincing answer to these questions. Specifically some of the disciplines in which the book w...

  10. A new generation in computing

    International Nuclear Information System (INIS)

    Kahn, R.E.

    1983-01-01

    The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics, artificial intelligence, and computer systems and architecture. Applications in industry, offices, aerospace, education, health care and retailing are outlined. An analysis is given of research efforts in the US, Japan, U.K., and Europe. Fifth-generation programming languages are detailed.

  11. Trust Trust Me (The Additivity)

    OpenAIRE

    Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    2017-01-01

    Part 4: Trust Metrics; International audience; We present a mathematical formulation of a trust metric using a quality and quantity pair. Under a certain assumption, we regard trust as an additive value and define the soundness of a trust computation as not to exceed the total sum. Moreover, we point out the importance of not only soundness of each computed trust but also the stability of the trust computation procedure against changes in trust value assignment. In this setting, we define tru...
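
    Concretely, the soundness condition sketched here says that an additively aggregated trust value must never exceed the total sum of the individual contributions. A toy illustration of just that inequality (the paper's quality-and-quantity pair formulation is not reproduced):

```python
def aggregate_trust(contributions):
    """Additive aggregation of trust contributions from disjoint sources."""
    return sum(contributions)

def is_sound(computed_trust, contributions):
    """Soundness in the additive sense: the trust a computation assigns
    must not exceed the total sum of the contributions it drew on."""
    return computed_trust <= aggregate_trust(contributions)

contributions = [0.3, 0.25, 0.1]
print(is_sound(0.60, contributions))  # True:  0.60 <= 0.65
print(is_sound(0.70, contributions))  # False: the evidence is over-counted
```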

  12. Respiratory effort from the photoplethysmogram.

    Science.gov (United States)

    Addison, Paul S

    2017-03-01

    The potential for a simple, non-invasive measure of respiratory effort based on the pulse oximeter signal - the photoplethysmogram or 'pleth' - was investigated in a pilot study. Several parameters were developed based on a variety of manifestations of respiratory effort in the signal, including modulation changes in amplitude, baseline, frequency and pulse transit times, as well as distinct baseline signal shifts. Thirteen candidate parameters were investigated using data from healthy volunteers. Each volunteer underwent a series of controlled respiratory effort maneuvers at various set flow resistances and respiratory rates. Six oximeter probes were tested at various body sites. In all, over three thousand pleth-based effort-airway pressure (EP) curves were generated across the various airway constrictions, respiratory efforts, respiratory rates, subjects, probe sites, and the candidate parameters considered. Regression analysis was performed to determine the existence of positive monotonic relationships between the respiratory effort parameters and resulting airway pressures. Six of the candidate parameters investigated exhibited a distinct positive relationship (p < 0.05). These included two pulse-transit-time-based parameters, one computed between a pulse oximeter probe and an ECG (P2E-Effort) and the other using two pulse oximeter probes placed at different peripheral body sites (P2-Effort), and baseline shifts in heart rate (BL-HR-Effort). In conclusion, a clear monotonic relationship was found between several pleth-based parameters and imposed respiratory loadings at the mouth across a range of respiratory rates and flow constrictions. The results suggest that the pleth may provide a measure of changing upper airway dynamics indicative of the effort to breathe. Copyright © 2017 The Author. Published by Elsevier Ltd. All rights reserved.

  13. [Effort-reward imbalance at work and depression: current research evidence].

    Science.gov (United States)

    Siegrist, J

    2013-01-01

    In view of highly prevalent stressful conditions in modern working life, in particular increasing work pressure and job insecurity, it is of interest to know whether specific constellations of an adverse psychosocial work environment increase the risk of depressive disorder among employed people. This contribution gives a short overview of current research evidence based on an internationally established work stress model of effort-reward imbalance. Taken together, results from seven prospective epidemiological investigations demonstrate a two-fold elevated relative risk of incident depressive disorder over a mean observation period of 2.7 years among exposed versus non-exposed employees. Additional findings from experimental and quasi-experimental studies point to robust associations of effort-reward imbalance at work with proinflammatory cytokines and markers of reduced immune competence. These latter markers may indicate potential psychobiological pathways. In conclusion, incorporating this new knowledge into medical treatment and preventive efforts seems well justified.

  14. The influence of the speed of the downward leader channel on the computation of the additional charge for protection against direct lightning strikes by a charge transfer system in 'ultra-corona' mode

    International Nuclear Information System (INIS)

    Talevski, V.

    2012-01-01

    In this paper, the additional charge needed for protection against a direct lightning strike by a charge transfer system with a point electrode operating in 'ultra-corona' mode is computed. The influence of the voltage increase over a very small time interval is computed and taken into consideration in the computation of the additional space charge on the object used for protection. The model of the electrical thundercloud is taken into consideration with all the electrical charges in it at their corresponding heights above ground. Plots are presented of the speed of the downward leader from the cloud versus the additional space charge that needs to be placed on top of the object protected against direct lightning. Plots are also presented of the additional space charge versus different horizontal distances and heights of the protected object. (Authors)

  15. [Trends in the utilization of food additives].

    Science.gov (United States)

    Szűcs, Viktória; Bánáti, Diána

    2013-11-17

    Frequent media reports on food additives have weakened consumers' trust in food producers and food control authorities alike. Consumers' uncertainty is further raised by the fact that they obtain their information from inadequate, untrustworthy sources and may therefore avoid the consumption of certain foodstuffs. While food producers may react by replacing artificial components with natural ones, they try to emphasize the favourable characteristics of their products. The authors describe the main trends and efforts related to food additives. On the basis of this overview it can be concluded that (besides taking consumers' needs into consideration) product development and research directions are promising. Food producers' efforts may help to restore consumer confidence and trust, and may help consumers make informed choices.

  16. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial, small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial, large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  17. Cognitive effort: A neuroeconomic approach

    Science.gov (United States)

    Braver, Todd S.

    2015-01-01

    Cognitive effort has been implicated in numerous theories regarding normal and aberrant behavior and the physiological response to engagement with demanding tasks. Yet, despite broad interest, no unifying, operational definition of cognitive effort itself has been proposed. Here, we argue that the most intuitive and epistemologically valuable treatment is in terms of effort-based decision-making, and advocate a neuroeconomics-focused research strategy. We first outline psychological and neuroscientific theories of cognitive effort. Then we describe the benefits of a neuroeconomic research strategy, highlighting how it affords greater inferential traction than do traditional markers of cognitive effort, including self-reports and physiologic markers of autonomic arousal. Finally, we sketch a future series of studies that can leverage the full potential of the neuroeconomic approach toward understanding the cognitive and neural mechanisms that give rise to phenomenal, subjective cognitive effort. PMID:25673005

  18. Artificial Neural Networks for Reducing Computational Effort in Active Truncated Model Testing of Mooring Lines

    DEFF Research Database (Denmark)

    Christiansen, Niels Hørbye; Voie, Per Erlend Torbergsen; Høgsberg, Jan Becker

    2015-01-01

    simultaneously, this method is very demanding in terms of numerical efficiency and computational power. Therefore, this method has not yet proved to be feasible. It has recently been shown how a hybrid method combining classical numerical models and artificial neural networks (ANN) can provide a dramatic...... prior to the experiment and with a properly trained ANN it is no problem to obtain accurate simulations much faster than real time-without any need for large computational capacity. The present study demonstrates how this hybrid method can be applied to the active truncated experiments yielding a system...

  19. Physics, Computer Science and Mathematics Division. Annual report, 1 January-31 December 1979

    International Nuclear Information System (INIS)

    Lepore, J.V.

    1980-09-01

    This annual report describes the research work carried out by the Physics, Computer Science and Mathematics Division during 1979. The major research effort of the Division remained High Energy Particle Physics with emphasis on preparing for experiments to be carried out at PEP. The largest effort in this field was the development and construction of the Time Projection Chamber, a powerful new particle detector. This work took a large fraction of the effort of the physics staff of the Division, together with the equivalent of more than a hundred staff members in the Engineering Departments and shops. Research in the Computer Science and Mathematics Department of the Division (CSAM) has been expanding rapidly during the last few years. Cross-fertilization of ideas and talents resulting from the diversity of effort in the Physics, Computer Science and Mathematics Division contributed to the software design for the Time Projection Chamber, made by the Computer Science and Applied Mathematics Department.

  20. Software enhancements and modifications to Program FDTD executable on the Cray X-MP computer

    Energy Technology Data Exchange (ETDEWEB)

    Stringer, J.C.

    1987-09-04

    This report summarizes enhancements and modifications to PROGRAM FDTD executable on the Cray X-MP computer system. Specifically, the tasks defined and performed under this effort were: revision of the material encoding/decoding scheme to allow material type specification on an individual cell basis; modification of the I/O buffering scheme to maximize the use of available central memory and minimize the number of physical I/O accesses; user interface enhancements, providing enhanced input/output features for greater flexibility; increased modularity, dividing the code into additional modules for ease of maintenance and future enhancements; and assistance in the conversion and testing of FDTD on Floating Point Systems scientific computers and associated peripheral devices.
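
    The FDTD (finite-difference time-domain) method the program implements advances electric and magnetic fields in a leapfrog scheme over a grid of cells, which is why per-cell material specification matters. As background only, a minimal one-dimensional Python sketch (not the report's Cray code; the grid size, material region, and source are illustrative):

        import numpy as np

        nz, nt = 200, 500
        ez = np.zeros(nz)          # electric field at cell edges
        hy = np.zeros(nz - 1)      # magnetic field between cells
        eps = np.ones(nz)          # per-cell relative permittivity:
        eps[120:160] = 4.0         # material type set on an individual cell basis

        for t in range(nt):
            hy += 0.5 * (ez[1:] - ez[:-1])                    # normalized units, Courant number 0.5
            ez[1:-1] += (0.5 / eps[1:-1]) * (hy[1:] - hy[:-1])
            ez[20] += np.exp(-((t - 30) / 8.0) ** 2)          # soft Gaussian source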

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  4. Adsorption of molecular additive onto lead halide perovskite surfaces: A computational study on Lewis base thiophene additive passivation

    Science.gov (United States)

    Zhang, Lei; Yu, Fengxi; Chen, Lihong; Li, Jingfa

    2018-06-01

    Organic additives, such as the Lewis base thiophene, have been successfully applied to passivate halide perovskite surfaces, improving the stability and properties of perovskite devices based on CH3NH3PbI3. Yet the detailed nanostructure of the perovskite surface passivated by additives, and the mechanisms of such passivation, are not well understood. This study presents a nanoscopic view of the interfacial structure of an additive/perovskite interface, consisting of a Lewis base thiophene molecular additive on a lead halide perovskite substrate, providing insight into the mechanisms by which molecular additives passivate halide perovskite surfaces and enhance the performance of perovskite-based devices. A molecular dynamics study of the interactions between water molecules and the perovskite surfaces passivated by the investigated additive reveals the effectiveness of employing molecular additives to improve the stability of halide perovskite materials. The additive/perovskite surface system is further probed by molecularly engineering the perovskite surfaces. This study reveals the nanoscopic structure-property relationships of halide perovskite surfaces passivated by molecular additives, which aids the fundamental understanding of surface/interface engineering strategies for the development of halide perovskite based devices.

  5. Effort rights-based management

    DEFF Research Database (Denmark)

    Squires, Dale; Maunder, Mark; Allen, Robin

    2017-01-01

    Effort rights-based fisheries management (RBM) is less widely used than catch rights, whether for groups or individuals. Because RBM on catch or effort necessarily requires a total allowable catch (TAC) or total allowable effort (TAE), RBM is discussed in conjunction with issues in assessing fish...... populations and providing TACs or TAEs. Both approaches have advantages and disadvantages, and there are trade-offs between the two approaches. In a narrow economic sense, catch rights are superior because of the type of incentives created, but once the costs of research to improve stock assessments...

  6. Applications of parallel computer architectures to the real-time simulation of nuclear power systems

    International Nuclear Information System (INIS)

    Doster, J.M.; Sills, E.D.

    1988-01-01

    In this paper the authors report on efforts to utilize parallel computer architectures for the thermal-hydraulic simulation of nuclear power systems and on current research toward the development of advanced reactor operator aids and control systems based on this new technology. Many aspects of reactor thermal-hydraulic calculations are inherently parallel, and the computationally intensive portions of these calculations can be implemented effectively on modern parallel computers. Timing studies indicate that faster-than-real-time, high-fidelity physics models can be developed when the computational algorithms are designed to take advantage of the computer's architecture. These capabilities allow for the development of novel control systems and advanced reactor operator aids. Coupled with an integral real-time data acquisition system, evolving parallel computer architectures can provide operators and control room designers with improved control and protection capabilities. Research efforts in this area are currently under way.
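
    As a flavor of why such calculations parallelize well, here is a minimal Python/NumPy sketch (an explicit finite-difference update of a one-dimensional heat conduction problem; the parameters are illustrative, and this is not the authors' simulator) in which every interior cell updates independently of the others:

        import numpy as np

        alpha, dx, dt = 1.0e-5, 0.01, 1.0   # diffusivity (m^2/s), grid step (m), time step (s)
        r = alpha * dt / dx**2              # explicit scheme is stable for r <= 0.5
        T = np.full(1000, 300.0)            # temperatures along a 1-D channel (K)
        T[0] = T[-1] = 350.0                # fixed boundary temperatures

        for _ in range(10000):
            # each interior cell depends only on its neighbors from the previous
            # step, so the update can be vectorized or split across processors
            T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])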

  7. Intruder dose pathway analysis for the onsite disposal of radioactive wastes: the ONSITE/MAXI1 computer program. Supplement No. 1

    International Nuclear Information System (INIS)

    Kennedy, W.E. Jr.; Peloquin, R.A.; Napier, B.A.; Neuder, S.M.

    1986-05-01

    The document entitled Intruder Dose Pathway Analysis of the Onsite Disposal of Radioactive Wastes: The ONSITE/MAXI1 Computer Program (1984) summarizes initial efforts to develop human-intrusion scenarios and a modified version of the MAXI computer program for potential use by the NRC in reviewing applications for onsite radioactive waste disposal. The present document supplements that report and summarizes efforts to further modify and improve the ONSITE/MAXI1 software package; to facilitate cross-referencing, it follows the same format. Notable improvements to the software package include the capability to account for shielding conditions that represent noncompacted trash wastes and the option to indicate alternative land-use conditions. This supplement contains a description of the implementation of these modifications. In addition, a series of discussions is included to increase the user's understanding of the scenarios and dose calculation methods; these discussions respond to frequently asked questions about the mathematical models and use of the software. Computer listings of the ONSITE/MAXI1 computer program are included as Appendices A and B of this document. Appendix C lists external exposure dose-rate factor libraries.

  8. Shor's factoring algorithm and modern cryptography. An illustration of the capabilities inherent in quantum computers

    Science.gov (United States)

    Gerjuoy, Edward

    2005-06-01

    The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
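
    As a rough illustration of the number-theoretic reduction the article explains - Shor's quantum speedup replaces the order-finding step, while everything else is classical - here is a Python sketch that factors a small N by brute-force order finding (illustrative only; the brute-force loop is exponential in the size of N, which is exactly the part a quantum computer accelerates):

        import math
        import random

        def order(a, n):
            # multiplicative order of a modulo n: smallest r with a**r = 1 (mod n);
            # the quantum period-finding subroutine replaces this brute-force loop
            r, x = 1, a % n
            while x != 1:
                x = (x * a) % n
                r += 1
            return r

        def shor_classical(n):
            while True:                        # several runs may be needed, as the paper notes
                a = random.randrange(2, n)
                g = math.gcd(a, n)
                if g > 1:
                    return g                   # lucky draw: a already shares a factor with n
                r = order(a, n)
                if r % 2 == 0:
                    y = pow(a, r // 2, n)
                    if y != n - 1:             # skip the trivial case a**(r/2) = -1 (mod n)
                        f = math.gcd(y - 1, n)
                        if 1 < f < n:
                            return f

        print(shor_classical(15))              # prints 3 or 5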

  9. The influence of music on mental effort and driving performance.

    Science.gov (United States)

    Ünal, Ayça Berfu; Steg, Linda; Epstude, Kai

    2012-09-01

    The current research examined the influence of loud music on driving performance, and whether mental effort mediated this effect. Participants (N=69) drove in a driving simulator either with or without listening to music. In order to test whether music would have similar effects on driving performance in different situations, we manipulated the simulated traffic environment such that the driving context consisted of both complex and monotonous driving situations. In addition, we systematically kept track of drivers' mental load by having the participants verbally report their mental effort at certain moments while driving. We found that listening to music increased mental effort while driving, irrespective of whether the driving situation was complex or monotonous, supporting the general assumption that music can be a distracting auditory stimulus while driving. However, drivers who listened to music performed as well as drivers who did not, indicating that music did not impair their driving performance. Importantly, the increases in mental effort while listening to music indicate that drivers try to regulate their mental effort as a cognitive compensatory strategy to deal with task demands. Interestingly, we observed significant improvements in driving performance in two of the driving situations. Mental effort might thus mediate the effect of music on driving performance in situations requiring sustained attention. Other process variables, such as arousal and boredom, should also be incorporated into study designs in order to reveal more about how music affects driving. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Computation of Unsteady Flow in Flame Trench For Prediction of Ignition Overpressure Waves

    Science.gov (United States)

    Kwak, Dochan; Kris, Cetin

    2010-01-01

    Computational processes and issues in supporting mission tasks are discussed using an example from launch environment simulation. The entire CFD process is described using an existing code. STS-124 conditions were revisited to support the wall repair effort for the STS-125 flight; when water bags were not included, the computed results indicate that the IOP waves with the peak values were reflected from the SRB's own exhaust hole. The ARES-1X simulations show that there is a shock wave passing through the unused exhaust hole, although it plays a secondary role. All three ARES-1X cases and the STS-1 simulations showed very similar IOP magnitudes and patterns on the vehicle. With the addition of water bags and water injection, the IOP effects are expected to diminish further.

  11. Effort-Based Decision Making: A Novel Approach for Assessing Motivation in Schizophrenia.

    Science.gov (United States)

    Green, Michael F; Horan, William P; Barch, Deanna M; Gold, James M

    2015-09-01

    Because negative symptoms, including motivational deficits, are a critical unmet need in schizophrenia, there are many ongoing efforts to develop new pharmacological and psychosocial interventions for these impairments. A common challenge of these studies involves how to evaluate and select optimal endpoints. Currently, all studies of negative symptoms in schizophrenia depend on ratings from clinician-conducted interviews. Effort-based decision-making tasks may provide a more objective, and perhaps more sensitive, endpoint for trials of motivational negative symptoms. These tasks assess how much effort a person is willing to exert for a given level of reward. This area has been well-studied with animal models of effort and motivation, and effort-based decision-making tasks have been adapted for use in humans. Very recently, several studies have examined physical and cognitive types of effort-based decision-making tasks in cross-sectional studies of schizophrenia, providing evidence for effort-related impairment in this illness. This article covers the theoretical background on effort-based decision-making tasks to provide a context for the subsequent articles in this theme section. In addition, we review the existing literature of studies using these tasks in schizophrenia, consider some practical challenges in adapting them for use in clinical trials in schizophrenia, and discuss interpretive challenges that are central to these types of tasks. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  12. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and to develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, and perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multimodal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  13. 2016 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Jim [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  14. Stakeholder interactions to support service creation in cloud computing

    NARCIS (Netherlands)

    Wang, Lei; Ferreira Pires, Luis; Wombacher, Andreas; van Sinderen, Marten J.; Chi, Chihung

    2010-01-01

    Cloud computing is already a major trend in IT. Cloud services are being offered at application (software), platform and infrastructure levels. This paper presents our initial modeling efforts towards service creation at the infrastructure level. The purpose of these modeling efforts is to

  15. Missed deadline notification in best-effort schedulers

    Science.gov (United States)

    Banachowski, Scott A.; Wu, Joel; Brandt, Scott A.

    2003-12-01

    It is common to run multimedia and other periodic, soft real-time applications on general-purpose computer systems. These systems use best-effort scheduling algorithms that cannot guarantee applications will receive responsive scheduling to meet deadline or timing requirements. We present a simple mechanism called Missed Deadline Notification (MDN) that allows applications to notify the system when they do not receive their desired level of responsiveness. Consisting of a single system call with no arguments, this simple interface allows the operating system to provide better support for soft real-time applications without any a priori information about their timing or resource needs. We implemented MDN in three different schedulers: Linux, BEST, and BeRate. We describe these implementations and their performance when running real-time applications and discuss policies to prevent applications from abusing MDN to gain extra resources.
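
    As a minimal sketch of the usage pattern MDN enables - the real mechanism is a kernel-level system call implemented in the authors' modified Linux, BEST, and BeRate schedulers, so the notify_missed_deadline() wrapper below is a hypothetical stand-in invented for illustration - a periodic soft real-time task might look like this in Python:

        import time

        def notify_missed_deadline():
            # stand-in for the argument-less MDN system call; no arguments are
            # needed because the kernel already knows which task is calling
            print("MDN: telling the scheduler this task missed its deadline")

        PERIOD = 1.0 / 30.0                      # a 30 Hz soft real-time task
        deadline = time.monotonic() + PERIOD

        for frame in range(300):
            time.sleep(0.02)                     # placeholder per-frame workload
            now = time.monotonic()
            if now > deadline:
                notify_missed_deadline()         # ask for more responsive scheduling
            else:
                time.sleep(deadline - now)       # wait out the rest of the period
            deadline += PERIOD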

  16. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    Science.gov (United States)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. NASA science data is not restricted to NASA users; it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. NASA science computing is therefore a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and possibly underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been used extensively by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational workloads across NASA, along with the associated maturity that such use would bring. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  17. Stimulus-response compatibility and affective computing: A review

    NARCIS (Netherlands)

    Lemmens, P.M.C.; Haan, A. de; Galen, G.P. van; Meulenbroek, R.G.J.

    2007-01-01

    Affective computing, a human–factors effort to investigate the merits of emotions while people are working with human–computer interfaces, is gaining momentum. Measures to quantify affect (or its influences) range from EEG, to measurements of autonomic–nervous–system responses (e.g., heart rate,

  18. Application of computers in a Radiological Survey Program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    A brief description of some of the applications of computers in a radiological survey program is presented. It has been our experience that computers and computer software have allowed our staff to use their time more productively by having computers perform the mechanical acquisition, analysis, and storage of data. It is hoped that other organizations may similarly profit from this experience. This effort will ultimately minimize errors and reduce program costs.

  19. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts.

    Science.gov (United States)

    Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M

    2012-03-09

    A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable; further, additional requirements emerge as a result of such analysis. During such an analysis, studying examples of existing software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated requirements related to production-system robustness and functionality from publications in the business workflow domain, and we draw on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by grouping them into the categories used in the review of Isern and Moreno 2008. We cite previous work under each category, provide sub-requirements under each category, and give examples of similar software-engineering efforts that have addressed a comparable problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.

  20. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  1. The Computer Backgrounds of Soldiers in Army Units: FY01

    National Research Council Canada - National Science Library

    Singh, Harnam

    2002-01-01

    A multi-year research effort was instituted in FY99 to examine soldiers' experiences with computers, self-perceptions of their computer skill, and their ability to identify frequently used, Windows-based icons...

  2. Designing with computers at Lawrence Berkeley Laboratory

    International Nuclear Information System (INIS)

    Colonas, J.S.

    1974-10-01

    The application of digital computers to the solution of engineering problems relating to accelerator design was explored. The existing computer hardware and software available for direct communication between the engineer and the computer are described, and some examples of useful programs are outlined, showing the ease of their use and the method of communication between machine and designer. An effort is made to convince engineers that they can communicate with the computer in ordinary English and mathematics, rather than in intermediate artificial languages. (U.S.)

  3. Office workers' computer use patterns are associated with workplace stressors.

    Science.gov (United States)

    Eijckelhof, Belinda H W; Huysmans, Maaike A; Blatter, Birgitte M; Leider, Priscilla C; Johnson, Peter W; van Dieën, Jaap H; Dennerlein, Jack T; van der Beek, Allard J

    2014-11-01

    This field study examined associations between workplace stressors and office workers' computer use patterns. We collected keyboard and mouse activities of 93 office workers (68F, 25M) for approximately two work weeks. Linear regression analyses examined the associations between self-reported effort, reward, overcommitment, and perceived stress and software-recorded computer use duration, number of short and long computer breaks, and pace of input device usage. Daily duration of computer use was, on average, 30 min longer for workers with high compared to low levels of overcommitment and perceived stress. The number of short computer breaks (30 s-5 min long) was approximately 20% lower for those with high compared to low effort and for those with low compared to high reward. These outcomes support the hypothesis that office workers' computer use patterns vary across individuals with different levels of workplace stressors. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
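
    As a minimal sketch of how such usage metrics can be derived - assuming input-event timestamps (in seconds) recorded by the logging software; the data layout and threshold handling are illustrative, with the 30 s to 5 min short-break window taken from the paper:

        def usage_stats(event_times, short_break=(30, 300)):
            """Computer-use duration (h) and short-break count from event timestamps."""
            duration, short_breaks = 0.0, 0
            for prev, cur in zip(event_times, event_times[1:]):
                gap = cur - prev
                if gap < short_break[0]:
                    duration += gap          # pauses under 30 s count as continuous use
                elif gap <= short_break[1]:
                    short_breaks += 1        # a short computer break (30 s - 5 min)
            return duration / 3600.0, short_breaks

        hours, breaks = usage_stats([0, 10, 20, 140, 150, 6000, 6010])
        print(f"{hours:.3f} h of use, {breaks} short break(s)")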

  4. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and where they differ from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic, and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that would greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. That said, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale required for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
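
    The Parallel Particle Data Model's API is not given in the abstract, but the core difficulty it addresses - decomposing entities that have no spatial organizing principle - can be sketched minimally in Python (the hash-based assignment and per-entity update below are illustrative assumptions, not Sandia's implementation):

        from multiprocessing import Pool

        def step_entity(entity):
            ident, state = entity
            return (ident, state + 1)          # placeholder per-entity update rule

        def step_shard(shard):
            return [step_entity(e) for e in shard]

        def partition(entities, n_workers):
            # no spatial locality to exploit, so assign owners by hashing the id
            shards = [[] for _ in range(n_workers)]
            for e in entities:
                shards[hash(e[0]) % n_workers].append(e)
            return shards

        if __name__ == "__main__":
            entities = [("agent%d" % i, 0) for i in range(10000)]
            with Pool(4) as pool:
                results = pool.map(step_shard, partition(entities, 4))
            print(sum(len(s) for s in results))   # all 10000 entities stepped once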

  5. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    Science.gov (United States)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize the various approaches taken to create models, and to identify research gaps. Later sections of the report summarize implications for closed-loop control of the process, implications for local research efforts, and implications for local modeling efforts.

  6. Effort-Reward Imbalance and Burnout Among ICU Nursing Staff: A Cross-Sectional Study.

    Science.gov (United States)

    Padilla Fortunatti, Cristobal; Palmeiro-Silva, Yasna K

    Occupational stress is commonly observed among staff in intensive care units (ICUs). Sociodemographic, organizational, and job-related factors may lead to burnout among ICU health workers. In addition, these factors could modify the balance between efforts done and rewards perceived by workers; consequently, this imbalance could increase levels of emotional exhaustion and depersonalization and decrease a sense of personal accomplishment. The purpose of this study was to analyze the relationship between effort-reward imbalance and burnout dimensions (emotional exhaustion, depersonalization, and personal accomplishment) among ICU nursing staff in a university hospital in Santiago, Chile. A convenience sample of 36 registered nurses and 46 nurse aides answered the Maslach Burnout Inventory and Effort-Reward Imbalance Questionnaire and provided sociodemographic and work-related data. Age and effort-reward imbalance were significantly associated with emotional exhaustion in both registered nurses and nurse aides; age was negatively correlated with emotional exhaustion, whereas effort-reward imbalance was positively correlated. Age was negatively associated with depersonalization. None of the predictors were associated with personal accomplishment. This study adds valuable information about relationships of sociodemographic factors and effort-reward imbalance and their impact on dimensions of burnout, particularly on emotional exhaustion.

  7. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Progress report, July 1993--August 1994

    International Nuclear Information System (INIS)

    Dragt, A.J.; Gluckstern, R.L.

    1994-08-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group has been carrying out long-term research work in the general area of Dynamical Systems with a particular emphasis on applications to Accelerator Physics. This work is broadly divided into two tasks: the computation of charged particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Each of these tasks is described briefly. Work is devoted both to the development of new methods and the application of these methods to problems of current interest in accelerator physics, including the theoretical performance of present and proposed high energy machines. In addition to its research effort, the Dynamical Systems and Accelerator Theory Group is actively engaged in the education of students and postdoctoral research associates. Substantial progress in research has been made during the past year. These achievements are summarized in the following report.

  8. Lattice gauge theories on a hypercube computer

    International Nuclear Information System (INIS)

    Otto, S.W.

    1984-01-01

    A report is given on the parallel computer effort under way at Caltech and the use of these machines for lattice gauge theories. The computational requirements of the Monte Carlo calculations are, of course, enormous, so high Mflops (millions of floating point operations per second) and large memories are required. Various calculations performed on the machines are discussed with regard to their programmability (a non-trivial issue on a parallel computer) and their efficiency in using the machine.

  9. Dopamine and Effort-Based Decision Making

    Directory of Open Access Journals (Sweden)

    Irma Triasih Kurniawan

    2011-06-01

    Motivational theories of choice focus on the influence of goal values and strength of reinforcement to explain behavior. By contrast, relatively little is known about how the cost of an action, such as the effort expended, contributes to a decision to act. Effort-based decision making addresses how we make an action choice based on an integration of action and goal values. Here we review behavioral and neurobiological data regarding the representation of effort as action cost, and how this impacts decision making. Although organisms expend effort to obtain a desired reward, there is a striking sensitivity to the amount of effort required, such that the net preference for an action decreases as effort cost increases. We discuss the contribution of the neurotransmitter dopamine (DA) towards overcoming response costs and enhancing an animal's motivation towards effortful actions. We also consider the contribution of brain structures, including the basal ganglia (BG) and anterior cingulate cortex (ACC), to the internal generation of action involving a translation of reward expectation into effortful action.
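
    A minimal sketch of the standard computational idea behind such findings - net action value as reward minus an effort cost, with choice passed through a softmax; the linear cost form, the parameter values, and the reading of higher k as a low-dopamine state are illustrative modeling assumptions, not claims from the review:

        import math

        def net_value(reward, effort, k):
            return reward - k * effort           # linear effort discounting

        def p_choose_hard(r_hard, e_hard, r_easy, e_easy, k, beta=1.0):
            # softmax choice between a high-effort/high-reward option and an easy one
            v_hard = net_value(r_hard, e_hard, k)
            v_easy = net_value(r_easy, e_easy, k)
            return 1.0 / (1.0 + math.exp(-beta * (v_hard - v_easy)))

        for k in (0.2, 0.5, 1.0):                # higher k: effort weighs more heavily
            print(k, round(p_choose_hard(4.0, 5.0, 1.0, 1.0, k), 2))
        # preference for the effortful option falls as the effort cost k rises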

  10. Medial Orbitofrontal Cortex Mediates Effort-related Responding in Rats.

    Science.gov (United States)

    Münster, Alexandra; Hauber, Wolfgang

    2017-11-17

    The medial orbitofrontal cortex (mOFC) is known to support flexible control of goal-directed behavior. However, limited evidence suggests that the mOFC also mediates the ability of organisms to work with vigor towards a selected goal, a hypothesis that received little consideration to date. Here we show that excitotoxic mOFC lesion increased responding under a progressive ratio (PR) schedule of reinforcement, that is, the highest ratio achieved, and increased the preference for the high effort-high reward option in an effort-related decision-making task, but left intact outcome-selective Pavlovian-instrumental transfer and outcome-specific devaluation. Moreover, pharmacological inhibition of the mOFC increased, while pharmacological stimulation reduced PR responding. In addition, pharmacological mOFC stimulation attenuated methylphenidate-induced increase of PR responding. Intact rats tested for PR responding displayed higher numbers of c-Fos positive mOFC neurons than appropriate controls; however, mOFC neurons projecting to the nucleus accumbens did not show a selective increase in neuronal activation implying that they may not play a major role in regulating PR responding. Collectively, these results suggest that the mOFC plays a major role in mediating effort-related motivational functions. Moreover, our data demonstrate for the first time that the mOFC modulates effort-related effects of psychostimulant drugs. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
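
    A minimal sketch of the progressive ratio logic behind the paper's main endpoint - the response requirement grows with each reward earned, and the 'highest ratio achieved' is the breakpoint; the exponential schedule and its parameters below are common illustrative choices, not necessarily the study's:

        def pr_requirements(n, base=5, growth=1.35):
            # e.g. 5, 7, 9, 12, 17, ... responses per successive reward
            return [round(base * growth ** i) for i in range(n)]

        def breakpoint(requirements, responses_available):
            # highest ratio completed before responding runs out
            completed = 0
            for req in requirements:
                if responses_available < req:
                    break
                responses_available -= req
                completed = req
            return completed

        reqs = pr_requirements(15)
        print(reqs[:6], "-> breakpoint:", breakpoint(reqs, 200))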

  11. Usefulness and Enjoyment of Using Computers in Learning: A ...

    African Journals Online (AJOL)

    Different educational efforts have been employed to minimize or eliminate gender differences by using various learning means. One of these efforts has been the use of computers in classrooms instruction because of the nature of qualities these facilities possess. This article analyzes students' gender attitudes towards ...

  12. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Kothe, Douglas B [ORNL; Nam, Hai Ah [ORNL

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  13. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    International Nuclear Information System (INIS)

    Joubert, Wayne; Kothe, Douglas B.; Nam, Hai Ah

    2009-01-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  14. Computer Science Lesson Study: Building Computing Skills among Elementary School Teachers

    Science.gov (United States)

    Newman, Thomas R.

    2017-01-01

    The lack of diversity in the technology workforce in the United States has proven to be a stubborn problem, resisting even the most well-funded reform efforts. With the absence of computer science education in the mainstream K-12 curriculum, only a narrow band of students in public schools go on to careers in technology. The problem persists…

  15. Small Computer Applications for Base Supply.

    Science.gov (United States)

    1984-03-01

    research on small computer utilization at base-level organizations. This research effort studies whether small computers and commercial software can assist...

  16. Active Computer Network Defense: An Assessment

    Science.gov (United States)

    2001-04-01

    sufficient base of knowledge in information technology can be assumed to be working on some form of computer network warfare, even if only defensive in...the Defense Information Infrastructure (DII) to attack. Transmission Control Protocol/Internet Protocol (TCP/IP) networks are inherently resistant to...aims to create this part of information superiority, and computer network defense is one of its fundamental components. Most of these efforts center

  17. Food packaging cues influence taste perception and increase effort provision for a recommended snack product in children

    Directory of Open Access Journals (Sweden)

    Laura Enax

    2015-07-01

    Food marketing research shows that child-directed marketing cues have pronounced effects on food preferences and consumption, but are most often placed on products with low nutritional quality. Effects of child-directed marketing strategies for healthy food products remain to be studied in more detail. Previous research suggests that effort provision explains additional variance in food choice. This study investigated the effects of packaging cues on explicit preferences and effort provision for healthy food items in elementary school children. Each of 179 children rated three, objectively identical, recommended yoghurt-cereal-fruit snacks presented with different packaging cues. Packaging cues included a plain label, a label focusing on health aspects of the product, and a label that additionally included unknown cartoon characters. The children were asked to state the subjective taste-pleasantness of the respective food items. We also used a novel approach to measure effort provision for food items in children, namely handgrip strength. Results show that packaging cues significantly induce a taste-placebo effect in 88% of the children, i.e., differences in taste ratings for objectively identical products. Taste ratings were highest for the child-directed product that included cartoon characters. Also, applied effort to receive the child-directed product was significantly higher. Our results confirm the positive effect of child-directed marketing strategies also for healthy snack food products. Using handgrip strength as a measure to determine the amount of effort children are willing to provide for a product may explain additional variance in food choice and might prove to be a promising additional research tool for field studies and the assessment of public policy interventions.

  18. Food packaging cues influence taste perception and increase effort provision for a recommended snack product in children.

    Science.gov (United States)

    Enax, Laura; Weber, Bernd; Ahlers, Maren; Kaiser, Ulrike; Diethelm, Katharina; Holtkamp, Dominik; Faupel, Ulya; Holzmüller, Hartmut H; Kersting, Mathilde

    2015-01-01

    Food marketing research shows that child-directed marketing cues have pronounced effects on food preferences and consumption, but are most often placed on products with low nutritional quality. Effects of child-directed marketing strategies for healthy food products remain to be studied in more detail. Previous research suggests that effort provision explains additional variance in food choice. This study investigated the effects of packaging cues on explicit preferences and effort provision for healthy food items in elementary school children. Each of 179 children rated three, objectively identical, recommended yogurt-cereal-fruit snacks presented with different packaging cues. Packaging cues included a plain label, a label focusing on health aspects of the product, and a label that additionally included unknown cartoon characters. The children were asked to state the subjective taste-pleasantness of the respective food items. We also used a novel approach to measure effort provision for food items in children, namely handgrip strength. Results show that packaging cues significantly induce a taste-placebo effect in 88% of the children, i.e., differences in taste ratings for objectively identical products. Taste ratings were highest for the child-directed product that included cartoon characters. Also, applied effort to receive the child-directed product was significantly higher. Our results confirm the positive effect of child-directed marketing strategies also for healthy snack food products. Using handgrip strength as a measure to determine the amount of effort children are willing to provide for a product may explain additional variance in food choice and might prove to be a promising additional research tool for field studies and the assessment of public policy interventions.

  19. Food packaging cues influence taste perception and increase effort provision for a recommended snack product in children

    Science.gov (United States)

    Enax, Laura; Weber, Bernd; Ahlers, Maren; Kaiser, Ulrike; Diethelm, Katharina; Holtkamp, Dominik; Faupel, Ulya; Holzmüller, Hartmut H.; Kersting, Mathilde

    2015-01-01

    Food marketing research shows that child-directed marketing cues have pronounced effects on food preferences and consumption, but are most often placed on products with low nutritional quality. Effects of child-directed marketing strategies for healthy food products remain to be studied in more detail. Previous research suggests that effort provision explains additional variance in food choice. This study investigated the effects of packaging cues on explicit preferences and effort provision for healthy food items in elementary school children. Each of 179 children rated three, objectively identical, recommended yogurt-cereal-fruit snacks presented with different packaging cues. Packaging cues included a plain label, a label focusing on health aspects of the product, and a label that additionally included unknown cartoon characters. The children were asked to state the subjective taste-pleasantness of the respective food items. We also used a novel approach to measure effort provision for food items in children, namely handgrip strength. Results show that packaging cues significantly induce a taste-placebo effect in 88% of the children, i.e., differences in taste ratings for objectively identical products. Taste ratings were highest for the child-directed product that included cartoon characters. Also, applied effort to receive the child-directed product was significantly higher. Our results confirm the positive effect of child-directed marketing strategies also for healthy snack food products. Using handgrip strength as a measure to determine the amount of effort children are willing to provide for a product may explain additional variance in food choice and might prove to be a promising additional research tool for field studies and the assessment of public policy interventions. PMID:26191012

  20. The Effects of Meaning-Based Auditory Training on Behavioral Measures of Perceptual Effort in Individuals with Impaired Hearing.

    Science.gov (United States)

    Sommers, Mitchell S; Tye-Murray, Nancy; Barcroft, Joe; Spehar, Brent P

    2015-11-01

    There has been considerable interest in measuring the perceptual effort required to understand speech, as well as to identify factors that might reduce such effort. In the current study, we investigated whether, in addition to improving speech intelligibility, auditory training also could reduce perceptual or listening effort. Perceptual effort was assessed using a modified version of the n-back memory task in which participants heard lists of words presented without background noise and were asked to continually update their memory of the three most recently presented words. Perceptual effort was indexed by memory for items in the three-back position immediately before, immediately after, and 3 months after participants completed the Computerized Learning Exercises for Aural Rehabilitation (clEAR), a 12-session computerized auditory training program. Immediate posttraining measures of perceptual effort indicated that participants could remember approximately one additional word compared to pretraining. Moreover, some training gains were retained at the 3-month follow-up, as indicated by significantly greater recall for the three-back item at the 3-month measurement than at pretest. There was a small but significant correlation between gains in intelligibility and gains in perceptual effort. The findings are discussed within the framework of a limited-capacity speech perception system.
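
    A minimal sketch of scoring the modified n-back measure described here - recall of the word presented three items earlier; the word list and response format are invented for illustration:

        def three_back_accuracy(presented, responses):
            """Proportion of correct reports of the word presented three items earlier."""
            hits = total = 0
            for i in range(3, len(presented)):
                total += 1
                if responses[i] == presented[i - 3]:
                    hits += 1
            return hits / total if total else 0.0

        words = ["cat", "door", "lamp", "fish", "tree", "book"]
        # listener's report of the 3-back word at each position from the 4th onward
        reports = [None, None, None, "cat", "door", "map"]
        print(three_back_accuracy(words, reports))  # 2 of 3 correct, about 0.67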

  1. Incorporation of personal computers in a research reactor instrumentation system for data monitoring and analysis

    International Nuclear Information System (INIS)

    Leopando, L.S.

    1998-01-01

    The research contract was implemented by obtaining off-the-shelf personal computer hardware and data acquisition cards, designing the interconnection with the instrumentation system, writing and debugging the software, and assembling and testing the set-up. The hardware was designed to make all variables monitored by the instrumentation system accessible to the computers, without requiring any major modification of the instrumentation system and without compromising reactor safety in any way. The computer hardware addition was also designed to have no effect on any existing function of the instrumentation system. The software was designed to implement only graphical display and automated logging of reactor variables. Additional functionality could easily be added in the future through software revision, because all the reactor variables are already available in the computer. It would even be possible to 'close the loop' and control the reactor through software. It was found that most of the effort in an undertaking of this sort lies in software development, but the job can be done even by reactor staff who are not computer specialists, working with programming languages they are already familiar with. It was also found that the continuing rapid advance of personal computer technology makes it essential that such a project be undertaken with the inevitability of future hardware upgrading in mind. The hardware techniques and the software developed may find applicability in other research reactors, especially those with a generic analog research reactor TRIGA console. (author)
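
    A minimal sketch of the display-and-logging pattern described - read-only monitoring of instrumentation variables through a data acquisition card; read_channel() stands in for whatever the card's driver actually provides, and the channel names, rate, and values are invented:

        import csv
        import random
        import time

        def read_channel(name):
            # hypothetical DAQ-card driver call; here a dummy returning plausible values
            base = {"power_kw": 100.0, "fuel_temp_c": 250.0}[name]
            return base + random.uniform(-1.0, 1.0)

        CHANNELS = ["power_kw", "fuel_temp_c"]

        with open("reactor_log.csv", "w", newline="") as f:
            log = csv.writer(f)
            log.writerow(["time_s"] + CHANNELS)
            for _ in range(10):                            # one sample per second
                row = [round(time.time(), 1)] + [read_channel(c) for c in CHANNELS]
                log.writerow(row)                          # monitoring only: nothing is
                time.sleep(1.0)                            # ever written back to the console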

  2. Computer architecture evaluation for structural dynamics computations: Project summary

    Science.gov (United States)

    Standley, Hilda M.

    1989-01-01

    The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.

  3. Allocating effort and anticipating pleasure in schizophrenia: Relationship with real world functioning.

    Science.gov (United States)

    Serper, M; Payne, E; Dill, C; Portillo, C; Taliercio, J

    2017-10-01

    Poor motivation to engage in goal-oriented behavior has been recognized as a hallmark feature of schizophrenia spectrum disorders (SZ). Low drive in SZ may be related to how rewards are anticipated as well as to poor working memory. However, few studies to date have examined beliefs about self-efficacy and satisfaction with future rewards (anticipatory pleasure), and few have examined how these deficits may impact SZ patients' real-world functioning. The present study examined SZ patients' (n=57) anticipatory pleasure, working memory, self-efficacy, and real-world functioning in relation to their negative symptom severity. Results revealed that SZ patients' negative symptom severity was related to their decisions about effort allocation and reward probability, to working memory deficits, to self-efficacy, and to anticipatory pleasure for future reward. Effort allocation deficits also predicted patients' daily functioning skills. SZ patients with high levels of negative symptoms are not merely effort averse, but have more difficulty effectively allocating effort and anticipating pleasure when engaging in effortful activities. Continuously failing to achieve reinforcement from engagement and participation may lead SZ patients to form negative beliefs about their abilities, which contributes to amotivation and cognitive deficits. Lastly, our findings provide further support for a link between SZ patients' functional daily living skills and their effort allocation. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  4. Effort and accuracy during language resource generation: a pronunciation prediction case study

    CSIR Research Space (South Africa)

    Davel, M

    2006-11-01

    Full Text Available pronunciation dictionary as case study. We show that the amount of effort required to validate a 20,000-word pronunciation dictionary can be reduced substantially by employing appropriate computational tools, when compared to both a fully manual validation... and correcting errors found, and finally, manually verifying a further portion of the resource in order to estimate its current accuracy. We apply this general approach to the task of developing pronunciation dictionaries. We demonstrate how the validation...

  5. Additional Security Considerations for Grid Management

    Science.gov (United States)

    Eidson, Thomas M.

    2003-01-01

    The use of Grid computing environments is growing in popularity. A Grid computing environment is primarily a wide area network that encompasses multiple local area networks, where some of the local area networks are managed by different organizations. A Grid computing environment also includes common interfaces for distributed computing software so that the heterogeneous set of machines that make up the Grid can be used more easily. The other key feature of a Grid is that the distributed computing software includes appropriate security technology. The focus of most Grid software is on the security involved with application execution, file transfers, and other remote computing procedures. However, there are other important security issues related to the management of a Grid and the users who use that Grid. This note discusses these additional security issues and makes several suggestions as to how they can be managed.

  6. "Computer Science Can Feed a Lot of Dreams"

    Science.gov (United States)

    Educational Horizons, 2014

    2014-01-01

    Pat Yongpradit is the director of education at Code.org. He leads all education efforts, including professional development and curriculum creation, and he builds relationships with school districts. Pat joined "Educational Horizons" to talk about why it is important to teach computer science--even for non-computer science teachers. This…

  7. Strategic directions of computing at Fermilab

    Science.gov (United States)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  8. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  9. A phenomenographic study of the ways of understanding conditional and repetition structures in computer programming languages

    Science.gov (United States)

    Bucks, Gregory Warren

    Computers have become an integral part of how engineers complete their work, allowing them to collect and analyze data, model potential solutions and aid in production through automation and robotics. In addition, computers are essential elements of the products themselves, from tennis shoes to construction materials. An understanding of how computers function, both at the hardware and software level, is essential for the next generation of engineers. Despite the need for engineers to develop a strong background in computing, little opportunity is given for engineering students to develop these skills. Learning to program is widely seen as a difficult task, requiring students to develop not only an understanding of specific concepts, but also a way of thinking. In addition, students are forced to learn a new tool, in the form of the programming environment employed, along with these concepts and thought processes. Because of this, many students will not develop sufficient proficiency in programming, even after progressing through the traditional introductory programming sequence. This is a significant problem, especially in the engineering disciplines, where very few students receive more than one or two semesters' worth of instruction in an already crowded engineering curriculum. To address these issues, new pedagogical techniques must be investigated in an effort to enhance the ability of engineering students to develop strong computing skills. However, these efforts are hindered by the lack of published assessment instruments available for probing an individual's understanding of programming concepts across programming languages. Traditionally, programming knowledge has been assessed by producing written code in a specific language. This can be an effective method, but does not lend itself well to comparing the pedagogical impact of different programming environments, languages or paradigms. This dissertation presents a phenomenographic research study

  10. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    International Nuclear Information System (INIS)

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts which are adding heavy ion capability to our facility. Included in our efforts are the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems and branch interface units; a hierarchical layer, which performs certain data base and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer host(s); and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Data base and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth

  11. Computational Nuclear Quantum Many-Body Problem: The UNEDF Project

    OpenAIRE

    Bogner, Scott; Bulgac, Aurel; Carlson, Joseph A.; Engel, Jonathan; Fann, George; Furnstahl, Richard J.; Gandolfi, Stefano; Hagen, Gaute; Horoi, Mihai; Johnson, Calvin W.; Kortelainen, Markus; Lusk, Ewing; Maris, Pieter; Nam, Hai Ah; Navratil, Petr

    2013-01-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  12. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. This book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  13. New Mandatory Computer Security Course

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Just like any other organization, CERN is permanently under attack - even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to remain on par with the attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Since 2007, newcomers have to follow a dedicated basic computer security course informing them about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course has recently been redesigned. It is now mandatory for all CERN members (users and staff) owning a CERN computer account and must be followed once every three years. Members who...

  14. The European computer model for optronic system performance prediction (ECOMOS)

    Science.gov (United States)

    Keßler, Stefan; Bijl, Piet; Labarre, Luc; Repasi, Endre; Wittenstein, Wolfgang; Bürsing, Helge

    2017-10-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. This includes two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is presented. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given as well as a short discussion of validation tests and an outlook on the future potential of simulation for sensor assessment.

  15. DOE pushes for useful quantum computing

    Science.gov (United States)

    Cho, Adrian

    2018-01-01

    The U.S. Department of Energy (DOE) is joining the quest to develop quantum computers, devices that would exploit quantum mechanics to crack problems that overwhelm conventional computers. The initiative comes as Google and other companies race to build a quantum computer that can demonstrate "quantum supremacy" by beating classical computers on a test problem. But reaching that milestone will not mean practical uses are at hand, and the new $40 million DOE effort is intended to spur the development of useful quantum computing algorithms for its work in chemistry, materials science, nuclear physics, and particle physics. With the resources at its 17 national laboratories, DOE could play a key role in developing the machines, researchers say, although finding problems with which quantum computers can help isn't so easy.

  16. Incentive Design and Mis-Allocated Effort

    OpenAIRE

    Schnedler, Wendelin

    2013-01-01

    Incentives often distort behavior: they induce agents to exert effort but this effort is not employed optimally. This paper proposes a theory of incentive design allowing for such distorted behavior. At the heart of the theory is a trade-off between getting the agent to exert effort and ensuring that this effort is used well. The theory covers various moral-hazard models, ranging from traditional single-task to multi-task models. It also provides, for the first time, a formalization and proof...

  17. 40 CFR 33.302 - Are there any additional contract administration requirements?

    Science.gov (United States)

    2010-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY PROGRAMS Good Faith Efforts § 33.302 Are there any additional contract... require its prime contractor to employ the six good faith efforts described in § 33.301 even if the prime... the subcontract for any reason, the recipient must require the prime contractor to employ the six good...

  18. User perspectives on computer applications

    International Nuclear Information System (INIS)

    Trammell, H.E.

    1979-04-01

    Experiences of a technical group that uses the services of computer centers are recounted. An orientation on the ORNL Engineering Technology Division and its missions is given to provide background on the diversified efforts undertaken by the Division and its opportunities to benefit from computer technology. Specific ways in which computers are used within the Division are described; these include facility control, data acquisition, data analysis, theory applications, code development, information processing, cost control, management of purchase requisitions, maintenance of personnel information, and control of technical publications. Problem areas found to need improvement are the overloading of computers during normal working hours, lack of code transportability, delay in obtaining routine programming, delay in key punching services, bewilderment in the use of large computer centers, complexity of job control language, and uncertain quality of software. 20 figures

  19. Noise Characterization of Devices for Optical Computing

    National Research Council Canada - National Science Library

    Walkup, John

    1998-01-01

    The major objective of the research effort is to investigate the noise characteristics of advanced optical sources, spatial light modulators, and other devices which are candidates for applications in optical computers...

  20. Workshop on Computational Optimization

    CERN Document Server

    2015-01-01

    Our everyday life is unthinkable without optimization. We try to minimize our effort and to maximize the achieved profit. Many real world and industrial problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2013. It presents recent advances in computational optimization. The volume includes important real life problems like parameter settings for controlling processes in a bioreactor, resource constrained project scheduling, problems arising in transport services, error correcting codes, optimal system performance and energy consumption, and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others.

  1. Estimation of total Effort and Effort Elapsed in Each Step of Software Development Using Optimal Bayesian Belief Network

    Directory of Open Access Journals (Sweden)

    Fatemeh Zare Baghiabad

    2017-09-01

    Full Text Available Accuracy in estimating the effort needed for software development has made software effort estimation a challenging issue. Besides estimation of total effort, determining the effort elapsed in each software development step is very important, because any mistake in enterprise resource planning can lead to project failure. In this paper, a Bayesian belief network is proposed based on effective components and the software development process. In this model, feedback loops are considered between development steps, allowing the return rates to differ for each project. The different return rates help us determine the percentage of effort elapsed in each software development step distinctly. Moreover, the error measurement resulting from optimized effort estimation and the optimal coefficients to modify the model are sought. The results of the comparison between the proposed model and other models showed that the model is capable of estimating the total effort with high accuracy (with a marginal error of about 0.114) and of estimating the effort elapsed in each software development step.
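
    The abstract does not reproduce the network itself. As a toy illustration of how feedback (return) rates redistribute effort across development steps, and not the authors' Bayesian model, one can inflate each step's base share by the expected rework it attracts and renormalize; all numbers below are invented.

        # Toy illustration: if a fraction r of the work in a step is returned
        # for rework, the expected effort of that step inflates by 1 / (1 - r)
        # (a geometric series of returns); normalizing the inflated shares
        # gives the percentage of total effort elapsed in each step.
        base_share = {"requirements": 0.15, "design": 0.25, "coding": 0.40, "testing": 0.20}
        return_rate = {"requirements": 0.05, "design": 0.10, "coding": 0.20, "testing": 0.15}

        inflated = {s: base_share[s] / (1.0 - return_rate[s]) for s in base_share}
        total = sum(inflated.values())
        for step, value in inflated.items():
            print(f"{step:>12}: {100 * value / total:.1f}% of total effort")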

  2. Both male and female identity influence variation in male signalling effort

    Directory of Open Access Journals (Sweden)

    Svensson P Andreas

    2011-08-01

    Full Text Available Abstract Background Male sexual displays play an important role in sexual selection by affecting reproductive success. However, for such displays to be useful for female mate choice, courtship should vary more among than within individual males. In this regard, a potentially important source of within male variation is adjustment of male courtship effort in response to female traits. Accordingly, we set out to dissect sources of variation in male courtship effort in a fish, the desert goby (Chlamydogobius eremius). We did so by designing an experiment that allowed simultaneous estimation of within and between male variation in courtship, while also assessing the importance of the males and females as sources of courtship variation. Results Although males adjusted their courtship depending on the identity of the female (a potentially important source of within-male variation), among-male differences were considerably greater. In addition, male courtship effort towards a pair of females was highly repeatable over a short time frame. Conclusion Despite the plasticity in male courtship effort, courtship displays had the potential to reliably convey information about the male to mate-searching females. Our experiment therefore underscores the importance of addressing the different sources contributing to variation in the expression of sexually-selected traits.
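
    The contrast between among-male and within-male variation that the authors estimate is conventionally summarized as repeatability. The standard formula is quoted here as background (the abstract itself does not print it):

        R = \sigma^2_{\mathrm{among}} / (\sigma^2_{\mathrm{among}} + \sigma^2_{\mathrm{within}})

    Values of R close to 1 indicate that courtship effort reliably conveys male identity to mate-searching females despite each male's adjustments to individual females.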

  3. Agent-Based Computing: Promise and Perils

    OpenAIRE

    Jennings, N. R.

    1999-01-01

    Agent-based computing represents an exciting new synthesis both for Artificial Intelligence (AI) and, more generally, Computer Science. It has the potential to significantly improve the theory and practice of modelling, designing and implementing complex systems. Yet, to date, there has been little systematic analysis of what makes an agent such an appealing and powerful conceptual model. Moreover, even less effort has been devoted to exploring the inherent disadvantages that stem from adoptin...

  4. Office of Fusion Energy computational review

    International Nuclear Information System (INIS)

    Cohen, B.I.; Cohen, R.H.; Byers, J.A.

    1996-01-01

    The LLNL MFE Theory and Computations Program supports computational efforts in the following areas: (1) Magnetohydrodynamic equilibrium and stability; (2) Fluid and kinetic edge plasma simulation and modeling; (3) Kinetic and fluid core turbulent transport simulation; (4) Comprehensive tokamak modeling (CORSICA Project) - transport, MHD equilibrium and stability, edge physics, heating, turbulent transport, etc.; and (5) Other: ECRH ray tracing, reflectometry, plasma processing. This report discusses algorithms and codes pertaining to these areas

  5. Advances in Computer Science and Education

    CERN Document Server

    Huang, Xiong

    2012-01-01

    CSE2011 is an integrated conference concentrating its focus on computer science and education. The proceedings present work on computer science and education by researchers from all around the world, and serve as an exchange platform for researchers working in these fields. In order to meet the high quality standards of the Springer AISC series, the organizing committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous expert referees. Secondly, review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful

  6. On the additive splitting procedures and their computer realization

    DEFF Research Database (Denmark)

    Farago, I.; Thomsen, Per Grove; Zlatev, Z.

    2008-01-01

    Two additive splitting procedures are defined and studied in this paper. It is shown that these splitting procedures have good stability properties. Some other splitting procedures, which are traditionally used in mathematical models in many scientific and engineering fields, are sketched. All...
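
    The abstract is cut off before the procedures are defined. For orientation only, a generic first-order additive splitting for du/dt = (A_1 + A_2)u advances each sub-operator independently from the same state and recombines the results; this standard construction is assumed here and need not coincide with the two procedures the paper studies:

        \frac{dv}{dt} = A_1 v, \quad v(t_n) = u^n, \qquad
        \frac{dw}{dt} = A_2 w, \quad w(t_n) = u^n, \qquad
        u^{n+1} = v(t_{n+1}) + w(t_{n+1}) - u^n.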

  7. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  8. Measuring listening effort: driving simulator versus simple dual-task paradigm.

    Science.gov (United States)

    Wu, Yu-Hsiang; Aksan, Nazan; Rizzo, Matthew; Stangl, Elizabeth; Zhang, Xuyang; Bentler, Ruth

    2014-01-01

    The dual-task paradigm has been widely used to measure listening effort. The primary objectives of the study were to (1) investigate the effect of hearing aid amplification and a hearing aid directional technology on listening effort measured by a complicated, more real world dual-task paradigm and (2) compare the results obtained with this paradigm to a simpler laboratory-style dual-task paradigm. The listening effort of adults with hearing impairment was measured using two dual-task paradigms, wherein participants performed a speech recognition task simultaneously with either a driving task in a simulator or a visual reaction-time task in a sound-treated booth. The speech materials and road noises for the speech recognition task were recorded in a van traveling on the highway in three hearing aid conditions: unaided, aided with omnidirectional processing (OMNI), and aided with directional processing (DIR). The change in the driving task or the visual reaction-time task performance across the conditions quantified the change in listening effort. Compared to the driving-only condition, driving performance declined significantly with the addition of the speech recognition task. Although the speech recognition score was higher in the OMNI and DIR conditions than in the unaided condition, driving performance was similar across these three conditions, suggesting that listening effort was not affected by amplification and directional processing. Results from the simple dual-task paradigm showed a similar trend: hearing aid technologies improved speech recognition performance, but did not affect performance in the visual reaction-time task (i.e., reduce listening effort). The correlation between listening effort measured using the driving paradigm and the visual reaction-time task paradigm was significant. The finding showing that our older (56 to 85 years old) participants' better speech recognition performance did not result in reduced listening effort was not
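
    In both paradigms, listening effort is quantified by the change in secondary-task performance when the speech recognition task is added. A minimal expression of that dual-task cost, with invented reaction times rather than the study's data, is:

        # Dual-task cost as a simple index of listening effort: the slowdown
        # of the secondary task when speech recognition is added. The numbers
        # are illustrative, not results from the study.
        def dual_task_cost(rt_single_ms, rt_dual_ms):
            return rt_dual_ms - rt_single_ms

        baseline = 450.0  # secondary task performed alone, ms
        for condition, rt in {"unaided": 620.0, "OMNI": 615.0, "DIR": 610.0}.items():
            print(condition, "effort index:", dual_task_cost(baseline, rt), "ms")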

  9. Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.

    Science.gov (United States)

    Caruso, Ronald D

    2003-01-01

    Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort. Copyright RSNA, 2003

  10. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    International Nuclear Information System (INIS)

    Eyler, L.L.; Trent, D.S.; Budden, M.J.

    1983-09-01

    During the course of the TEMPEST computer code development, a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor applications. 47 refs., 94 figs., 6 tabs

  11. Why don't you try harder? An investigation of effort production in major depression.

    Directory of Open Access Journals (Sweden)

    Marie-Laure Cléry-Melin

    Full Text Available Depression is mainly characterized as an emotional disorder, associated with reduced approach behavior. It remains unclear whether the difficulty in energising behavior relates to abnormal emotional states or to a flattened response to potential rewards, as suggested by several neuroimaging studies. Here, we aimed to demonstrate a specific incentive motivation deficit in major depression, independent of patients' emotional state. We employed a behavioral paradigm designed to measure physical effort in response to both emotional modulation and incentive motivation. Patients did exert more effort following emotionally arousing pictures (whether positive or negative) but not for higher monetary incentives, contrary to healthy controls. These results show that emotional and motivational sources of effort production are dissociable in pathological conditions. In addition, patients' ratings of perceived effort increased for high incentives, whereas controls' ratings decreased. Thus, depressed patients objectively behave as if they do not want to gain larger rewards, but subjectively feel that they try harder. We suggest that incentive motivation impairment is a core deficit of major depression, which may render everyday tasks abnormally effortful for patients.

  12. Mental and physical effort affect vigilance differently

    NARCIS (Netherlands)

    Smit, A.S.; Eling, P.A.T.M.; Hopman, M.T.E.; Coenen, A.M.L.

    2005-01-01

    Both physical and mental effort are thought to affect vigilance. Mental effort is known for its vigilance-declining effects, but the effects of physical effort are less clear. This study investigated whether these two forms of effort affect the EEG and subjective alertness differently. Participants

  14. A Composite Contract for Coordinating a Supply Chain with Price and Effort Dependent Stochastic Demand

    Directory of Open Access Journals (Sweden)

    Yu-Shuang Liu

    2016-01-01

    Full Text Available As demand becomes more sensitive to price and sales effort, this paper investigates the issue of channel coordination for a supply chain with one manufacturer and one retailer facing price and effort dependent stochastic demand. A composite contract based on quantity-restricted returns and a target sales rebate can achieve coordination in this setting. Two main problems are addressed: (1) how to coordinate the decentralized supply chain; (2) how to determine the optimal sales effort level, pricing, and inventory decisions under the additive demand case. Numerical examples are presented to verify the effectiveness of the combined contract in supply chain coordination and to highlight model sensitivities to parametric changes.
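
    A common specification of the additive demand case, assumed here for illustration since the abstract does not reproduce the paper's functional forms, writes demand as a deterministic part in price p and effort e plus zero-mean noise:

        D(p, e, \varepsilon) = a - b\,p + \lambda\,e + \varepsilon, \qquad a, b, \lambda > 0, \quad \mathbb{E}[\varepsilon] = 0.

    Under such a form, price and effort shift the demand curve while the stocking quantity hedges only against the noise term, which is why a composite of quantity-restricted returns (sharing overstock risk) and a target sales rebate (rewarding effort) can align both decisions.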

  15. Reminder: Mandatory Computer Security Course

    CERN Multimedia

    IT Department

    2011-01-01

    Just like any other organization, CERN is permanently under attack – even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to remain on par with the attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Therefore, a new dedicated basic computer security course has been designed informing you about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course is mandatory for all persons owning a CERN computer account and must be followed once every three years. Users who have never done the course, or whose course needs to be renewe...

  16. Cryogenic Fluid Storage Technology Development: Recent and Planned Efforts at NASA

    Science.gov (United States)

    Moran, Matthew E.

    2009-01-01

    Recent technology development work conducted at NASA in the area of Cryogenic Fluid Management (CFM) storage is highlighted, including summary results, key impacts, and ongoing efforts. Thermodynamic vent system (TVS) ground test results are shown for hydrogen, methane, and oxygen. Joule-Thomson (J-T) device tests related to clogging in hydrogen are summarized, along with the absence of clogging in oxygen and methane tests. Confirmation of analytical relations and bonding techniques for broad area cooling (BAC) concepts based on tube-to-tank tests are presented. Results of two-phase lumped-parameter computational fluid dynamic (CFD) models are highlighted, including validation of the model with hydrogen self pressurization test data. These models were used to simulate Altair representative methane and oxygen tanks subjected to 210 days of lunar surface storage. Engineering analysis tools being developed to support system level trades and vehicle propulsion system designs are also cited. Finally, prioritized technology development risks identified for Constellation cryogenic propulsion systems are presented, and future efforts to address those risks are discussed.

  17. Low-effort thought promotes political conservatism.

    Science.gov (United States)

    Eidelman, Scott; Crandall, Christian S; Goodman, Jeffrey A; Blanchar, John C

    2012-06-01

    The authors test the hypothesis that low-effort thought promotes political conservatism. In Study 1, alcohol intoxication was measured among bar patrons; as blood alcohol level increased, so did political conservatism (controlling for sex, education, and political identification). In Study 2, participants under cognitive load reported more conservative attitudes than their no-load counterparts. In Study 3, time pressure increased participants' endorsement of conservative terms. In Study 4, participants considering political terms in a cursory manner endorsed conservative terms more than those asked to cogitate; an indicator of effortful thought (recognition memory) partially mediated the relationship between processing effort and conservatism. Together these data suggest that political conservatism may be a process consequence of low-effort thought; when effortful, deliberate thought is disengaged, endorsement of conservative ideology increases.

  18. Initiatives to Improve Quality of Additively Manufactured Parts

    Science.gov (United States)

    Waller, Jess; Nichols, Charles

    2017-01-01

    NASA is providing leadership in an international effort linking government and industry resources to speed adoption of additive manufactured (AM) parts. Participants include government agencies (NASA, USAF, NIST, FAA), industry (commercial aerospace, NDE manufacturers, AM equipment manufacturers), standards organizations and academia. NASA is also partnering with its international space exploration organizations such as ESA and JAXA. NDT is identified as a universal need for all aspects of additive manufacturing.

  19. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect such that new faults may be introduced. In this paper, we first show how to incorporate testing effort function and fault introduction into FDP and then develop FCP as delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
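
    A representative way to fold a testing effort function W(t) into the fault detection process, shown here as a standard form rather than the paper's exact model, is a nonhomogeneous Poisson process whose mean value function depends on cumulative effort, with correction lagging detection:

        m_d(t) = a\,(1 - e^{-b\,W(t)}), \qquad m_c(t) = m_d(t - \Delta).

    Here a is the expected total fault content, b the detection rate per unit of testing effort, and \Delta a correction delay; under imperfect debugging, a would itself grow with the number of faults introduced during correction.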

  20. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  1. Goal striving strategies and effort mobilization: When implementation intentions reduce effort-related cardiac activity during task performance.

    Science.gov (United States)

    Freydefont, Laure; Gollwitzer, Peter M; Oettingen, Gabriele

    2016-09-01

    Two experiments investigate the influence of goal and implementation intentions on effort mobilization during task performance. Although numerous studies have demonstrated the beneficial effects of setting goals and making plans on performance, the effects of goals and plans on effort-related cardiac activity and especially the cardiac preejection period (PEP) during goal striving have not yet been addressed. According to the Motivational Intensity Theory, participants should increase effort mobilization proportionally to task difficulty as long as success is possible and justified. Forming goals and making plans should allow for reduced effort mobilization when participants perform an easy task. However, when the task is difficult, goals and plans should differ in their effect on effort mobilization. Participants who set goals should disengage, whereas participants who made if-then plans should stay in the field showing high effort mobilization during task performance. As expected, using an easy task in Experiment 1, we observed a lower cardiac PEP in both the implementation intention and the goal intention condition than in the control condition. In Experiment 2, we varied task difficulty and demonstrated that while participants with a mere goal intention disengaged from difficult tasks, participants with an implementation intention increased effort mobilization proportionally with task difficulty. These findings demonstrate the influence of goal striving strategies (i.e., mere goals vs. if-then plans) on effort mobilization during task performance. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Strategic directions of computing at Fermilab

    International Nuclear Information System (INIS)

    Wolbers, S.

    1997-04-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing

  3. Computer code development plan for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the middle of the 1980s, various computer codes have been transferred to the Korean nuclear industry through technical transfer programs from the major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and design-related technologies have been satisfactorily accumulated as a result. However, efforts to develop native codes to substitute for the important computer codes whose usage is limited by the original technique owners have been carried out rather poorly. Thus, it is of the highest priority to secure native techniques for the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, helical steam generator, passive residual heat removal system, etc. Considering those peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART. Thus, they should be modified to deal with the peculiar design characteristics of SMART. In addition to the modification efforts, various codes should be developed in several design areas. Furthermore, the modified or newly developed codes should have their reliability verified through benchmarking or tests for the target design. Thus, it is necessary to proceed with the design according to the

  5. Effects of Transformational and Transactional Leadership on Cognitive Effort and Outcomes during Collaborative Learning within a Virtual World

    Science.gov (United States)

    Kahai, Surinder; Jestire, Rebecca; Huang, Rui

    2013-01-01

    Computer-supported collaborative learning is a common e-learning activity. Instructors have to create appropriate social and instructional interventions in order to promote effective learning. We performed a study that examined the effects of two popular leadership interventions, transformational and transactional, on cognitive effort and outcomes…

  6. Computational Infrastructure for Geodynamics (CIG)

    Science.gov (United States)

    Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.

    2004-12-01

    Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to

  7. Effort, anhedonia, and function in schizophrenia: reduced effort allocation predicts amotivation and functional impairment.

    Science.gov (United States)

    Barch, Deanna M; Treadway, Michael T; Schoen, Nathan

    2014-05-01

    One of the most debilitating aspects of schizophrenia is an apparent lack of interest in or ability to exert effort for rewards. Such "negative symptoms" may prevent individuals from obtaining potentially beneficial outcomes in educational, occupational, or social domains. In animal models, dopamine abnormalities decrease willingness to work for rewards, implicating dopamine (DA) function as a candidate substrate for negative symptoms given that schizophrenia involves dysregulation of the dopamine system. We used the effort-expenditure for rewards task (EEfRT) to assess the degree to which individuals with schizophrenia were willing to exert increased effort for either larger magnitude rewards or for rewards that were more probable. Fifty-nine individuals with schizophrenia and 39 demographically similar controls performed the EEfRT task, which involves making choices between "easy" and "hard" tasks to earn potential rewards. Individuals with schizophrenia showed less of an increase in effort allocation as either reward magnitude or probability increased. In controls, the frequency of choosing the hard task in high reward magnitude and probability conditions was negatively correlated with depression severity and anhedonia. In schizophrenia, fewer hard task choices were associated with more severe negative symptoms and worse community and work function as assessed by a caretaker. Consistent with patterns of disrupted dopamine functioning observed in animal models of schizophrenia, these results suggest that one mechanism contributing to impaired function and motivational drive in schizophrenia may be a reduced allocation of greater effort for higher magnitude or higher probability rewards.
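
    A concrete way to summarize EEfRT behavior, illustrative rather than the authors' scoring procedure, is the proportion of hard-task choices within each reward probability and magnitude cell; the trial data below are made up.

        # Illustrative EEfRT summary: proportion of hard-task choices per
        # (probability, magnitude) condition. The tuples are invented data:
        # (reward probability, reward magnitude in dollars, chose hard task).
        from collections import defaultdict

        trials = [(0.88, 3.0, True), (0.88, 1.5, False), (0.50, 3.0, True),
                  (0.50, 1.5, False), (0.12, 3.0, False), (0.12, 1.5, False)]

        counts = defaultdict(lambda: [0, 0])  # condition -> [hard choices, total]
        for prob, mag, chose_hard in trials:
            counts[(prob, mag)][0] += int(chose_hard)
            counts[(prob, mag)][1] += 1

        for (prob, mag), (hard, total) in sorted(counts.items()):
            print(f"p={prob:.2f}, ${mag:.2f}: {hard}/{total} hard choices")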

  8. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify of the formation technique of representation of modeling methodology at computer science lessons. The necessity of studying computer modeling is that the current trends of strengthening of general education and worldview functions of computer science define the necessity of additional research of the…

  9. Measuring cognitive load: performance, mental effort and simulation task complexity.

    Science.gov (United States)

    Haji, Faizal A; Rojas, David; Childs, Ruth; de Ribaupierre, Sandrine; Dubrowski, Adam

    2015-08-01

    Interest in applying cognitive load theory in health care simulation is growing. This line of inquiry requires measures that are sensitive to changes in cognitive load arising from different instructional designs. Recently, mental effort ratings and secondary task performance have shown promise as measures of cognitive load in health care simulation. We investigate the sensitivity of these measures to predicted differences in intrinsic load arising from variations in task complexity and learner expertise during simulation-based surgical skills training. We randomly assigned 28 novice medical students to simulation training on a simple or complex surgical knot-tying task. Participants completed 13 practice trials, interspersed with computer-based video instruction. On trials 1, 5, 9 and 13, knot-tying performance was assessed using time and movement efficiency measures, and cognitive load was assessed using subjective rating of mental effort (SRME) and simple reaction time (SRT) on a vibrotactile stimulus-monitoring secondary task. Significant improvements in knot-tying performance (F(1.04,24.95) = 41.1, p < 0.001) and reductions in cognitive load (F(2.3,58.5) = 57.7, p < 0.001) were observed across trials, indicating that both measures track changes in intrinsic load among novices engaged in simulation-based learning. These measures can be used to track cognitive load during skills training. Mental effort ratings are also sensitive to small differences in intrinsic load arising from variations in the physical complexity of a simulation task. The complementary nature of these subjective and objective measures suggests their combined use is advantageous in simulation instructional design research. © 2015 John Wiley & Sons Ltd.

  10. The European computer model for optronic system performance prediction (ECOMOS)

    NARCIS (Netherlands)

    Kessler, S.; Bijl, P.; Labarre, L.; Repasi, E.; Wittenstein, W.; Bürsing, H.

    2017-01-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The

  11. Enabling Earth Science Through Cloud Computing

    Science.gov (United States)

    Hardman, Sean; Riofrio, Andres; Shams, Khawaja; Freeborn, Dana; Springer, Paul; Chafin, Brian

    2012-01-01

    Cloud Computing holds tremendous potential for missions across the National Aeronautics and Space Administration. Several flight missions are already benefiting from an investment in cloud computing for mission critical pipelines and services through faster processing time, higher availability, and drastically lower costs available on cloud systems. However, these processes do not currently extend to general scientific algorithms relevant to earth science missions. The members of the Airborne Cloud Computing Environment task at the Jet Propulsion Laboratory have worked closely with the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to integrate cloud computing into their science data processing pipeline. This paper details the efforts involved in deploying a science data system for the CARVE mission, evaluating and integrating cloud computing solutions with the system and porting their science algorithms for execution in a cloud environment.

  12. How do different components of Effortful Control contribute to children's mathematics achievement?

    Directory of Open Access Journals (Sweden)

    Noelia eSánchez-Pérez

    2015-09-01

    Full Text Available This work sought to investigate the specific contribution of two different components of Effortful Control, attentional focusing and inhibitory control, to children's mathematics achievement. The sample was composed of 142 children aged 9 to 12 years old. Effortful Control components were measured through the Temperament in Middle Childhood Questionnaire (TMCQ; parent's report); math achievement was measured via teacher's report and through the standard Woodcock-Johnson test. Additionally, the contribution of other cognitive and socio-emotional processes was taken into account. Our results showed that only attentional focusing significantly contributed to the variance of children's mathematics achievement; interestingly, mediational models showed that the relationship between effortful attentional self-regulation and mathematics achievement was mediated by academic peer popularity, as well as by intelligence and study skills. Results are discussed in the light of current theories on the role of children's self-regulation abilities in the context of school.

  13. STAR Infrastructure Database: An effort to know each other

    Energy Technology Data Exchange (ETDEWEB)

    Mora, J.C.; Real, Almudena [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Vesterbacka, Pia; Outola, Iisa [STUK - Radiation and Nuclear Safety Authority (Finland); Barnett, Catherine; Beresford, Nick [Natural Environment Research Council - NERC-CEH (United Kingdom); Bradshaw, Clare [Stockholm University (Sweden); Skipperud, Lindis [Norwegian University of Life Sciences - UMB (Norway); Wilrodt, Christine; Steiner, Martin [Federal Office for Radiation Protection - BfS (Germany); Vanhoudt, Nathalie [Belgian Nuclear Research Centre SCK-CEN (Belgium); Komperoed, Mari [Norwegian Radiation Protection Authority - NRPA (Norway); Gurriaran, Rodolfo; Gilbin, Rodolphe; Hinton, Thomas [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)

    2014-07-01

    Effort over the last decade to make radioecology stronger and sustainable within Europe crystallized in the creation of the European Radioecology Alliance. The first step for this integrative effort was the establishment of a network of excellence (NoE) under the EU FP7 Strategy for Allied Radioecology (STAR www.star-radioecology.org) project which commenced in 2011. One of the project objectives was to share knowledge of European radioecological capabilities. To help achieve this, a register of these capabilities at each of the STAR laboratories has been created. An Infrastructure Database was designed and programmed using web 2.0 technologies on a 'wiki' platform. Its intended use was to identify what assets were held and where improvements could be made. Information collated includes an inventory of the radioanalytical or conventional equipment and methods, bio-informatics equipment and methods, sample and data archives held, and models and codes used. It also provides a summary of the radioecological expertise of the 170 radio-ecologists at STAR institutes whose knowledge is wide-ranging and encompasses: atmospheric dispersion, dosimetry, ecology, ecotoxicology, environmental radiation protection, environmental surveillance, foodstuffs, terrestrial, freshwater and marine radioecology, modelling, radiobiology and radionuclide analyses, emergency preparedness, education and training, amongst others. In 2013, the EU FP7 Coordination and implementation of a pan-European instrument for radioecology (COMET, www.comet-radioecology.org) project, involving the STAR partners and additionally one Japanese and two Ukrainian research institutes, was initiated. The capabilities of these additional partners will be added to the database in 2014. The aim of the database was to gather information to: - avoid duplication of effort and thereby increase efficiency, - improve synergy and collaboration between the STAR project partners and others involved in

  14. Musculoskeletal pain and effort-reward imbalance--a systematic review.

    Science.gov (United States)

    Koch, Peter; Schablon, Anja; Latza, Ute; Nienhaus, Albert

    2014-01-15

    Musculoskeletal pain may be triggered by physical strains and psychosocial risk factors. The effort-reward imbalance model (ERI model) is a stress model which measures psychosocial factors in the working world. The question is whether workers with an effort-reward imbalance report musculoskeletal pain more frequently than those with no effort-reward imbalance. A systematic review using a best-evidence synthesis approach was conducted to answer this question. A literature search was conducted for the period from 1996 to 2012, using three databases (Pubmed, Embase and PsycINFO). The search criteria related to psychosocial, work-related stress as per the ERI model and to musculoskeletal pain. A quality score based on various quality criteria was developed to assess the standard of the studies. The level of evidence was graded following an established scheme (Am J Ind Med 39:180-193, 2001). After applying the inclusion criteria, a total of 19 studies were included in the review: 15 cross-sectional studies, three prospective studies and one case-control study. 74% of all studies exhibited good methodological quality, 53% collected data using the original ERI questionnaire, and in 42% of the studies there was adequate control for physical working conditions. Furthermore, different cut-off points were used to classify exposed and non-exposed individuals. On the basis of 13 studies with a positive, statistically significant association, a moderate level of evidence was inferred for the association between effort-reward imbalance and musculoskeletal pain. The evidence for a role of over-commitment and for its interaction with effort-reward imbalance was rated as inconclusive, on the basis of eight and five studies, respectively. On the basis of the available evidence, no reliable conclusion may be drawn about any association between the psychosocial factors ascertained using the ERI model and musculoskeletal pain. Before a reliable statement can be made on the association between ERI and
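
    For orientation, studies based on the ERI model usually quantify imbalance as an effort-reward ratio; in the standard formulation from the general ERI literature (quoted here as background, not from this review),

        \[ \mathrm{ERI} = \frac{E}{R \cdot c} \]

    where E is the effort sum score, R the reward sum score, and c a correction factor for the unequal number of effort and reward items; values above 1 indicate that high effort is not matched by adequate reward. The differing cut-off points noted above are thresholds applied to this ratio.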

  15. Age-related incidence of pulmonary embolism and additional pathologic findings detected by computed tomography pulmonary angiography

    Energy Technology Data Exchange (ETDEWEB)

    Groth, M., E-mail: groth.michael@googlemail.com [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Henes, F.O., E-mail: f.henes@uke.uni-hamburg.de [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Mayer, U., E-mail: mayer@uke.uni-hamburg.de [Emergency Department, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Regier, M., E-mail: m.regier@uke.uni-hamburg.de [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Adam, G., E-mail: g.adam@uke.uni-hamburg.de [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Begemann, P.G.C., E-mail: p.begemann@me.com [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany)

    2012-08-15

    Objective: To compare the incidence of pulmonary embolism (PE) and additional pathologic findings (APF) detected by computed tomography pulmonary angiography (CTPA) according to different age-groups. Materials and methods: 1353 consecutive CTPA cases for suspected PE were retrospectively reviewed. Patients were divided into seven age groups: ≤29, 30-39, 40-49, 50-59, 60-69, 70-79 and ≥80 years. Differences between the groups were tested using Fisher's exact or chi-square test. A p-value < 0.0024 indicated statistical significance when Bonferroni correction was used. Results: Incidence rates of PE ranged from 11.4% to 25.4% in different age groups. The three main APF were pleural effusion, pneumonia and pulmonary nodules. No significant difference was found between the incidences of PE in different age groups. Furthermore, APF in different age groups revealed no significant differences (all p-values > 0.0024). Conclusion: The incidences of PE and APF detected by CTPA reveal no significant differences between various age groups.
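
    The corrected threshold is consistent with a Bonferroni adjustment over the 21 pairwise comparisons among seven groups (0.05/21 ≈ 0.0024). A minimal sketch of this testing scheme, with hypothetical counts rather than the study's data:

        from itertools import combinations
        from scipy.stats import chi2_contingency, fisher_exact

        # hypothetical (PE-positive, PE-negative) counts per age group
        counts = {"<=29": (12, 93), "30-39": (15, 101), "40-49": (21, 120),
                  "50-59": (30, 150), "60-69": (41, 180), "70-79": (52, 210),
                  ">=80": (48, 190)}

        groups = list(counts)
        pairs = list(combinations(groups, 2))
        alpha = 0.05 / len(pairs)                    # ~0.0024 for 7 groups

        chi2, p_omnibus, dof, _ = chi2_contingency([counts[g] for g in groups])
        print(f"omnibus chi-square: p = {p_omnibus:.3f}")

        for g1, g2 in pairs:
            _, p = fisher_exact([counts[g1], counts[g2]])
            flag = "significant" if p < alpha else "n.s."
            print(f"{g1} vs {g2}: p = {p:.4f} ({flag})")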

  16. Designing and manufacturing an auricular prosthesis using computed tomography, 3-dimensional photographic imaging, and additive manufacturing: a clinical report.

    Science.gov (United States)

    Liacouras, Peter; Garnes, Jonathan; Roman, Norberto; Petrich, Anton; Grant, Gerald T

    2011-02-01

    Fabricating an auricular prosthesis by digitally positioning a mirror image of the soft tissue and then designing and producing the mold with rapid prototyping can reduce the steps and time needed compared with the traditional approach of sculpting either wax or clay. The purpose of this clinical report is to illustrate how the use of 3-dimensional (3-D) photography, computer technology, and additive manufacturing can substantially reduce many of the preliminary procedures currently used to create an auricular prosthesis.

  17. States and compacts: Issues and events affecting facility development efforts, including the Barnwell opening

    Energy Technology Data Exchange (ETDEWEB)

    Larson, G.S.

    1995-12-31

    Ten years have passed since the first regional low-level radioactive waste compacts received Congressional consent and initiated their efforts to develop new disposal capacity. During these 10 years, both significant achievements and serious setbacks have marked our efforts and affect our current outlook. Recent events in the waste marketplace, particularly in the operating status of the Barnwell disposal facility, have now raised legitimate questions about the continued rationale for the regional framework that grew out of the original legislation enacted by Congress in 1980. At the same time, licensing activities for new regional disposal facilities are under way in three states, and a fourth awaits the final go-ahead to begin construction. Uncertainty over the meaning and reliability of the marketplace events makes it difficult to gauge long-term implications. In addition, differences in the status of individual state and compact facility development efforts lead to varying assessments of the influence these events will, or should, have on such efforts.

  18. States and compacts: Issues and events affecting facility development efforts, including the Barnwell opening

    International Nuclear Information System (INIS)

    Larson, G.S.

    1995-01-01

    Ten years have passed since the first regional low-level radioactive waste compacts received Congressional consent and initiated their efforts to develop new disposal capacity. During these 10 years, both significant achievements and serious setbacks have marked our efforts and affect our current outlook. Recent events in the waste marketplace, particularly in the operating status of the Barnwell disposal facility, have now raised legitimate questions about the continued rationale for the regional framework that grew out of the original legislation enacted by Congress in 1980. At the same time, licensing activities for new regional disposal facilities are under way in three states, and a fourth awaits the final go-ahead to begin construction. Uncertainty over the meaning and reliability of the marketplace events makes it difficult to gauge long-term implications. In addition, differences in the status of individual state and compact facility development efforts lead to varying assessments of the influence these events will, or should, have on such efforts.

  19. Quality-oriented efforts in IPD - a framework

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    1998-01-01

    It is generally expected that modern quality efforts like TQM and ISO9000 should deliver a sufficient framework for quality efforts in industrial companies. Our findings in Danish industry show a fragmented picture of islands of effort and a weak understanding of basic quality concepts between ... designers. The paper proposes a framework for quality efforts, illustrated by simple metaphors.

  20. Mission: Define Computer Literacy. The Illinois-Wisconsin ISACS Computer Coordinators' Committee on Computer Literacy Report (May 1985).

    Science.gov (United States)

    Computing Teacher, 1985

    1985-01-01

    Defines computer literacy and describes a computer literacy course which stresses ethics, hardware, and disk operating systems throughout. Core units on keyboarding, word processing, graphics, database management, problem solving, algorithmic thinking, and programing are outlined, together with additional units on spreadsheets, simulations,…

  1. Remote handling prospects. Computer aided remote handling

    International Nuclear Information System (INIS)

    Vertut, J.

    1984-01-01

    Mechanical manipulators, electrically controlled manipulators and computer-aided manipulators were developed in succession. The aim of computer-aided manipulators is to carry out complex or delicate jobs in adverse environments, but a human operator is still required for non-routine work or for situations that are evolving. The French effort is being developed within the framework of the project on automation and advanced robotics, and new problems have to be solved, particularly at the man/machine interface. [fr]

  2. Learning Environment and Student Effort

    Science.gov (United States)

    Hopland, Arnt O.; Nyhus, Ole Henning

    2016-01-01

    Purpose: The purpose of this paper is to explore the relationship between satisfaction with learning environment and student effort, both in class and with homework assignments. Design/methodology/approach: The authors use data from a nationwide and compulsory survey to analyze the relationship between learning environment and student effort. The…

  3. Coordinating a Supply Chain with a Loss-Averse Retailer and Effort Dependent Demand

    Science.gov (United States)

    Li, Liying

    2014-01-01

    This study investigates the channel coordination issue of a supply chain with a risk-neutral manufacturer and a loss-averse retailer facing stochastic demand that is sensitive to sales effort. Under the loss-averse newsvendor setting, a distribution-free gain/loss-sharing-and-buyback (GLB) contract has been shown to be able to coordinate the supply chain. However, we find that a GLB contract remains ineffective in managing the supply chain when retailer sales efforts influence the demand. To effectively coordinate the channel, we propose to combine a GLB contract with a sales rebate and penalty (SRP) contract. In addition, we discover a special class of gain/loss contracts that can coordinate the supply chain and arbitrarily allocate the expected supply chain profit between the manufacturer and the retailer. We then analyze the effect of loss aversion on the retailer's decision-making behavior and supply chain performance. Finally, we perform a numerical study to illustrate the findings and gain additional insights. PMID:25197696

  4. Coordinating a supply chain with a loss-averse retailer and effort dependent demand.

    Science.gov (United States)

    Li, Liying; Wang, Yong

    2014-01-01

    This study investigates the channel coordination issue of a supply chain with a risk-neutral manufacturer and a loss-averse retailer facing stochastic demand that is sensitive to sales effort. Under the loss-averse newsvendor setting, a distribution-free gain/loss-sharing-and-buyback (GLB) contract has been shown to be able to coordinate the supply chain. However, we find that a GLB contract remains ineffective in managing the supply chain when retailer sales efforts influence the demand. To effectively coordinate the channel, we propose to combine a GLB contract with a sales rebate and penalty (SRP) contract. In addition, we discover a special class of gain/loss contracts that can coordinate the supply chain and arbitrarily allocate the expected supply chain profit between the manufacturer and the retailer. We then analyze the effect of loss aversion on the retailer's decision-making behavior and supply chain performance. Finally, we perform a numerical study to illustrate the findings and gain additional insights.
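
    As a toy illustration of the modeling setting (our sketch only; the demand form, parameters, and kinked utility are assumptions, not the paper's model), the following simulates a loss-averse retailer jointly choosing order quantity and sales effort:

        import numpy as np

        rng = np.random.default_rng(0)
        p, w, s = 10.0, 6.0, 2.0    # retail price, wholesale cost, salvage value
        lam = 2.0                   # loss aversion: losses weighted twice as heavily

        def expected_utility(q, e, n=20_000):
            """Monte Carlo estimate of the retailer's loss-averse expected utility."""
            demand = np.maximum(50 + 5 * e + rng.normal(0, 10, n), 0)  # effort shifts demand
            sales = np.minimum(q, demand)
            profit = p * sales + s * (q - sales) - w * q - 0.5 * e**2  # quadratic effort cost
            utility = np.where(profit >= 0, profit, lam * profit)      # kinked at zero
            return utility.mean()

        # crude grid search over order quantity q and effort e
        best = max((expected_utility(q, e), q, e)
                   for q in range(40, 95, 5) for e in range(0, 9))
        print("best (expected utility, q, e):", best)

    Raising lam in this sketch pushes the optimum toward smaller, safer order quantities, which is the qualitative effect of loss aversion analyzed in the paper.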

  5. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    Science.gov (United States)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science, and to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  6. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will affect users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office [Figure 2: Number of events per month, for 2012] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  8. From Computer-interpretable Guidelines to Computer-interpretable Quality Indicators: A Case for an Ontology.

    Science.gov (United States)

    White, Pam; Roudsari, Abdul

    2014-01-01

    In the United Kingdom's National Health Service, quality indicators are generally measured electronically by using queries and data extraction, resulting in overlap and duplication of query components. Electronic measurement of health care quality indicators could be improved through an ontology intended to reduce duplication of effort during healthcare quality monitoring. While much research has been published on ontologies for computer-interpretable guidelines, quality indicators have lagged behind. We aimed to determine progress on the use of ontologies to facilitate computer-interpretable healthcare quality indicators. We assessed potential for improvements to computer-interpretable healthcare quality indicators in England. We concluded that an ontology for a large, diverse set of healthcare quality indicators could benefit the NHS and reduce workload, with potential lessons for other countries.

  9. The role of additional computed tomography in the decision-making process on the secondary prevention in patients after systemic cerebral thrombolysis

    Directory of Open Access Journals (Sweden)

    Sobolewski P

    2015-12-01

    Piotr Sobolewski,1 Grzegorz Kozera,2 Wiktor Szczuchniak,1 Walenty M Nyka2 1Department of Neurology and Stroke, Unit of Holy Spirit Specialist Hospital in Sandomierz, Sandomierz, Poland; 2Department of Neurology, Medical University of Gdańsk, Gdańsk, Poland Introduction: Patients with ischemic stroke undergoing intravenous (iv) thrombolysis are routinely controlled with computed tomography on the second day to assess stroke evolution and hemorrhagic transformation (HT). However, the benefits of an additional computed tomography (aCT) performed over the next days after iv-thrombolysis have not been determined. Methods: We retrospectively screened 287 Caucasian patients with ischemic stroke who were consecutively treated with iv-thrombolysis from 2008 to 2012. The results of computed tomography performed on the second (control CT) and seventh (aCT) day after iv-thrombolysis were compared in 274 patients (95.5%); 13 subjects (4.5%), who died before the seventh day from admission, were excluded from the analysis. Results: aCTs revealed a higher incidence of HT than control CTs (14.2% vs 6.6%; P=0.003). Patients with HT on aCT showed a higher median National Institutes of Health Stroke Scale score on admission than those without HT (13.0 vs 10.0; P=0.01) and a higher prevalence of ischemic changes >1/3 of the middle cerebral artery territory (66.7% vs 35.2%; P<0.01). Correlations existed between the presence of HT on aCT and the National Institutes of Health Stroke Scale score on admission (rpbi 0.15; P<0.01) and the ischemic changes >1/3 middle cerebral artery (phi=0.03), and the presence of HT on aCT was associated with 3-month mortality (phi=0.03). Conclusion: aCT after iv-thrombolysis enables higher detection of HT, which is related to higher 3-month mortality. Thus, patients with severe middle cerebral artery infarction may benefit from aCT in the decision-making process on secondary prophylaxis. Keywords: ischemic stroke, iv
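
    The association measures quoted (point-biserial r, phi) follow from the underlying contingency tables; for a 2x2 table, the phi coefficient is sqrt(chi-square/N). A sketch with made-up counts, not the study's data:

        import numpy as np
        from scipy.stats import chi2_contingency

        # hypothetical 2x2 table: HT on aCT (yes/no) x 3-month mortality (yes/no)
        table = np.array([[8, 31], [22, 213]])
        chi2, p, _, _ = chi2_contingency(table, correction=False)
        phi = np.sqrt(chi2 / table.sum())
        print(f"phi = {phi:.2f}, p = {p:.3f}")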

  10. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small-processor-count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large-scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management, including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation, and selecting the optimum computational mathematics approaches.

  11. Computational Omics Pre-Awardees | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the pre-awardees of the Computational Omics solicitation. Working with NVIDIA Foundation's Compute the Cure initiative and Leidos Biomedical Research Inc., the NCI, through this solicitation, seeks to leverage computational efforts to provide tools for the mining and interpretation of large-scale publicly available ‘omics’ datasets.

  12. Solving Ratio-Dependent Predator-Prey System with Constant Effort Harvesting Using Homotopy Perturbation Method

    Directory of Open Access Journals (Sweden)

    Abdoul R. Ghotbi

    2008-01-01

    Due to the wide range of interest in the use of bioeconomic models to gain insight into the scientific management of renewable resources like fisheries and forestry, the homotopy perturbation method is employed to approximate the solution of the ratio-dependent predator-prey system with constant-effort prey harvesting. The results are compared with those obtained by the Adomian decomposition method and show that, for this model, fewer computations are needed than with the Adomian decomposition method.
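
    For a numerical reference point, a common textbook form of a ratio-dependent predator-prey system with constant-effort prey harvesting (the paper's exact equations and parameters may differ) can be integrated directly:

        from scipy.integrate import solve_ivp

        a, b, c, d, m = 1.0, 0.1, 0.8, 0.5, 1.0   # illustrative parameters
        E = 0.2                                    # constant harvesting effort on prey

        def rhs(t, z):
            x, y = z                               # prey, predator densities
            response = c * x / (x + m * y)         # ratio-dependent functional response
            return [x * (a - b * x) - response * y - E * x,
                    y * (response - d)]

        sol = solve_ivp(rhs, (0.0, 50.0), [2.0, 1.0])
        print("state at t=50:", sol.y[:, -1])

    Series methods such as homotopy perturbation approximate the same trajectories analytically; a numerical solution like this one is the usual benchmark for checking them.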

  13. Fulfilling the vision of autonomic computing

    OpenAIRE

    Dobson, Simon; Sterritt, Roy; Nixon, Paddy; Hinchey, Mike

    2010-01-01

    Efforts since 2001 to design self-managing systems have yielded many impressive achievements, yet the original vision of autonomic computing remains unfulfilled. Researchers must develop a comprehensive systems engineering approach to create effective solutions for next-generation enterprise and sensor systems.

  14. European questionnaire on the use of computer programmes in radiation dosimetry

    International Nuclear Information System (INIS)

    Gualdrini, G.; Tanner, R.; Terrisol, M.

    1999-01-01

    Because combining measurements with supplementary calculations can reduce the experimental effort required, computational methods that reproduce experimental data to a satisfactory quality can save time and money in radiation dosimetry as in other fields. The dramatic increase in computing power in recent years now permits the use of computational tools for dosimetry in routine applications as well. Many institutions dealing with radiation protection, however, have small groups which, in addition to their routine work, often cannot afford to specialise in the field of computational dosimetry. This means that not only experts but increasingly also casual users employ complicated computational tools such as general-purpose transport codes. This widespread use of computer programmes in radiation protection and dosimetry applications motivated the Concerted Action 'Investigation and Quality Assurance of Numerical Methods in Radiation Protection Dosimetry' of the 4th framework programme of the European Commission to prepare, distribute and evaluate a questionnaire on the use of such codes. A significant number of scientists from nearly all the countries of the European Community (and some countries outside Europe) completed the questionnaire, providing a satisfactory overview of the state of the art in this field. The results obtained from the questionnaire and summarised in the present report are felt to be indicative of the use of sophisticated computer codes within the European Community, although the group of participating scientists may not be a representative sample in a strict statistical sense. [it]

  15. Computational Exposure Science: An Emerging Discipline to ...

    Science.gov (United States)

    Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory's (NERL's) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source

  16. Effort-Based Decision-Making in Schizophrenia.

    Science.gov (United States)

    Culbreth, Adam J; Moran, Erin K; Barch, Deanna M

    2018-08-01

    Motivational impairment has long been associated with schizophrenia but the underlying mechanisms are not clearly understood. Recently, a small but growing literature has suggested that aberrant effort-based decision-making may be a potential contributory mechanism for motivational impairments in psychosis. Specifically, multiple reports have consistently demonstrated that individuals with schizophrenia are less willing than healthy controls to expend effort to obtain rewards. Further, this effort-based decision-making deficit has been shown to correlate with severity of negative symptoms and level of functioning, in many but not all studies. In the current review, we summarize this literature and discuss several factors that may underlie aberrant effort-based decision-making in schizophrenia.

  17. Fusion energy division computer systems network

    International Nuclear Information System (INIS)

    Hammons, C.E.

    1980-12-01

    The Fusion Energy Division of the Oak Ridge National Laboratory (ORNL), operated by Union Carbide Corporation Nuclear Division (UCC-ND), is primarily involved in the investigation of problems related to the use of controlled thermonuclear fusion as an energy source. The Fusion Energy Division supports investigations of experimental fusion devices and related fusion theory. This memo provides a brief overview of the computing environment in the Fusion Energy Division and the computing support provided to the experimental effort and theory research.

  18. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost-effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of codes available at the NMFECC for engineering computations is given.

  19. Kuwait poised for massive well kill effort

    Energy Technology Data Exchange (ETDEWEB)

    1991-04-08

    This paper reports that full-scale efforts to extinguish Kuwait's oil well fires are set to begin. The campaign to combat history's worst oil fires, originally expected to begin in mid-March, has been hamstrung by logistical problems, including delays in equipment deliveries caused by damage to Kuwait's infrastructure. Meanwhile, production from a key field off Kuwait, largely unaffected by the war, is expected to resume in May, but Kuwaiti oil exports will still be hindered by damaged onshore facilities. In addition, Kuwait is lining up equipment and personnel to restore production from its heavily damaged oil fields. Elsewhere in the Persian Gulf, Saudi Arabia reports progress in combating history's worst oil spills but acknowledges a continuing threat.

  20. Innovative Partnerships Assist Community College Computing Programs.

    Science.gov (United States)

    O'Banion, Terry

    1987-01-01

    Relates efforts of major corporations in providing assistance to community college computing programs. Explains the goals of the League for Innovation in the Community College, a consortium of 19 community colleges, and cites examples of collaborative projects. (ML)

  1. Dopamine, behavioral economics, and effort

    Directory of Open Access Journals (Sweden)

    John D Salamone

    2009-09-01

    There are numerous problems with the hypothesis that brain dopamine (DA) systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Research and theory related to the functions of mesolimbic DA are undergoing a substantial conceptual restructuring, with the traditional emphasis on hedonia and primary reward yielding to other concepts and lines of inquiry. The present review is focused upon the involvement of nucleus accumbens DA in behavioral activation and effort-related processes. Viewed from the framework of behavioral economics, the effects of accumbens DA depletions and antagonism on food-reinforced behavior are highly dependent upon the work requirements of the instrumental task, and DA-depleted rats are more sensitive to increases in response costs (i.e., ratio requirements). Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related choice behavior. Rats with accumbens DA depletions or antagonism reallocate their instrumental behavior away from food-reinforced tasks that have high response requirements, and instead select a less effortful type of food-seeking behavior. Nucleus accumbens DA and adenosine interact in the regulation of effort-related functions, and other brain structures (anterior cingulate cortex, amygdala, ventral pallidum) are also involved. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue or anergia in depression and other neurological disorders.

  2. What makes a reach movement effortful? Physical effort discounting supports common minimization principles in decision making and motor control.

    Directory of Open Access Journals (Sweden)

    Pierre Morel

    2017-06-01

    When deciding between alternative options, a rational agent chooses on the basis of the desirability of each outcome, including associated costs. As different options typically result in different actions, the effort associated with each action is an essential cost parameter. How do humans discount physical effort when deciding between movements? We used an action-selection task to characterize how subjective effort depends on the parameters of arm transport movements and controlled for potential confounding factors such as delay discounting and performance. First, by repeatedly asking subjects to choose between two arm movements of different amplitudes or durations, performed against different levels of force, we identified parameter combinations that subjects experienced as identical in effort (isoeffort curves). Movements with a long duration were judged more effortful than short-duration movements against the same force, while movement amplitudes did not influence effort. Biomechanics of the movements also affected effort, as movements towards the body midline were preferred to movements away from it. Second, by introducing movement repetitions, we further determined that the cost function for choosing between effortful movements had a quadratic relationship with force, while choices were made on the basis of the logarithm of these costs. Our results show that effort-based action selection during reaching cannot easily be explained by metabolic costs. Instead, force-loaded reaches, a widely occurring natural behavior, imposed an effort cost for decision making similar to cost functions in motor control. Our results thereby support the idea that motor control and economic choice are governed by partly overlapping optimization principles.
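
    In our notation (an interpretive summary, not the authors' equations), the findings amount to a subjective effort cost that grows quadratically in force and increases with movement duration, entering choice through its logarithm:

        \[ C(F, T) = k\,F^{2}\,h(T), \qquad U_{\text{choice}} \propto -\log C(F, T) \]

    with F the required force, T the movement duration, h an increasing function of duration, and k a scaling constant.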

  3. "I put in effort, therefore I am passionate": Investigating the path from effort to passion in entrepreneurship

    OpenAIRE

    Gielnik, Michael Marcus; Spitzmuller, Matthias; Schmitt, Antje; Klemann, Katharina; Frese, Michael

    2015-01-01

    Most theoretical frameworks in entrepreneurship emphasize that entrepreneurial passion drives entrepreneurial effort. We hypothesize that the reverse effect is also true, and investigate changes in passion as an outcome of effort. Based on theories of self-regulation and self-perception, we hypothesize that making new venture progress and free choice are two factors that help to explain why and under which conditions entrepreneurial effort affects entrepreneurial passion. We undertook two stu...

  4. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  5. Integrating computation into the undergraduate curriculum: A vision and guidelines for future developments

    Science.gov (United States)

    Chonacky, Norman; Winch, David

    2008-04-01

    There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.

  6. Computers and Instruction: Implications of the Rising Tide of Criticism for Reading Education.

    Science.gov (United States)

    Balajthy, Ernest

    1988-01-01

    Examines two major reasons that schools have adopted computers without careful prior examination and planning. Surveys a variety of criticisms targeted toward some aspects of computer-based instruction in reading in an effort to direct attention to the beneficial implications of computers in the classroom. (MS)

  7. Effect of effort-reward imbalance and burnout on infection control among Ecuadorian nurses.

    Science.gov (United States)

    Colindres, C V; Bryce, E; Coral-Rosero, P; Ramos-Soto, R M; Bonilla, F; Yassi, A

    2018-06-01

    Nurses are frequently exposed to transmissible infections, yet adherence to infection control measures is suboptimal. There has been inadequate research into how the psychosocial work environment affects compliance with infection control measures, especially in low- and middle-income countries. To examine the association between effort-reward imbalance, burnout and adherence to infection control measures among nurses in Ecuador. A cross-sectional study linking psychosocial work environment indicators to infection control adherence. The study was conducted among 333 nurses in four Ecuadorian hospitals. Self-administered questionnaires assessed demographic variables, perceived infection risk, effort-reward imbalance, burnout and infection control adherence. Increased effort-reward imbalance was found to be a unique incremental predictor of exposure to burnout, and burnout was a negative unique incremental predictor of nurses' self-reported adherence to infection control measures. Results suggest an effort-reward imbalance-burnout continuum which, at higher levels, contributes to reduced adherence to infection control. The Ecuadorian government has made large efforts to improve universal access to health care, yet this study suggests that workplace demands on nurses remain problematic. This study highlights the contribution of the effort-reward imbalance-burnout continuum to the chain of infection via nurses' decreased adherence to infection control. Health authorities should closely monitor the effect of new policies on the psychosocial work environment, especially when expanding services and increasing public accessibility with limited resources. Additionally, organizational and psychosocial interventions targeting effort-reward imbalance and burnout in nurses should be considered part of a complete infection prevention and control strategy. Further study is warranted to identify interventions that best ameliorate effort-reward imbalance and burnout in low- and middle

  8. Low-cost addition-subtraction sequences for the final exponentiation computation in pairings

    DEFF Research Database (Denmark)

    Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis

    2014-01-01

    In this paper, we address the problem of finding low-cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
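
    The setting, doubling steps (squarings in the exponentiation) being much cheaper than non-doubling ones, is the same one that motivates signed-digit exponentiation in general. As a generic illustration (not the authors' algorithm, which targets pairing-specific exponents), a non-adjacent form (NAF) drives an exponentiation with few non-doubling steps:

        from fractions import Fraction

        def naf(k):
            """Signed-digit representation of k with no two adjacent non-zero digits."""
            digits = []
            while k:
                if k & 1:
                    d = 2 - (k % 4)          # digit in {-1, +1}
                    k -= d
                else:
                    d = 0
                digits.append(d)
                k >>= 1
            return digits[::-1]              # most significant digit first

        def pow_naf(x, k, mul, sqr, inv):
            """Exponentiation driven by the NAF of k; cheap doublings dominate."""
            x_inv = inv(x)
            acc = None
            for d in naf(k):
                if acc is not None:
                    acc = sqr(acc)           # the cheap step
                if d == 1:
                    acc = x if acc is None else mul(acc, x)
                elif d == -1:
                    acc = x_inv if acc is None else mul(acc, x_inv)
            return acc

        # toy check over the rationals, where inversion is exact
        ops = dict(mul=lambda a, b: a * b, sqr=lambda a: a * a, inv=lambda a: 1 / a)
        assert pow_naf(Fraction(3), 23, **ops) == Fraction(3) ** 23

    The NAF of 23 is (1, 0, -1, 0, 0, -1), requiring only two non-doubling multiplications (plus one inversion), where the binary expansion 10111 requires three.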

  9. ProjectQ: an open source software framework for quantum computing

    Directory of Open Access Journals (Sweden)

    Damian S. Steiger

    2018-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through simulation and enables running them on actual quantum hardware using a back-end connecting to the IBM Quantum Experience cloud service. Through extension mechanisms, users can provide back-ends to further quantum hardware, and scientists working on quantum compilation can provide plug-ins for additional compilation, optimization, gate synthesis, and layout strategies.
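
    For a flavor of the framework, here is a minimal program in the style of ProjectQ's published examples, a fair-coin flip on the built-in simulator:

        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()              # defaults to the built-in simulator
        qubit = eng.allocate_qubit()
        H | qubit                       # the Python-embedded DSL syntax
        Measure | qubit
        eng.flush()                     # send the circuit to the back-end
        print("Measured:", int(qubit))

    Swapping the back-end passed to MainEngine is what retargets the same program at other simulators or at real hardware.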

  10. Multidisciplinary Efforts Driving Translational Theranostics

    Science.gov (United States)

    Hu, Tony Y.

    2014-01-01

    This themed issue summarizes significant efforts aimed at using “biological language” to discern between “friends” and “foes” in the context of theranostics for true clinical application. It is expected that the success of theranostics depends on multidisciplinary efforts, combined to expedite our understanding of host responses to “customized” theranostic agents and to formulate individualized therapies. PMID:25285169

  11. Carbon emissions in China: How far can new efforts bend the curve?

    International Nuclear Information System (INIS)

    Zhang, Xiliang; Karplus, Valerie J.; Qi, Tianyu; Zhang, Da; He, Jiankun

    2016-01-01

    While China is on track to meet its global climate commitments through 2020, China's post-2020 CO2 emissions trajectory is highly uncertain, with projections varying widely across studies. Over the past year, the Chinese government has announced new policy directives to deepen economic reform, to protect the environment, and to limit fossil energy use in China. To evaluate how new policy directives could affect energy and climate change outcomes, we simulate two levels of policy effort—a continued effort scenario that extends current policies beyond 2020 and an accelerated effort scenario that reflects newly announced policies—on the evolution of China's energy and economic system over the next several decades. We perform simulations using the China-in-Global Energy Model, C-GEM, a bespoke recursive-dynamic computable general equilibrium model with global coverage and detailed calibration of China's economy and future trends. Importantly, we find that both levels of policy effort would bend down the CO2 emissions trajectory before 2050 without undermining economic development. Specifically, in the accelerated effort scenario, we find that coal use peaks around 2020, and CO2 emissions level off around 2030 at 10 bmt, without undermining continued economic growth consistent with China reaching the status of a “well-off society” by 2050. - Highlights: • We develop a simulation model that captures energy system and technological detail in China. • We simulate China's recently announced climate and energy policies. • New policies in China are consistent with peak coal around 2020, and peak CO2 emissions around 2030. • New policies cause modest leakage of coal use outside of China, especially to Southeast Asia.

  12. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  13. Current-voltage curves for molecular junctions computed using all-electron basis sets

    International Nuclear Information System (INIS)

    Bauschlicher, Charles W.; Lawson, John W.

    2006-01-01

    We present current-voltage (I-V) curves computed using all-electron basis sets on the conducting molecule. The all-electron results are very similar to previous results obtained using effective core potentials (ECP). A hybrid integration scheme is used that keeps the all-electron calculations cost competitive with respect to the ECP calculations. By neglecting the coupling of states to the contacts below a fixed energy cutoff, the density matrix for the core electrons can be evaluated analytically. The full density matrix is formed by adding this core contribution to the valence part that is evaluated numerically. Expanding the definition of the core in the all-electron calculations significantly reduces the computational effort and, up to biases of about 2 V, the results are very similar to those obtained using more rigorous approaches. The convergence of the I-V curves and transmission coefficients with respect to basis set is discussed. The addition of diffuse functions is critical in approaching basis set completeness
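
    As background on the core/valence splitting described above (our paraphrase of the standard Green's-function treatment, not the paper's exact expressions): in equilibrium the density matrix is an energy integral over the retarded Green's function,

        \[ \rho = -\frac{1}{\pi} \int_{-\infty}^{\infty} f(E)\, \mathrm{Im}\, G^{R}(E)\, dE \]

    and decoupling the states below a fixed energy cutoff from the contacts lets the portion of the integral below the cutoff be evaluated analytically, leaving only the valence energy window to be integrated numerically.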

  14. Effort levels of the partners in networked manufacturing

    Science.gov (United States)

    Chai, G. R.; Cai, Z.; Su, Y. N.; Zong, S. L.; Zhai, G. Y.; Jia, J. H.

    2017-08-01

    Compared with the traditional manufacturing mode, can networked manufacturing improve the effort levels of the partners? What factors affect the partners' effort levels? How can the partners be encouraged to improve their effort levels? To answer these questions, we introduce a network effect coefficient to build an effort-level model of the partners in networked manufacturing. The results show that (1) as the network effect in networked manufacturing increases, the actual effort level can exceed the ideal level of traditional manufacturing; (2) profit allocation based on the marginal contribution rate helps improve the effort levels of the partners in networked manufacturing; and (3) partners in networked manufacturing who wish to obtain a larger distribution ratio must exert a higher effort level, and enterprises with insufficient effort should be removed from the networked manufacturing arrangement.

  15. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  16. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
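
    To make the offloading trade-off concrete, here is a textbook-style feasibility check (a simplified sketch with assumed parameters; it is not the decision logic of the framework above):

        def should_offload(cycles, data_bytes, s_local, s_cloud, bandwidth,
                           p_compute, p_idle, p_tx):
            """Offload only if it wins on both execution time and device energy."""
            t_local = cycles / s_local
            t_remote = cycles / s_cloud + data_bytes / bandwidth
            e_local = p_compute * t_local                      # device busy computing
            e_remote = (p_tx * (data_bytes / bandwidth)        # device transmitting
                        + p_idle * (cycles / s_cloud))         # device idle, cloud working
            return t_remote < t_local and e_remote < e_local

        # e.g. 5e9 CPU cycles, 2 MB of state, cloud 10x faster, 1 MB/s uplink
        print(should_offload(5e9, 2e6, s_local=1e9, s_cloud=1e10,
                             bandwidth=1e6, p_compute=0.9, p_idle=0.3, p_tx=1.3))

    With these assumed numbers, remote execution wins on both time (2.5 s vs 5 s) and energy (2.75 J vs 4.5 J), so the component would be offloaded.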

  17. Audit Report "Department of Energy Efforts to Manage Information Technology Resources in an Energy-Efficient and Environmentally Responsible Manner"

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-05-01

    The American Recovery and Reinvestment Act of 2009 emphasizes energy efficiency and conservation as critical to the Nation's economic vitality; its goal of reducing dependence on foreign energy sources; and, related efforts to improve the environment. The Act highlights the significant use of various forms of energy in the Federal sector and promotes efforts to improve the energy efficiency of Federal operations. One specific area of interest is the increasing demand for Federal sector computing resources and the corresponding increase in energy use, with both cost and environmental implications. The U.S. Environmental Protection Agency reported that, without aggressive conservation measures, data center energy consumption alone is expected to double over the next five years. In our report on Management of the Department's Data Centers at Contractor Sites (DOE/IG-0803, October 2008) we concluded that the Department of Energy had not always improved the efficiency of its contractor data centers even when such modifications were possible and practical. Despite its recognized energy conservation leadership role, the Department had not always taken advantage of opportunities to reduce energy consumption associated with its information technology resources. Nor, had it ensured that resources were managed in a way that minimized impact on the environment. In particular: (1) The seven Federal and contractor sites included in our review had not fully reduced energy consumption through implementation of power management settings on their desktop and laptop computers; and, as a consequence, spent $1.6 million more on energy costs than necessary in Fiscal Year 2008; (2) None of the sites reviewed had taken advantage of opportunities to reduce energy consumption, enhance cyber security, and reduce costs available through the use of techniques, such as 'thin-client computing' in their unclassified environments; and, (3) Sites had not always taken the

  18. Standardization efforts in IP telephony

    Science.gov (United States)

    Sengodan, Senthil; Bansal, Raj

    1999-11-01

    The recent interest in IP telephony has led to a tremendous increase of standardization activities in the area. The three main standards bodies in the area of IP telephony are the International Telecommunication Union's (ITU-T) Study Group (SG) 16, the Internet Engineering Task Force (IETF) and the European Telecommunication Standards Institute's (ETSI) TIPHON project. In addition, forums such as the International Multimedia Teleconferencing Consortium (IMTC), the Intelligent Network Forum (INF), the International Softswitch Consortium (ISC), the Electronic Computer Telephony Forum (ECTF), and the MIT's Internet Telephony Consortium (ITC) are looking into various other aspects that aim at the growth of this industry. This paper describes the main tasks (completed and in progress) undertaken by these organizations. In describing such work, an overview of the underlying technology is also provided.

  19. Is overall similarity classification less effortful than single-dimension classification?

    Science.gov (United States)

    Wills, Andy J; Milton, Fraser; Longmore, Christopher A; Hester, Sarah; Robinson, Jo

    2013-01-01

    It is sometimes argued that the implementation of an overall similarity classification is less effortful than the implementation of a single-dimension classification. In the current article, we argue that the evidence securely in support of this view is limited, and report additional evidence in support of the opposite proposition--overall similarity classification is more effortful than single-dimension classification. Using a match-to-standards procedure, Experiments 1A, 1B and 2 demonstrate that concurrent load reduces the prevalence of overall similarity classification, and that this effect is robust to changes in the concurrent load task employed, the level of time pressure experienced, and the short-term memory requirements of the classification task. Experiment 3 demonstrates that participants who produced overall similarity classifications from the outset have larger working memory capacities than those who produced single-dimension classifications initially, and Experiment 4 demonstrates that instructions to respond meticulously increase the prevalence of overall similarity classification.

  20. Reduction of community alcohol problems: computer simulation experiments in three counties.

    Science.gov (United States)

    Holder, H D; Blose, J O

    1987-03-01

    A series of alcohol abuse prevention strategies was evaluated using computer simulation for three counties in the United States: Wake County, North Carolina; Washington County, Vermont; and Alameda County, California. A system dynamics model composed of a network of interacting variables was developed for the pattern of alcoholic beverage consumption in a community. The relationship of community drinking patterns to various stimulus factors was specified in the model based on available empirical research. Stimulus factors included disposable income, alcoholic beverage prices, advertising exposure, minimum drinking age and changes in cultural norms. After a generic model was developed and validated at the national level, a computer-based system dynamics model was developed for each county, and a series of experiments was conducted to project the potential impact of specific prevention strategies. The project concluded that prevention efforts can both lower current levels of alcohol abuse and reduce projected increases in alcohol-related problems. Without such efforts, already high levels of alcohol-related family disruptions in the three counties could be expected to rise by an additional 6%, and drinking-related work problems by 1-5%, over the next 10 years, after controlling for population growth. Of the strategies tested, indexing the price of alcoholic beverages to the consumer price index, in conjunction with the implementation of a community educational program with well-defined target audiences, has the best potential for significant problem reduction in all three counties.
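
    The flavor of such a system dynamics experiment can be conveyed in a few lines (entirely illustrative: the model structure, elasticities, and adjustment speed are our assumptions, not those of the county models). Indexing holds the real price of alcohol constant, while without indexing inflation erodes it:

        def simulate(years=10, price_indexed=False):
            """Toy stock-adjustment model of per-capita consumption."""
            consumption = 10.0                     # arbitrary baseline units
            real_price, income = 1.0, 1.0
            for _ in range(years):
                income *= 1.02                     # assumed 2% real income growth
                if not price_indexed:
                    real_price *= 0.98             # inflation erodes an unindexed price
                target = 10.0 * (income / real_price) ** 0.5   # assumed elasticities
                consumption += 0.3 * (target - consumption)    # gradual adjustment
            return consumption

        print("unindexed price:", round(simulate(), 2))
        print("indexed price:  ", round(simulate(price_indexed=True), 2))

    Even this toy version reproduces the qualitative conclusion: indexing damps the projected growth in consumption relative to the unindexed baseline.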

  1. Measuring collections effort improves cash performance.

    Science.gov (United States)

    Shutts, Joe

    2009-09-01

    Having a satisfied work force can lead to an improved collections effort. Hiring the right people and training them ensures employee engagement. Measuring collections effort and offering incentives is key to revenue cycle success.

  2. Current status of the MPEG-4 standardization effort

    Science.gov (United States)

    Anastassiou, Dimitris

    1994-09-01

    The Moving Pictures Experts Group (MPEG) of the International Standardization Organization has initiated a standardization effort, known as MPEG-4, addressing generic audiovisual coding at very low bit-rates (up to 64 kbits/s) with applications in videotelephony, mobile audiovisual communications, video database retrieval, computer games, video over Internet, remote sensing, etc. This paper gives a survey of the status of MPEG-4, including its planned schedule, and initial ideas about requirements and applications. A significant part of this paper is summarizing an incomplete draft version of a "requirements document" which presents specifications of desirable features on the video, audio, and system level of the forthcoming standard. Very low bit-rate coding algorithms are not described, because no endorsement of any particular algorithm, or class of algorithms, has yet been made by MPEG-4, and several seminars held concurrently with MPEG-4 meetings have not so far provided evidence that such high performance coding schemes are achievable.

  3. Resource scarcity, effort, and performance in physically demanding jobs: An evolutionary explanation.

    Science.gov (United States)

    Pitesa, Marko; Thau, Stefan

    2018-03-01

    Based on evolutionary theory, we predicted that cues of resource scarcity in the environment (e.g., news of droughts or food shortages) lead people to reduce their effort and performance in physically demanding work. We tested this prediction in a 2-wave field survey among employees and replicated it experimentally in the lab. In Study 1, employees who perceived resources in the environment to be scarce reported exerting less effort when their jobs involved much (but not little) physical work. In Study 2, participants who read that resources in the environment were scarce performed worse on a task demanding more (carrying books) but not less (transcribing book titles) physical work. This result was found even though better performance increased participants' chances of additional remuneration, and even though scarcity cues did not affect individuals' actual ability to meet their energy needs. We discuss implications for managing effort and performance, and the potential of evolutionary psychology to explain core organizational phenomena. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Grids in Europe - a computing infrastructure for science

    International Nuclear Information System (INIS)

    Kranzlmueller, D.

    2008-01-01

    Grids provide virtually unlimited computing power and access to a variety of resources to today's scientists. Moving from a research topic of computer science to a commodity tool for science and research in general, grid infrastructures are being built all around the world. This talk provides an overview of the development of grids in Europe, the status of the so-called national grid initiatives, and the efforts towards an integrated European grid infrastructure. The latter, summarized under the title of the European Grid Initiative (EGI), promises a permanent and reliable grid infrastructure and its services in a way similar to research networks today. The talk describes the status of these efforts, the plans for the setup of this pan-European e-Infrastructure, and the benefits for the application communities. (author)

  5. 20 CFR 404.278 - Additional cost-of-living increase.

    Science.gov (United States)

    2010-04-01

    20 CFR, Employees' Benefits; Federal Old-Age, Survivors and Disability Insurance (1950- ); Computing Primary Insurance Amounts; Cost-of-Living Increases; § 404.278 Additional cost-of-living increase. (a) General. In addition to the cost-of-living increase explained in...

  6. Computational structural mechanics for engine structures

    Science.gov (United States)

    Chamis, C. C.

    1989-01-01

    The computational structural mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects of formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance, durability, and life of engine structures. It is structured mainly to supplement, complement, and, whenever possible, replace costly experimental efforts that are unavoidable during engineering research and development programs. Specific objectives include: investigating the unique advantages of parallel and multiprocessor architectures for reformulating and solving structural mechanics problems and for formulating and solving multidisciplinary mechanics problems; and developing integrated structural-system computational simulators for predicting structural performance, evaluating newly developed methods, and identifying and prioritizing improved or missing methods. Herein the CSM program is summarized with emphasis on the Engine Structures Computational Simulator (ESCS). Typical results obtained using ESCS are described to illustrate its versatility.

  7. Many Masses on One Stroke: Economic Computation of Quark Propagators

    Science.gov (United States)

    Frommer, Andreas; Nöckel, Bertold; Güsken, Stephan; Lippert, Thomas; Schilling, Klaus

    The computational effort in the calculation of Wilson fermion quark propagators in Lattice Quantum Chromodynamics can be considerably reduced by exploiting the Wilson fermion matrix structure in inversion algorithms based on the non-symmetric Lanczos process. We consider two such methods: QMR (quasi-minimal residual) and BCG (biconjugate gradients). Based on the decomposition M/κ = 1/κ − D of the Wilson mass matrix, using QMR, one can carry out inversions on a whole trajectory of masses simultaneously, merely at the computational expense of a single propagator computation. In other words, one has to compute the propagator corresponding to the lightest mass only, while all the heavier masses are given for free, at the price of extra storage. Moreover, the symmetry γ5M = M†γ5 can be used to cut the computational effort in QMR and BCG by a factor of two. We show that both methods then become — in the critical regime of small quark masses — competitive with BiCGStab and significantly better than the standard MR method, with optimal relaxation factor, and CG as applied to the normal equations.
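
    The "many masses at one stroke" property can be read as an instance of the shift invariance of Krylov subspaces. A brief sketch in LaTeX, in our notation rather than the authors', starting from the decomposition quoted above:

        % Each hopping parameter kappa defines a shifted system with the same D and b:
        \[
          \frac{M(\kappa)}{\kappa}\, x = b
          \quad\Longleftrightarrow\quad
          (\sigma \mathbb{1} - D)\, x = b,
          \qquad \sigma = \frac{1}{\kappa},
        \]
        % and Krylov subspaces are invariant under such diagonal shifts:
        \[
          \mathcal{K}_k(\sigma \mathbb{1} - D,\, b)
          = \operatorname{span}\bigl\{ b,\, (\sigma \mathbb{1} - D)\, b,\, \dots,\, (\sigma \mathbb{1} - D)^{k-1} b \bigr\}
          = \mathcal{K}_k(D,\, b).
        \]

    One Lanczos process built on K_k(D, b) therefore serves every shift, i.e. every mass, at once; only the scalar recurrence coefficients depend on σ, which is why the heavier masses cost no additional matrix-vector products, only the extra storage for their iterates.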

  8. Greater effort increases perceived value in an invertebrate.

    Science.gov (United States)

    Czaczkes, Tomer J; Brandstetter, Birgit; di Stefano, Isabella; Heinze, Jürgen

    2018-05-01

    Expending effort is generally considered to be undesirable. However, both humans and vertebrates will work for a reward they could also get for free. Moreover, cues associated with high-effort rewards are preferred to low-effort associated cues. Many explanations for these counterintuitive findings have been suggested, including cognitive dissonance (self-justification) or a greater contrast in state (e.g., energy or frustration level) before and after an effort-linked reward. Here, we test whether effort expenditure also increases perceived value in ants, using both classical cue-association methods and pheromone deposition, which correlates with perceived value. In 2 separate experimental setups, we show that pheromone deposition is higher toward the reward that requires more effort: 47% more pheromone deposition was performed for rewards reached via a vertical runway (high effort) compared with ones reached via a horizontal runway (low effort), and deposition rates were 28% higher on rough (high effort) versus smooth (low effort) runways. Using traditional cue-association methods, 63% of ants trained on different surface roughness, and 70% of ants trained on different runway elevations, preferred the high-effort related cues on a Y maze. Finally, pheromone deposition to feeders requiring memorization of one path bifurcation was up to 29% higher than to an identical feeder requiring no learning. Our results suggest that effort affects value perception in ants. This effect may stem from a cognitive process, which monitors the change in a generalized hedonic state before and after reward. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. Fundamental Drop Dynamics and Mass Transfer Experiments to Support Solvent Extraction Modeling Efforts

    International Nuclear Information System (INIS)

    Christensen, Kristi; Rutledge, Veronica; Garn, Troy

    2011-01-01

    In support of the Nuclear Energy Advanced Modeling Simulation Safeguards and Separations (NEAMS SafeSep) program, the Idaho National Laboratory (INL) worked in collaboration with Los Alamos National Laboratory (LANL) to further a modeling effort designed to predict mass transfer behavior for selected metal species between individual dispersed drops and a continuous phase in a two-phase liquid-liquid extraction (LLE) system. The purpose of the model is to understand the fundamental processes of mass transfer that occur at the drop interface. This fundamental understanding can be extended to support modeling of larger LLE equipment such as mixer settlers, pulse columns, and centrifugal contactors. The work performed at the INL involved gathering the necessary experimental data to support the modeling effort. A custom experimental apparatus was designed and built for performing drop contact experiments to measure mass transfer coefficients as a function of contact time. A high-speed digital camera was used in conjunction with the apparatus to measure size, shape, and velocity of the drops. In addition to drop data, the physical properties of the experimental fluids were measured to be used as input data for the model. Physical property measurements included density, viscosity, surface tension and interfacial tension. Additionally, self-diffusion coefficients for the selected metal species in each experimental solution were measured, and the distribution coefficient for the metal partitioning between phases was determined. At the completion of this work, the INL has determined the mass transfer coefficient and a velocity profile for drops rising by buoyancy through a continuous medium under a specific set of experimental conditions. Additionally, a complete set of experimentally determined fluid properties has been obtained. All data will be provided to LANL to support the modeling effort.

  10. How Consumer Trust in Financial Institutions Influences Relationships Between Knowledge, Cognitive Effort and Financial Healthiness

    DEFF Research Database (Denmark)

    Hansen, Torben

    2014-01-01

    Trust not only relates to customer trust in individual financial companies (i.e., narrow-scope trust) but also relates to the broader business context in which consumers carry out their financial decisions (i.e., broad-scope trust). Based on two surveys comprising 1,155 bank consumers and 764 pension consumers, respectively, the results of this study indicate that broad-scope trust negatively moderates relations between knowledge and financial healthiness and between cognitive effort and financial healthiness. In addition, it is demonstrated that broad-scope trust negatively influences cognitive effort and positively influences financial healthiness.

  11. Nash Stability in Additively Separable Hedonic Games and Community Structures

    DEFF Research Database (Denmark)

    Olsen, Martin

    2009-01-01

    We prove that the problem of deciding whether a Nash stable partition exists in an Additively Separable Hedonic Game is NP-complete. We also show that the problem of deciding whether a non-trivial Nash stable partition exists in an Additively Separable Hedonic Game with non-negative and symmetric preferences is NP-complete. We motivate our study of the computational complexity by linking Nash stable partitions in Additively Separable Hedonic Games to community structures in networks. Our results formally justify that computing community structures in general is hard.
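
    The objects in this result are simple to state in code, which makes the hardness concrete: the NP-complete question is whether a Nash stable partition exists, while merely checking a candidate partition is easy. Below is a small illustrative Python sketch under the abstract's definitions (all names are ours); an agent's utility is the sum of its preferences for its coalition mates, and a partition is Nash stable when no agent strictly gains by joining another coalition or going alone.

        # Nash stability check for an additively separable hedonic game.
        # prefs[i][j] is agent i's value for agent j; an agent's utility is
        # the sum of its preferences over the members of its own coalition.

        def utility(agent, coalition, prefs):
            return sum(prefs[agent][other] for other in coalition if other != agent)

        def is_nash_stable(partition, prefs):
            for agent in range(len(prefs)):
                home = next(c for c in partition if agent in c)
                current = utility(agent, home, prefs)
                # Possible deviations: join any other coalition, or break off alone.
                for target in [c for c in partition if c is not home] + [set()]:
                    if utility(agent, target | {agent}, prefs) > current:
                        return False
            return True

        prefs = [[0, 5, -2], [5, 0, 1], [-2, 1, 0]]    # symmetric toy instance
        print(is_nash_stable([{0, 1}, {2}], prefs))    # True: nobody wants to move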

  12. Computational Amphiphilic Materials for Drug Delivery

    Directory of Open Access Journals (Sweden)

    Naresh eThota

    2015-10-01

    Full Text Available Amphiphilic materials can assemble into a wide variety of morphologies and have emerged as a novel class of candidates for drug delivery. Along with the large number of experiments reported, computational studies have also been conducted in this field. At an atomistic/molecular level, computations can facilitate quantitative understanding of experimental observations and secure fundamental interpretation of the underlying phenomena. This review summarizes recent computational efforts on amphiphilic copolymers and peptides for drug delivery. Atomic-resolution and time-resolved insights are provided from the bottom up to microscopically elucidate the mechanisms of drug loading/release, which are indispensable for the rational screening and design of new amphiphiles for high-efficacy drug delivery.

  13. Computational design of RNAs with complex energy landscapes.

    Science.gov (United States)

    Höner zu Siederdissen, Christian; Hammer, Stefan; Abfalter, Ingrid; Hofacker, Ivo L; Flamm, Christoph; Stadler, Peter F

    2013-12-01

    RNA has become an integral building material in synthetic biology. Dominated by their secondary structures, which can be computed efficiently, RNA molecules are amenable not only to in vitro and in vivo selection, but also to rational, computation-based design. While the inverse folding problem of constructing an RNA sequence with a prescribed ground-state structure has received considerable attention for nearly two decades, there have been few efforts to design RNAs that can switch between distinct prescribed conformations. We introduce a user-friendly tool for designing RNA sequences that fold into multiple target structures. The underlying algorithm makes use of a combination of graph coloring and heuristic local optimization to find sequences whose energy landscapes are dominated by the prescribed conformations. A flexible interface allows the specification of a wide range of design goals. We demonstrate that bi- and tri-stable "switches" can be designed easily with moderate computational effort for the vast majority of compatible combinations of desired target structures. RNAdesign is freely available under the GPL-v3 license. Copyright © 2013 Wiley Periodicals, Inc.
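
    The search strategy described, sampling a sequence and improving it by local moves against several target structures at once, can be caricatured in a few lines. The Python sketch below is ours and is not RNAdesign's algorithm: the scoring stub merely counts Watson-Crick/wobble-compatible base pairs in each target's dot-bracket structure, standing in for the real energy-landscape objective.

        import random

        COMPATIBLE = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
                      ("G", "U"), ("U", "G")}

        def pairs(dotbracket):
            """Base-pair index pairs from a dot-bracket string."""
            stack, out = [], []
            for i, ch in enumerate(dotbracket):
                if ch == "(":
                    stack.append(i)
                elif ch == ")":
                    out.append((stack.pop(), i))
            return out

        def score(seq, targets):
            """Stand-in multi-target objective: worst-case fraction of compatible pairs."""
            fractions = []
            for t in targets:
                ps = pairs(t)
                ok = sum((seq[i], seq[j]) in COMPATIBLE for i, j in ps)
                fractions.append(ok / len(ps) if ps else 1.0)
            return min(fractions)

        def local_opt(targets, steps=2000):
            n = len(targets[0])
            seq = [random.choice("ACGU") for _ in range(n)]
            best = score(seq, targets)
            for _ in range(steps):
                i = random.randrange(n)
                old = seq[i]
                seq[i] = random.choice("ACGU")   # point mutation
                s = score(seq, targets)
                if s >= best:
                    best = s                     # keep equal-or-better moves
                else:
                    seq[i] = old                 # revert worsening moves
            return "".join(seq), best

        print(local_opt(["((....))", "(......)"]))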

  14. Biomechanical Comparison of Three Perceived Effort Set Shots in Team Handball Players.

    Science.gov (United States)

    Plummer, Hillary A; Gascon, Sarah S; Oliver, Gretchen D

    2017-01-01

    Plummer, HA, Gascon, SS, and Oliver, GD. Biomechanical comparison of three perceived effort set shots in team handball players. J Strength Cond Res 31(1): 80-87, 2017-Shoulder injuries are prevalent in the sport of team handball; however, no guidelines currently exist for the implementation of an interval throwing protocol for players returning from an upper extremity injury. These guidelines exist for the sport of baseball, but team handball may present additional challenges due to the greater ball mass that must be accounted for. The purpose of this study was to examine kinematic differences in the team handball set shot at 50, 75, and 100% effort, which are common throwing intensities in throwing protocols. Eleven male team handball players (23.09 ± 3.05 years; 185.12 ± 8.33 cm; 89.65 ± 12.17 kg) volunteered. An electromagnetic tracking system was used to collect kinematic data at the pelvis, trunk, scapula, and shoulder. Kinematic differences at the shoulder, trunk, and pelvis were observed across effort levels throughout the set shot, with most occurring at ball release and maximum internal rotation. Significant differences in ball speed were observed between all 3 effort level shots. Team handball players are able to gauge the effort at which they shoot; however, it cannot be assumed that these speeds will be at a certain percentage of their maximum. The results of this study provide valuable evidence that can be used to prepare a team handball player to return to throwing activities.

  15. Hiding effort to gain a competitive advantage: Evidence from China.

    Science.gov (United States)

    Zhao, Li; Heyman, Gail D

    2018-06-01

    Previous studies with Western populations have shown that adolescents' tendency to downplay their academic effort is affected by two kinds of motives: ability-related motives (e.g., to appear competent) and social approval motives (e.g., to be popular). In this research, we test for the presence of additional competition-related motives in China, a culture placing strong emphasis on academic competition. Study 1 (N = 150) showed that, in response to a scenario in which a hard-working high-school junior hid effort from classmates, the most highly endorsed explanation was "to influence others to work less hard to maintain a competitive advantage." Study 2 (N = 174) revealed that competition-related explanations were endorsed relatively more often when the speaker and audience had similar academic rankings. This tendency was most evident when both speaker and audience were top performers, and when this was the case, participants' desire to demonstrate superiority over others was a positive predictor of endorsement of competition-related motives. Study 3 (N = 137) verified that competition-related motives were more strongly endorsed among Chinese participants than among U.S. participants. These results suggest that at least in cultures that emphasize academic competition and in contexts where competition is salient, hiding effort is often about attempting to gain strategic advantage. © 2016 International Union of Psychological Science.

  16. Listening Effort With Cochlear Implant Simulations

    NARCIS (Netherlands)

    Pals, Carina; Sarampalis, Anastasios; Başkent, Deniz

    2013-01-01

    Purpose: Fitting a cochlear implant (CI) for optimal speech perception does not necessarily optimize listening effort. This study aimed to show that listening effort may change between CI processing conditions for which speech intelligibility remains constant. Method: Nineteen normal-hearing

  17. Additive manufacturing: state-of-the-art and application framework

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; de Senzi Zancul, Eduardo; Gonçalves Mançanares, Cauê

    2017-01-01

    Additive manufacturing encompasses a class of production processes with increasing applications in different areas and supply chains. Due to its flexibility for production in small batches and the versatility of materials and geometries, this technology is recognized as being capable of revolutionizing the production processes as well as changing production strategies that are currently employed. However, there are different technologies under the generic label of additive manufacturing, materials and application areas with different requirements. Given the growing importance of additive manufacturing as a production process, and also considering the need to have a better insight into the potential applications for driving research and development efforts, this article presents a proposal of organization for additive manufacturing applications in seven areas. Additionally, the article provides...

  18. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Abstract—Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud

  19. Xenon plasma with caesium as additive

    International Nuclear Information System (INIS)

    Stojilkovic, S.M.; Novakovic, N.V.; Zivkovic, L.M.

    1986-01-01

    The concentration dependence of a xenon plasma with cesium as additive in the temperature range of 2000 K to 20,000 K is analyzed. The plasma is considered weakly nonideal, in complete local thermodynamic equilibrium, and the interaction between the plasma and the vessel walls is not taken into account. The values of some of the nonideality parameters of the plasma with 1% cesium (γ = 0.01010) and 10% cesium (γ = 0.11111) are computed, for an initial plasma pressure of p₀ = 13,000 Pa and initial temperature T₀ = 1000 K. The ratio of the electrical conductivity of the plasma computed by Lorentz's formula to that computed by Spitzer's formula over the same temperature interval is also analyzed. (author) 5 figs., 2 tabs., 16 refs

  20. The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)

    Science.gov (United States)

    2017-09-01

    Retrievals, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID, Applications, Fingerprint/Hand/Biometrics Recognitions and Technologies, Foundations of High-performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam mail, Anti-virus issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Services and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrievals, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors in many countries, allowing a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only become successful through a team effort, so herewith we want to thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary

  1. The CAIN computer code for the generation of MABEL input data sets: a user's manual

    International Nuclear Information System (INIS)

    Tilley, D.R.

    1983-03-01

    CAIN is an interactive FORTRAN computer code designed to overcome the substantial effort involved in manually creating the thermal-hydraulics input data required by MABEL-2. CAIN achieves this by processing output from either of the whole-core codes, RELAP or TRAC, interpolating where necessary, and by scanning RELAP/TRAC output in order to generate additional information. This user's manual describes the actions required in order to create RELAP/TRAC data sets from magnetic tape, to create the other input data sets required by CAIN, and to operate the interactive command procedure for the execution of CAIN. In addition, the CAIN code is described in detail. This programme of work is part of the Nuclear Installations Inspectorate (NII)'s contribution to the United Kingdom Atomic Energy Authority's independent safety assessment of pressurized water reactors. (author)
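
    The interpolation step mentioned above, resampling whole-core code output onto the time points a downstream input deck expects, is conceptually a one-liner. A minimal Python sketch under assumed data shapes (CAIN itself is interactive FORTRAN, and every name and value here is hypothetical):

        import numpy as np

        # Hypothetical RELAP/TRAC-style output: values at coarse, irregular times.
        relap_times = np.array([0.0, 0.5, 1.2, 3.0, 7.5])        # s
        relap_pressure = np.array([15.5, 15.1, 12.4, 8.0, 4.2])  # MPa

        # Resample onto the uniform grid a MABEL-like deck might require.
        mabel_times = np.linspace(0.0, 7.5, 16)
        mabel_pressure = np.interp(mabel_times, relap_times, relap_pressure)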

  2. Adults with autism spectrum disorders exhibit decreased sensitivity to reward parameters when making effort-based decisions

    Directory of Open Access Journals (Sweden)

    Damiano Cara R

    2012-05-01

    Full Text Available Abstract Background Efficient effort expenditure to obtain rewards is critical for optimal goal-directed behavior and learning. Clinical observation suggests that individuals with autism spectrum disorders (ASD) may show dysregulated reward-based effort expenditure, but no behavioral study to date has assessed effort-based decision-making in ASD. Methods The current study compared a group of adults with ASD to a group of typically developing adults on the Effort Expenditure for Rewards Task (EEfRT), a behavioral measure of effort-based decision-making. In this task, participants were provided with the probability of receiving a monetary reward on a particular trial and asked to choose between either an “easy task” (less motoric effort) for a small, stable reward or a “hard task” (greater motoric effort) for a variable but consistently larger reward. Results Participants with ASD chose the hard task more frequently than did the control group, yet were less influenced by differences in reward value and probability than the control group. Additionally, effort-based decision-making was related to repetitive behavior symptoms across both groups. Conclusions These results suggest that individuals with ASD may be more willing to expend effort to obtain a monetary reward regardless of the reward contingencies. More broadly, the results suggest that behavioral choices may be less influenced by information about reward contingencies in individuals with ASD. This atypical pattern of effort-based decision-making may be relevant for understanding the heightened reward motivation for circumscribed interests in ASD.

  3. Computational nuclear quantum many-body problem: The UNEDF project

    Science.gov (United States)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  4. Text genres and registers the computation of linguistic features

    CERN Document Server

    Fang, Chengyu Alex

    2015-01-01

    This book is a description of some of the most recent advances in text classification as part of a concerted effort to achieve computer understanding of human language. In particular, it addresses state-of-the-art developments in the computation of higher-level linguistic features, ranging from etymology to grammar and syntax for the practical task of text classification according to genres, registers and subject domains. Serving as a bridge between computational methods and sophisticated linguistic analysis, this book will be of particular interest to academics and students of computational linguistics as well as professionals in natural language engineering.

  5. Cloud computing in medical imaging.

    Science.gov (United States)

    Kagadis, George C; Kloukinas, Christos; Moore, Kevin; Philbin, Jim; Papadimitroulas, Panagiotis; Alexakos, Christos; Nagy, Paul G; Visvikis, Dimitris; Hendee, William R

    2013-07-01

    Over the past century technology has played a decisive role in defining, driving, and reinventing procedures, devices, and pharmaceuticals in healthcare. Cloud computing has been introduced only recently but is already one of the major topics of discussion in research and clinical settings. The provision of extensive, easily accessible, and reconfigurable resources such as virtual systems, platforms, and applications with low service cost has caught the attention of many researchers and clinicians. Healthcare researchers are moving their efforts to the cloud, because they need adequate resources to process, store, exchange, and use large quantities of medical data. This Vision 20/20 paper addresses major questions related to the applicability of advanced cloud computing in medical imaging. The paper also considers security and ethical issues that accompany cloud computing.

  6. Computational Lipidomics and Lipid Bioinformatics: Filling In the Blanks.

    Science.gov (United States)

    Pauling, Josch; Klipp, Edda

    2016-12-22

    Lipids are highly diverse metabolites of pronounced importance in health and disease. While metabolomics is a broad field under the omics umbrella that may also relate to lipids, lipidomics is an emerging field which specializes in the identification, quantification and functional interpretation of complex lipidomes. Today, it is possible to identify and distinguish lipids in a high-resolution, high-throughput manner and simultaneously with a lot of structural detail. However, doing so may produce thousands of mass spectra in a single experiment, which has created a high demand for specialized computational support to analyze these spectral libraries. The computational biology and bioinformatics community has so far established methodology in genomics, transcriptomics and proteomics, but there are many (combinatorial) challenges when it comes to the structural diversity of lipids and their identification, quantification and interpretation. This review gives an overview and outlook on lipidomics research and illustrates ongoing computational and bioinformatics efforts. These efforts are important and necessary steps to advance the lipidomics field alongside the analytical, biochemical, biomedical and biological communities and to close the gap in available computational methodology between lipidomics and other omics sub-branches.

  7. Computing possibilities in the mid 1990s

    International Nuclear Information System (INIS)

    Nash, T.

    1988-09-01

    This paper describes the kind of computing resources it may be possible to make available for experiments in high energy physics in the mid and late 1990s. We outline some of the work going on today, particularly at Fermilab's Advanced Computer Program, that projects to the future. We attempt to define areas in which coordinated R and D efforts should prove fruitful to provide for on and off-line computing in the SSC era. Because of extraordinary components anticipated from industry, we can be optimistic even to the level of predicting million VAX equivalent on-line multiprocessor/data acquisition systems for SSC detectors. Managing this scale of computing will require a new approach to large hardware and software systems. 15 refs., 6 figs

  8. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Full Text Available Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems that contribute to student errors while taking class notes, such as numbers transcribed incorrectly on the board or hard-to-read instructor handwriting, can be resolved through careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  9. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  10. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely: (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  11. Effort-reward imbalance in the school setting: associations with somatic pain and self-rated health.

    Science.gov (United States)

    Låftman, Sara Brolin; Modin, Bitte; Östberg, Viveca; Hoven, Hanno; Plenty, Stephanie

    2015-03-01

    According to the workplace theory of effort-reward imbalance (ERI), individuals who perceive a lack of reciprocity between their effort spent at work and the rewards received in turn are at an increased risk of stress-related ill-health. It is also assumed that being overcommitted to work is linked to an increased risk of stress-related ill-health. This study applies the effort-reward imbalance model to the school setting. It aims to analyse the associations that effort-reward imbalance and overcommitment share with somatic pain and self-rated health among adolescents. Data are from the School Stress and Support Study (TriSSS), involving students in grades 8 and 9 (ages 14-16 years) in two schools in Stockholm, Sweden, during 2010 (n=403). Information on effort-reward imbalance and health outcomes was gathered from self-report questionnaires. An adjusted short version of ERI was used. Factor analysis showed that extrinsic effort, reward and overcommitment constitute three distinct dimensions. The designed measures demonstrated sound psychometric properties both for the full sample and for subgroups. Ordered logistic regressions were conducted. The analyses showed that low reward and higher overcommitment were associated with greater somatic pain and poorer self-rated health. Furthermore, effort-reward imbalance was linked with an elevated risk of somatic pain and poorer self-rated health. Students are more likely to experience stress-related ill-health when they perceive an imbalance between their effort and rewards. In addition, high overcommitment is associated with an increased risk of ill-health among students. © 2014 the Nordic Societies of Public Health.

  12. [Analysis and evaluation of the visual effort in remote-control public traffic operators working with computer-based equipments].

    Science.gov (United States)

    Gullà, F; Zambelli, P; Bergamaschi, A; Piccoli, B

    2007-01-01

    The aim of this study is the objective evaluation, by means of electronic equipment, of the visual effort of 6 public traffic controllers (4 male, 2 female, mean age 29.6). The equipment quantifies the observation distance and the observation time within each controller's occupational visual field. These parameters are obtained by emitting ultrasound at 40 kHz from an emission sensor (placed by the VDT screen) and receiving it with a sensor placed on the operator's head. Since the speed of sound in air is known and constant (about 340 m/s), the travel time of the ultrasound (US) is used to calculate the distance between the emitting and receiving sensors. The results show that the required visual acuity is of average level, while the accommodation and convergence effort varies from average to intense (depending on the visual characteristics of the operator considered), ranging between 26.41% and 43.92% of the accommodation and convergence available to each operator. The time actually spent in "near observation within the occupational visual field" (Tscr) ranged from 2 h 54 min to 4 h 05 min.
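
    The distance computation the equipment performs is a one-line application of the stated sound speed: with the emitter at the screen and the receiver on the operator's head, the one-way travel time multiplied by roughly 340 m/s gives the observation distance. A quick illustrative check in Python (the travel time is made up):

        SPEED_OF_SOUND = 340.0   # m/s, as stated in the abstract

        def observation_distance(travel_time_s):
            """One-way emitter-to-receiver distance from ultrasound travel time."""
            return SPEED_OF_SOUND * travel_time_s

        # A hypothetical 1.8 ms travel time corresponds to about 0.61 m of screen distance.
        print(observation_distance(1.8e-3))   # 0.612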

  13. Validation of a Computational Fluid Dynamics (CFD) Code for Supersonic Axisymmetric Base Flow

    Science.gov (United States)

    Tucker, P. Kevin

    1993-01-01

    The ability to accurately and efficiently calculate the flow structure in the base region of bodies of revolution in supersonic flight is a significant step in CFD code validation for applications ranging from base heating of rockets to drag of projectiles. The FDNS code is used to compute such a flow, and the results are compared to benchmark-quality experimental data. Flowfield calculations are presented for a cylindrical afterbody at M = 2.46 and angle of attack α = 0. Grid-independent solutions are compared to mean velocity profiles in the separated wake area and downstream of the reattachment point. Additionally, quantities such as turbulent kinetic energy and shear-layer growth rates are compared to the data. Finally, the computed base pressures are compared to the measured values. An effort is made to elucidate the role of turbulence models in the flowfield predictions. The level of turbulent eddy viscosity, and its origin, are used to contrast the various turbulence models and compare the results to the experimental data.

  14. High-performance computing in seismology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

  15. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  16. The Computational Infrastructure for Geodynamics as a Community of Practice

    Science.gov (United States)

    Hwang, L.; Kellogg, L. H.

    2016-12-01

    Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.

  17. Myocardial perfusion with multi-detector computed tomography: quantitative evaluation

    International Nuclear Information System (INIS)

    Carrascosa, Patricia M.; Vallejos, J.; Capunay, Carlos M.; Deviggiano, A.; Carrascosa, Jorge M.

    2007-01-01

    The objective of this work is to evaluate the ability of multi-detector computed tomography to quantify the different patterns of enhancement during the evaluation of myocardial perfusion. 45 patients with suspected cardiovascular disease were studied. Multi-detector computed tomography was performed at rest and under pharmacological stress, after the administration of dipyridamole. The patients were also evaluated using nuclear medicine.

  18. Cerebral blood flow, fatigue, mental effort, and task performance in offices with two different pollution loads

    DEFF Research Database (Denmark)

    Nishihara, Naoe; Wargocki, Pawel; Tanabe, Shin-ichi

    2014-01-01

    The effects of indoor air quality on symptoms, perceptions, task performance, cerebral blood flow, fatigue, and mental effort of individuals working in an office were investigated. Twenty-four right-handed Danish female subjects in an office were exposed, in groups of two at a time, to two air pollution levels created by placing or removing a pollution source (i.e. a used carpet) behind a screen. During the exposure, the subjects performed four different office tasks presented on a computer monitor. The tasks were performed at two paces: normal and maximum. When the pollution source was present ... any effects caused by modifying pollution exposure, they were well correlated with increased mental effort when the tasks were performed at maximum pace and subjectively reported fatigue, which increased during the course of exposure, respectively.

  19. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high-capacity compute nodes using the Amazon Web Services Elastic MapReduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
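
    The figures quoted allow a quick back-of-the-envelope check on the economics, using only the numbers in the abstract (300,000 processes, 100 nodes, just under 70 hours, $6,302):

        jobs = 300_000
        nodes = 100
        hours = 70.0          # "just under 70 hours"
        total_cost = 6302.0   # USD

        print(total_cost / (nodes * hours))   # ~0.90 USD per node-hour
        print(total_cost / jobs)              # ~0.021 USD per RSD process
        print(jobs / (nodes * hours))         # ~43 processes per node-hour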

  20. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered to be limited computing devices. Today, users become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs), such as limited battery lifetime, limited processing capabilities, and limited storage capacity, by offloading the execution and workload to other rich systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as offloading method and level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
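
    A common way to make the offloading trade-off concrete, standard in the mobile cloud computing literature rather than drawn from this particular paper, is to compare local execution cost against transmission-plus-remote-execution cost. A hedged Python sketch with hypothetical parameters:

        def should_offload(cycles, data_bits,
                           local_speed=1e9,      # device CPU cycles/s (assumed)
                           remote_speed=16e9,    # cloud server cycles/s (assumed)
                           bandwidth=5e6):       # uplink bits/s (assumed)
            """Offload when shipping the input and computing remotely beats
            computing locally, judged on completion time alone."""
            local_time = cycles / local_speed
            offload_time = data_bits / bandwidth + cycles / remote_speed
            return offload_time < local_time

        # A compute-heavy task with a small input favors offloading:
        print(should_offload(cycles=5e9, data_bits=1e6))   # True (0.51 s vs 5.0 s)

    Energy can be compared the same way, with transmit and idle power replacing time; the level of partitioning the paper discusses determines at what granularity (method, task, or whole application) such a decision is applied.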

  1. Structure, dynamics, and function of the monooxygenase P450 BM-3: insights from computer simulations studies

    International Nuclear Information System (INIS)

    Roccatano, Danilo

    2015-01-01

    The monooxygenase P450 BM-3 is a NADPH-dependent fatty acid hydroxylase enzyme isolated from the soil bacterium Bacillus megaterium. As a pivotal member of the cytochrome P450 superfamily, it has been intensely studied for the comprehension of structure–dynamics–function relationships in this class of enzymes. In addition, due to its peculiar properties, it is also a promising enzyme for biochemical and biomedical applications. However, despite these efforts, a full understanding of the enzyme's structure and dynamics has not yet been achieved. Computational studies, particularly molecular dynamics (MD) simulations, have contributed importantly to this endeavor by providing new insights at an atomic level regarding the correlations between structure, dynamics, and function of the protein. This topical review summarizes computational studies based on MD simulations of the cytochrome P450 BM-3 and gives an outlook on future directions. (topical review)

  2. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in recent decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact of uninterrupted work with computer screens or displays on visual health have grabbed researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great effort, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of these are blurred vision, visual fatigue and Dry Eye Syndrome (DES), due to inadequate lubrication of the ocular surface as blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to computer displays, with the main purpose of comparing the differences in temperature variation of healthy ocular surfaces.

  3. Interests, Effort, Achievement and Vocational Preference.

    Science.gov (United States)

    Sjoberg, L.

    1984-01-01

    Relationships between interest in natural sciences and technology and perceived ability, success, and invested effort were studied in Swedish secondary school students. Interests were accounted for by logical orientation and practical value. Interests and grades were strongly correlated, but correlations between interests and effort and vocational…

  4. Has Malaysia's antidrug effort been effective?

    Science.gov (United States)

    Scorzelli, J F

    1992-01-01

    It is a common belief that a massive effort in law enforcement, preventive education and rehabilitation will result in the elimination of a country's drug problem. Based on this premise, Malaysia in 1983 implemented such a multifaceted anti-drug strategy, and the results of a 1987 study by the author suggested that Malaysia's effort had begun to contribute to a steady decrease in the number of identified drug abusers. Although the number of drug-addicted individuals declined, the country's recidivism rates were still high. Because of this high relapse rate, Malaysia expanded its rehabilitation effort and developed a community transition program. In order to determine the impact of these changes on the country's battle against drug abuse, a follow-up study was conducted in 1990. The results of this study did not clearly demonstrate that the Malaysian effort had been successful in eliminating the problem of drug abuse, and raised some questions concerning the effectiveness of the country's drug treatment programs.

  5. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Science.gov (United States)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  6. Solar Flare Prediction Science-to-Operations: the ESA/SSA SWE A-EFFort Service

    Science.gov (United States)

    Georgoulis, Manolis K.; Tziotziou, Konstantinos; Themelis, Konstantinos; Magiati, Margarita; Angelopoulou, Georgia

    2016-07-01

    We attempt a synoptic overview of the scientific origins of the Athens Effective Solar Flare Forecasting (A-EFFort) utility and the actions taken toward transitioning it into a pre-operational service of ESA's Space Situational Awareness (SSA) Programme. The preferred method for solar flare prediction, as well as key efforts to make it function in a fully automated environment by coupling calculations with near-real-time data-downloading protocols (from the Solar Dynamics Observatory [SDO] mission), pattern recognition (solar active-region identification) and optimization (magnetic connectivity by simulated annealing), will be highlighted. In addition, the entire validation process of the service will be described, and its results presented. We will conclude by stressing the need for across-the-board efforts and synergistic work in order to bring science of potentially limited/restricted interest into realizing a much broader impact and serving the best public interests. The above presentation was partially supported by the ESA/SSA SWE A-EFFort project, ESA Contract No. 4000111994/14/D/MRP. Special thanks go to the ESA Project Officers R. Keil, A. Glover, and J.-P. Luntama (ESOC), M. Bobra and C. Balmer of the SDO/HMI team at Stanford University, and M. Zoulias at the RCAAM of the Academy of Athens for valuable technical help.

  7. Advanced Material Strategies for Next-Generation Additive Manufacturing.

    Science.gov (United States)

    Chang, Jinke; He, Jiankang; Mao, Mao; Zhou, Wenxing; Lei, Qi; Li, Xiao; Li, Dichen; Chua, Chee-Kai; Zhao, Xin

    2018-01-22

    Additive manufacturing (AM) has drawn tremendous attention in various fields. In recent years, great efforts have been made to develop novel additive manufacturing processes such as micro-/nano-scale 3D printing, bioprinting, and 4D printing for the fabrication of complex 3D structures with high resolution, living components, and multimaterials. The development of advanced functional materials is important for the implementation of these novel additive manufacturing processes. Here, a state-of-the-art review on advanced material strategies for novel additive manufacturing processes is provided, mainly including conductive materials, biomaterials, and smart materials. The advantages, limitations, and future perspectives of these materials for additive manufacturing are discussed. It is believed that the innovations of material strategies in parallel with the evolution of additive manufacturing processes will provide numerous possibilities for the fabrication of complex smart constructs with multiple functions, which will significantly widen the application fields of next-generation additive manufacturing.

  8. Advanced Material Strategies for Next-Generation Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Jinke Chang

    2018-01-01

    Full Text Available Additive manufacturing (AM) has drawn tremendous attention in various fields. In recent years, great efforts have been made to develop novel additive manufacturing processes such as micro-/nano-scale 3D printing, bioprinting, and 4D printing for the fabrication of complex 3D structures with high resolution, living components, and multimaterials. The development of advanced functional materials is important for the implementation of these novel additive manufacturing processes. Here, a state-of-the-art review on advanced material strategies for novel additive manufacturing processes is provided, mainly including conductive materials, biomaterials, and smart materials. The advantages, limitations, and future perspectives of these materials for additive manufacturing are discussed. It is believed that the innovations of material strategies in parallel with the evolution of additive manufacturing processes will provide numerous possibilities for the fabrication of complex smart constructs with multiple functions, which will significantly widen the application fields of next-generation additive manufacturing.

  9. Advanced Material Strategies for Next-Generation Additive Manufacturing

    Science.gov (United States)

    Chang, Jinke; He, Jiankang; Zhou, Wenxing; Lei, Qi; Li, Xiao; Li, Dichen

    2018-01-01

    Additive manufacturing (AM) has drawn tremendous attention in various fields. In recent years, great efforts have been made to develop novel additive manufacturing processes such as micro-/nano-scale 3D printing, bioprinting, and 4D printing for the fabrication of complex 3D structures with high resolution, living components, and multimaterials. The development of advanced functional materials is important for the implementation of these novel additive manufacturing processes. Here, a state-of-the-art review on advanced material strategies for novel additive manufacturing processes is provided, mainly including conductive materials, biomaterials, and smart materials. The advantages, limitations, and future perspectives of these materials for additive manufacturing are discussed. It is believed that the innovations of material strategies in parallel with the evolution of additive manufacturing processes will provide numerous possibilities for the fabrication of complex smart constructs with multiple functions, which will significantly widen the application fields of next-generation additive manufacturing. PMID:29361754

  10. Modeling to Mars: a NASA Model Based Systems Engineering Pathfinder Effort

    Science.gov (United States)

    Phojanamongkolkij, Nipa; Lee, Kristopher A.; Miller, Scott T.; Vorndran, Kenneth A.; Vaden, Karl R.; Ross, Eric P.; Powell, Bobby C.; Moses, Robert W.

    2017-01-01

    The NASA Engineering Safety Center (NESC) Systems Engineering (SE) Technical Discipline Team (TDT) initiated the Model Based Systems Engineering (MBSE) Pathfinder effort in FY16. The goals and objectives of the MBSE Pathfinder include developing and advancing MBSE capability across NASA, applying MBSE to real NASA issues, and capturing issues and opportunities surrounding MBSE. The Pathfinder effort consisted of four teams, with each team addressing a particular focus area. This paper focuses on Pathfinder team 1 with the focus area of architectures and mission campaigns. These efforts covered the timeframe of February 2016 through September 2016. The team comprised eight members from seven NASA Centers (Glenn Research Center, Langley Research Center, Ames Research Center, Goddard Space Flight Center IV&V Facility, Johnson Space Center, Marshall Space Flight Center, and Stennis Space Center). Collectively, the team had varying levels of knowledge, skills and expertise in systems engineering and MBSE. The team applied their existing and newly acquired system modeling knowledge and expertise to develop modeling products for a campaign (Program) of crew and cargo missions (Projects) to establish a human presence on Mars utilizing In-Situ Resource Utilization (ISRU). Pathfinder team 1 developed a subset of modeling products that are required for a Program System Requirement Review (SRR)/System Design Review (SDR) and Project Mission Concept Review (MCR)/SRR as defined in NASA Procedural Requirements. Additionally, Team 1 was able to perform and demonstrate some trade studies and constraint analyses. At the end of these efforts, over twenty lessons learned and recommended next steps had been identified.

  11. Citizens unite for computational immunology!

    Science.gov (United States)

    Belden, Orrin S; Baker, Sarah Catherine; Baker, Brian M

    2015-07-01

    Recruiting volunteers who can provide computational time, programming expertise, or puzzle-solving talent has emerged as a powerful tool for biomedical research. Recent projects demonstrate the potential for such 'crowdsourcing' efforts in immunology. Tools for developing applications, new funding opportunities, and an eager public make crowdsourcing a serious option for creative solutions for computationally-challenging problems. Expanded uses of crowdsourcing in immunology will allow for more efficient large-scale data collection and analysis. It will also involve, inspire, educate, and engage the public in a variety of meaningful ways. The benefits are real - it is time to jump in!

  12. Nonparametric additive regression for repeatedly measured data

    KAUST Repository

    Carroll, R. J.; Maity, A.; Mammen, E.; Yu, K.

    2009-01-01

    We develop an easily computed smooth backfitting algorithm for additive model fitting in repeated measures problems. Our methodology easily copes with various settings, such as when some covariates are the same over repeated response measurements
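
    For orientation, a minimal sketch of an ordinary backfitting loop for an additive model y = f_1(x_1) + ... + f_p(x_p) + noise is given below, using a simple kernel smoother; the authors' smooth backfitting algorithm for repeated measures differs in detail.

        # Generic backfitting loop for an additive model. This illustrates the
        # idea behind smooth backfitting; the paper's actual algorithm handles
        # repeated measures and differs in detail.
        import numpy as np

        def kernel_smooth(x, y, bandwidth=0.3):
            """Nadaraya-Watson smoother evaluated at the sample points."""
            d = (x[:, None] - x[None, :]) / bandwidth
            w = np.exp(-0.5 * d**2)
            return w @ y / w.sum(axis=1)

        def backfit(X, y, n_iter=20, bandwidth=0.3):
            n, p = X.shape
            f = np.zeros((n, p))
            alpha = y.mean()
            for _ in range(n_iter):
                for j in range(p):
                    # Partial residual: remove all components except f_j.
                    partial = y - alpha - f.sum(axis=1) + f[:, j]
                    f[:, j] = kernel_smooth(X[:, j], partial, bandwidth)
                    f[:, j] -= f[:, j].mean()  # center for identifiability
            return alpha, f

        # Example: recover two additive components from simulated data.
        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(200, 2))
        y = np.sin(3 * X[:, 0]) + X[:, 1]**2 + 0.1 * rng.standard_normal(200)
        alpha, f = backfit(X, y)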

  13. Xenon plasma with caesium as additive

    Energy Technology Data Exchange (ETDEWEB)

    Stojilkovic, S M; Novakovic, N V; Zivkovic, L M

    1986-01-01

    The concentration dependence of xenon plasma with caesium as additive in the temperature range of 2000 K to 20,000 K is analyzed. Plasma is considered as weakly nonideal in complete local thermodynamic equilibrium, and the interaction between plasma and vessel walls is not taken into account. The values of some of the nonideality parameters of plasma with 1% of caesium (γ = 0.01010) and 10% of caesium (γ = 0.11111) are computed, for an initial plasma pressure of p₀ = 13,000 Pa and initial temperature T₀ = 1000 K. The ratio of the electric conductivity of plasma computed by Lorentz's formula to that computed by Spitzer's formula in the same temperature interval is also analyzed. (author) 5 figs., 2 tabs., 16 refs.

  14. Additive manufacturing: From implants to organs

    African Journals Online (AJOL)

    Additive manufacturing (AM) constructs 3D objects layer by layer under computer control from 3D models. 3D printing is one ... anatomical models for surgery planning, and design and construction ... production of implants, particularly to replace bony structures, is ... Manufactured organs are, however, an elusive goal.

  15. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  16. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  17. Community effort endorsing multiscale modelling, multiscale data science and multiscale computing for systems medicine.

    Science.gov (United States)

    Zanin, Massimiliano; Chorbev, Ivan; Stres, Blaz; Stalidzans, Egils; Vera, Julio; Tieri, Paolo; Castiglione, Filippo; Groen, Derek; Zheng, Huiru; Baumbach, Jan; Schmid, Johannes A; Basilio, José; Klimek, Peter; Debeljak, Nataša; Rozman, Damjana; Schmidt, Harald H H W

    2017-12-05

    Systems medicine holds many promises, but has so far provided only a limited number of proofs of principle. To address this roadblock, possible barriers and challenges of translating systems medicine into clinical practice need to be identified and addressed. The members of the European Cooperation in Science and Technology (COST) Action CA15120 Open Multiscale Systems Medicine (OpenMultiMed) wish to engage the scientific community of systems medicine and multiscale modelling, data science and computing, to provide their feedback in a structured manner. This will result in follow-up white papers and open access resources to accelerate the clinical translation of systems medicine.

  18. Web-based child pornography: The global impact of deterrence efforts and its consumption on mobile platforms.

    Science.gov (United States)

    Steel, Chad M S

    2015-06-01

    Our study is the first to look at mobile device use for child sexual exploitation material (CSEM) consumption, and at the global impact of deterrence efforts by search providers. We used data from Google, Bing, and Yandex to assess how web searches for CSEM are being conducted, both at present and historically. Our findings show that the blocking efforts by Google and Microsoft have resulted in a 67% drop in the past year in web-based searches for CSEM. Additionally, our findings show that mobile devices are a substantial platform for web-based consumption of CSEM, with tablets and smartphones representing 32% of all queries associated with CSEM conducted on Bing. Further, our findings show that a major search engine not located in the United States, Yandex, did not undertake blocking efforts similar to those implemented by Google and Microsoft and has seen no commensurate drop in CSEM searches and continues to profit from ad revenue on these queries. While the efforts by Google and Microsoft have had a deterrence effect in the United States, searchers from Russia and other locations where child pornography possession is not criminalized have continued to use these services. Additionally, the same lax enforcement environment has allowed searchers from the United States to utilize Yandex with little fear of detection or referral to United States law enforcement from the Russian authorities.

  19. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  20. Pocket money and child effort at school

    OpenAIRE

    François-Charles Wolff; Christine Barnet-Verzat

    2008-01-01

    In this paper, we study the relationship between the provision of parental pocket money and the level of effort undertaken by the child at school. Under altruism, an increased amount of parental transfer should reduce the child's effort. Our empirical analysis is based on a French data set including about 1,400 parent-child pairs. We find that children do not undertake less effort when their parents are more generous.

  1. Security in Computer Applications

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development. The last part of the lecture covers some miscellaneous issues like the use of cryptography, rules for networking applications, and social engineering threats. This lecture was first given on Thursd...

  2. Terry Turbopump Analytical Modeling Efforts in Fiscal Year 2016 - Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas; Ross, Kyle; Cardoni, Jeffrey N

    2018-04-01

    This document details the Fiscal Year 2016 modeling efforts to define the true operating limitations (margins) of the Terry turbopump systems used in the nuclear industry for Milestone 3 (full-scale component experiments) and Milestone 4 (Terry turbopump basic science experiments) experiments. The overall multinational-sponsored program creates the technical basis to: (1) reduce and defer additional utility costs, (2) simplify plant operations, and (3) provide a better understanding of the true margin which could reduce overall risk of operations.

  3. Computational modelling of oxygenation processes in enzymes and biomimetic model complexes

    OpenAIRE

    de Visser, Sam P.; Quesne, Matthew G.; Martin, Bodo; Comba, Peter; Ryde, Ulf

    2014-01-01

    With computational resources becoming more efficient and more powerful and at the same time cheaper, computational methods have become more and more popular for studies on biochemical and biomimetic systems. Although large efforts from the scientific community have gone into exploring the possibilities of computational methods on large biochemical systems, such studies are not without pitfalls and often cannot be routinely done but require expert execution. In this review we summarize and hig...

  4. Website Design Computer Stores Anugerah Jaya Using PHP

    OpenAIRE

    Budi Sulistiyo; Cut Asiana Gemawaty

    2006-01-01

    One use of the web in today's business world is as a means of introducing a company and the products it makes; publication and promotional activities are needed for a business to develop and become better known to the public. The sale and purchase of computer equipment and accessories is a growing business, and community demand continues to drive the development of computing devices. One of the facilities used as an alternative for promoting and marketing the products sold is to ...

  5. Computers. A perspective on their usefulness in nuclear medicine

    International Nuclear Information System (INIS)

    Loken, M.K.; Williams, L.E.; Ponto, R.A.; Ganatra, R.D.; Raikar, U.; Samuel, A.M.

    1977-01-01

    To date, many symposia have been held on computer applications in nuclear medicine. Despite all of these efforts, an appraisal of the true utility of computers in the day-to-day practice of nuclear medicine is yet to be achieved. Now that the technology of data storage and processing in nuclear medicine has reached a high degree of sophistication, as evidenced by many reports in the literature, the time has come to develop a perspective on the proper place of computers in nuclear medicine practice. The paper summarizes various uses of a dedicated computer (Nuclear Data Med II) at our two institutions and comments on its clinical utility. (author)

  6. Biomechanical energy harvesting: generating electricity during walking with minimal user effort.

    Science.gov (United States)

    Donelan, J M; Li, Q; Naing, V; Hoffer, J A; Weber, D J; Kuo, A D

    2008-02-08

    We have developed a biomechanical energy harvester that generates electricity during human walking with little extra effort. Unlike conventional human-powered generators that use positive muscle work, our technology assists muscles in performing negative work, analogous to regenerative braking in hybrid cars, where energy normally dissipated during braking drives a generator instead. The energy harvester mounts at the knee and selectively engages power generation at the end of the swing phase, thus assisting deceleration of the joint. Test subjects walking with one device on each leg produced an average of 5 watts of electricity, which is about 10 times that of shoe-mounted devices. The cost of harvesting (the additional metabolic power required to produce 1 watt of electricity) is less than one-eighth of that for conventional human power generation. Producing substantial electricity with little extra effort makes this method well-suited for charging powered prosthetic limbs and other portable medical devices.

  7. Harnessing Vehicle Automation for Public Mobility -- An Overview of Ongoing Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Young, Stanley E.

    2015-11-05

    This presentation takes a look at the efforts to harness automated vehicle technology for public transport. The European CityMobil2 is the leading demonstration project in which automated shuttles were, or are planned to be, demonstrated in several cities and regions. The presentation provides a brief overview of the demonstrations at Oristano, Italy (July 2014), LaRochelle, France (Dec 2014), Lausanne, Switzerland (Apr 2015), Vantaa, Finland (July 2015), and Trikala, Greece (Sept 2015). In addition to technology exposition, the objectives included generating a legal framework for operation in each location and gauging the reaction of the public to unmanned shuttles, both of which were successfully achieved. Several such demonstrations are planned throughout the world, including efforts in North America in conjunction with the GoMentum Station in California. These early demonstrations with low-speed automated shuttles provide a glimpse of the possible with a fully automated fleet of driverless vehicles providing a public transit service.

  8. Review of CERN Computer Centre Infrastructure

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The CERN Computer Centre is reviewing strategies for optimizing the use of the existing infrastructure in the future, in the likely scenario that any extension will be remote from CERN, and in light of the way other large facilities are operated today. Over the past six months, CERN has been investigating modern and widely-used tools and procedures for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote computer centres. This presentation will give the details of the project's motivations, current status and areas for future investigation.

  9. The Need for Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few, and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  10. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    Science.gov (United States)

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. Our findings from the Department of Health and Human Services

  11. Effort-Based Career Opportunities and Working Time

    OpenAIRE

    Bratti, M.; Staffolani, S.

    2005-01-01

    The authors evaluate the economic effects of the hypothesis of effort-based career opportunities, described as a situation in which a firm creates incentives for employees to work longer hours than bargained (or desired), by making career prospects depend on relative working hours. Firms' personnel management policies may tend to increase working time (or workers' effort) in order to maximize profits. Effort-based career opportunities raise working time, production and output per worker, and ...

  12. Web Solutions Inspire Cloud Computing Software

    Science.gov (United States)

    2013-01-01

    An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.

  13. Advances in Multimedia, Software Engineering and Computing Vol.1 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference concentrating its focus upon Multimedia, Software Engineering, Computing and Education. In these proceedings, you can learn much more about the Multimedia, Software Engineering, Computing and Education research of researchers from all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in the fields mentioned. In order to meet the high standards of the Springer AISC series, the organizing committee made efforts to do the following things. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, periodic review meetings were held among the reviewers about five times to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  14. Advances in Multimedia, Software Engineering and Computing Vol.2 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference concentrating its focus upon Multimedia, Software Engineering, Computing and Education. In these proceedings, you can learn much more about the Multimedia, Software Engineering, Computing and Education research of researchers from all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in the fields mentioned. In order to meet the high standards of the Springer AISC series, the organizing committee made efforts to do the following things. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, periodic review meetings were held among the reviewers about five times to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  15. Hybrid Computational Model for High-Altitude Aeroassist Vehicles, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort addresses a need for accurate computational models to support aeroassist and entry vehicle system design over a broad range of flight conditions...

  16. Food additives and preschool children.

    Science.gov (United States)

    Martyn, Danika M; McNulty, Breige A; Nugent, Anne P; Gibney, Michael J

    2013-02-01

    Food additives have been used throughout history to perform specific functions in foods. A comprehensive framework of legislation is in place within Europe to control the use of additives in the food supply and ensure they pose no risk to human health. Further to this, exposure assessments are regularly carried out to monitor population intakes and verify that intakes are not above acceptable levels (acceptable daily intakes). Young children may have a higher dietary exposure to chemicals than adults due to a combination of rapid growth rates and distinct food intake patterns. For this reason, exposure assessments are particularly important in this age group. The paper will review the use of additives and exposure assessment methods and examine factors that affect dietary exposure by young children. Among the most widely investigated unfavourable health effects associated with food additive intake in preschool-aged children are suggested adverse behavioural effects. Research that has examined this relationship has reported a variety of responses, with many noting an increase in hyperactivity as reported by parents but not when assessed using objective examiners. This review has examined the experimental approaches used in such studies and suggests that efforts are needed to standardise objective methods of measuring behaviour in preschool children. Further to this, a more holistic approach to examining food additive intakes by preschool children is advisable, where overall exposure is considered rather than focusing solely on behavioural effects and possibly examining intakes of food additives other than food colours.
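
    As a hedged illustration of the arithmetic behind such an exposure assessment (a deliberately simplified sketch; real assessments use tiered methods built on food-consumption surveys, and every figure below is an invented placeholder):

        # Simplified additive-exposure check against an acceptable daily
        # intake (ADI). All figures are illustrative placeholders, not real
        # survey data or a real ADI.
        foods = [
            # (food, intake in g/day, additive concentration in mg/kg food)
            ("soft drink", 250, 50.0),
            ("confectionery", 30, 100.0),
            ("dessert", 100, 20.0),
        ]
        body_weight_kg = 20.0      # typical preschool child (assumed)
        adi_mg_per_kg_bw = 5.0     # hypothetical ADI for the additive

        exposure_mg = sum(intake_g / 1000.0 * conc for _, intake_g, conc in foods)
        exposure_per_kg_bw = exposure_mg / body_weight_kg
        print(f"Estimated exposure: {exposure_per_kg_bw:.2f} mg/kg bw/day "
              f"({exposure_per_kg_bw / adi_mg_per_kg_bw:.0%} of ADI)")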

  17. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

    The development of the MAPPS model has been completed and the model is currently undergoing evaluation. These efforts are addressing a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data that has been collected indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data makes it extremely flexible for application to a number of diverse applications. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that has previously not existed.

  18. Corrections and additions to CONTEMPT-LT computer codes for containment analysis

    International Nuclear Information System (INIS)

    Eerikaeinen, Lauri.

    1980-01-01

    The report presents a new version of the CONTEMPT-LT/26 containment code. The corrections and additions are applicable also to other CONTEMPT-LT versions. The thermodynamic background of the corrections is briefly described and, in addition, some essential points which should be taken into account in the analysis of a pressure suppression containment are pointed out. The results obtained by the corrected version have been compared to those calculated by the original program, and to the measured data in the Marviken containment experiment No 10. Finally, it has been indicated that for reliable pressure suppression analysis a wide-ranging condensation model for air-steam mixtures is necessary. (author)

  19. Computational and experimental studies of hydrodynamic instabilities and turbulent mixing (Review of NVIIEF efforts)

    International Nuclear Information System (INIS)

    Andronov, V.A.; Zhidov, I.G.; Meskov, E.E.; Nevmerzhitskii, N.V.; Nikiforov, V.V.; Razin, A.N.; Rogatchev, V.G.; Tolshmyakov, A.I.; Yanilkin, Yu.V.

    1995-02-01

    This report describes an extensive program of investigations conducted at Arzamas-16 in Russia over the past several decades. The focus of the work is on material interface instability and the mixing of two materials. Part 1 of the report discusses analytical and computational studies of hydrodynamic instabilities and turbulent mixing. The EGAK codes are described and results are illustrated for several types of unstable flow. Semiempirical turbulence transport equations are derived for the mixing of two materials, and their capabilities are illustrated for several examples. Part 2 discusses the experimental studies that have been performed to investigate instabilities and turbulent mixing. Shock-tube and jelly techniques are described in considerable detail. Results are presented for many circumstances and configurations.

  20. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    Science.gov (United States)

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  1. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  2. Towards Process Support for Migrating Applications to Cloud Computing

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2012-01-01

    Cloud computing is an active area of research for industry and academia. There are a large number of organizations providing cloud computing infrastructure and services. In order to utilize these infrastructure resources and services, existing applications need to be migrated to clouds. However...... for supporting migration to cloud computing based on our experiences from migrating an Open Source System (OSS), Hackystat, to two different cloud computing platforms. We explained the process by performing a comparative analysis of our efforts to migrate Hackystat to Amazon Web Services and Google App Engine.... We also report the potential challenges, suitable solutions, and lessons learned to support the presented process framework. We expect that the reported experiences can serve as guidelines for those who intend to migrate software applications to cloud computing....

  3. ESA NEOCC effort to eliminate high Palermo Scale virtual impactors

    Science.gov (United States)

    Micheli, M.; Koschny, D.; Hainaut, O.; Bernardi, F.

    2014-07-01

    At the moment of this writing about 4 % of the known near-Earth objects are known to have at least one future close approach scenario with a non-negligible collision probability within the next century, as routinely computed by the NEODyS and Sentry systems. The most straightforward way to improve the knowledge of the future dynamics of an NEO in order to exclude (or possibly confirm) some of these possible future impacts is to obtain additional astrometric observations of the object as soon as it becomes observable again. In particular, since a large fraction (>98 %) of the known objects currently recognized as possible future impactors have been observed during a single opposition, this usually corresponds to obtaining a new set of observations during a second opposition, a so-called 'recovery'. However, in some cases the future observability windows for the target after the discovery apparition may be very limited, either because the object is intrinsically small (and therefore requires a very close and consequently rare approach to become observable) or because its orbital dynamics prevent observability from the ground for a long timespan (as in the case of quasi-resonant objects with a long synodic period). When this happens, the only short-term way to clarify an impact scenario is to look toward the past, and investigate the possibility that unrecognized detections of the object are already present in the databases of old astronomical images, which are often archived by professional telescopes and made available to the community a few months to years after they are exposed. We will here present an effort led by the newly formed ESA NEO Coordination Centre (NEOCC) in Frascati to pursue both these avenues with the intent of improving the orbital knowledge of the highest-rated possible impactors, as defined by the Palermo Technical Impact Hazard Scale (PS in the following). As an example of our ongoing observational activities, we will first present our

  4. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    Science.gov (United States)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and that thus enables its application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
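
    For context, the GK formalism obtains κ from the equilibrium heat-flux autocorrelation function. A minimal sketch of that standard post-processing step follows; it illustrates the textbook relation only, not the authors' accelerated scheme.

        # Generic Green-Kubo post-processing: estimate kappa from the
        # heat-flux autocorrelation of an equilibrium MD run. This sketches
        # the textbook relation only, not the accelerated scheme above; in
        # practice the truncated integral must be checked for convergence.
        import numpy as np

        def green_kubo_kappa(J, dt, volume, temperature, k_B=1.380649e-23):
            """kappa = V / (k_B T^2) * integral <J(0) J(t)> dt, for one
            Cartesian component of the heat flux J (W/m^2), timestep dt (s),
            cell volume (m^3) and temperature (K)."""
            n = len(J)
            # Autocorrelation averaged over time origins, up to lag n // 2.
            acf = np.array([np.mean(J[:n - lag] * J[lag:])
                            for lag in range(n // 2)])
            return volume / (k_B * temperature**2) * np.trapz(acf, dx=dt)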

  5. Integrative computational approach for genome-based study of microbial lipid-degrading enzymes.

    Science.gov (United States)

    Vorapreeda, Tayvich; Thammarongtham, Chinae; Laoteng, Kobkul

    2016-07-01

    Lipid-degrading or lipolytic enzymes have gained enormous attention in academic and industrial sectors. Several efforts are underway to discover new lipase enzymes from a variety of microorganisms with particular catalytic properties to be used for extensive applications. In addition, various tools and strategies have been implemented to unravel the functional relevance of the versatile lipid-degrading enzymes for special purposes. This review highlights the study of microbial lipid-degrading enzymes through an integrative computational approach. The identification of putative lipase genes from microbial genomes and metagenomic libraries using homology-based mining is discussed, with an emphasis on sequence analysis of conserved motifs and enzyme topology. Molecular modelling of three-dimensional structure on the basis of sequence similarity is shown to be a potential approach for exploring the structural and functional relationships of candidate lipase enzymes. The perspectives on a discriminative framework of cutting-edge tools and technologies, including bioinformatics, computational biology, functional genomics and functional proteomics, intended to facilitate rapid progress in understanding lipolysis mechanism and to discover novel lipid-degrading enzymes of microorganisms are discussed.
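
    As an illustration of the conserved-motif step, the sketch below scans FASTA sequences for the canonical lipase/esterase G-X-S-X-G pentapeptide; real mining pipelines combine such scans with homology searches and structural checks, and the input file name is hypothetical.

        # Minimal conserved-motif scan over FASTA sequences: flags candidates
        # containing the canonical lipase/esterase G-X-S-X-G pentapeptide.
        import re

        GXSXG = re.compile(r"G.S.G")

        def read_fasta(path):
            """Yield (header, sequence) pairs from a FASTA file."""
            header, seq = None, []
            with open(path) as f:
                for line in f:
                    line = line.strip()
                    if line.startswith(">"):
                        if header is not None:
                            yield header, "".join(seq)
                        header, seq = line[1:], []
                    else:
                        seq.append(line)
                if header is not None:
                    yield header, "".join(seq)

        for header, seq in read_fasta("candidates.fasta"):  # hypothetical input
            m = GXSXG.search(seq)
            if m:
                print(f"{header}: motif {m.group()} at position {m.start() + 1}")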

  6. Perceived effort for motor control and decision-making.

    Directory of Open Access Journals (Sweden)

    Ignasi Cos

    2017-08-01

    Full Text Available How effort is internally quantified and how it influences both movement generation and decisions between potential movements are 2 difficult questions to answer. Physical costs are known to influence motor control and decision-making, yet we lack a general, principled characterization of how the perception of effort operates across tasks and conditions. Morel and colleagues introduce an insightful approach to that end, assessing effort indifference points and presenting a quadratic law between perceived effort and force production.

  7. Measuring Impact of EPAs Computational Toxicology Research (BOSC)

    Science.gov (United States)

    Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...

  8. Validation of transport models using additive flux minimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States); Groebner, R. J. [General Atomics, San Diego, California 92121 (United States); Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States); Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)

    2013-10-15

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
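
    Conceptually, the technique reduces to an optimization loop: vary the additional effective diffusivity until the predicted profile best matches the measured one. The toy sketch below illustrates that loop with a made-up one-parameter forward model; scipy stands in for the DAKOTA optimizer and a trivial analytic profile for FACETS::Core.

        # Toy version of the additive-flux-minimization loop: find the extra
        # effective diffusivity D_add that best matches a measured density
        # profile. The one-parameter analytic "forward model" is invented for
        # illustration only.
        import numpy as np
        from scipy.optimize import minimize

        r = np.linspace(0.0, 1.0, 50)          # normalized minor radius
        n_exp = 1.0 - 0.5 * r**2               # mock "experimental" profile

        def forward_model(D_add):
            """Crude stand-in: more added diffusivity flattens the profile."""
            return 1.0 - (0.8 - 0.5 * np.tanh(D_add)) * r**2

        def mismatch(params):
            return np.sum((forward_model(params[0]) - n_exp) ** 2)

        result = minimize(mismatch, x0=[0.1], method="Nelder-Mead")
        print("best-fit additional diffusivity:", result.x[0])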

  9. Validation of transport models using additive flux minimization technique

    International Nuclear Information System (INIS)

    Pankin, A. Y.; Kruger, S. E.; Groebner, R. J.; Hakim, A.; Kritz, A. H.; Rafiq, T.

    2013-01-01

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.

  10. Materials Frontiers to Empower Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Antoinette Jane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarrao, John Louis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Richardson, Christopher [Laboratory for Physical Sciences, College Park, MD (United States)

    2015-06-11

    This is an exciting time at the nexus of quantum computing and materials research. The materials frontiers described in this report represent a significant advance in electronic materials and our understanding of the interactions between the local material and a manufactured quantum state. Simultaneously, directed efforts to solve materials issues related to quantum computing provide an opportunity to control and probe the fundamental arrangement of matter that will impact all electronic materials. An opportunity exists to extend our understanding of materials functionality from electronic-grade to quantum-grade by achieving a predictive understanding of noise and decoherence in qubits and their origins in materials defects and environmental coupling. Realizing this vision systematically and predictively will be transformative for quantum computing and will represent a qualitative step forward in materials prediction and control.

  11. Advances in Computer Science and Information Engineering Volume 2

    CERN Document Server

    Lin, Sally

    2012-01-01

    CSIE2012 is an integrated conference concentrating its focus on Computer Science and Information Engineering. In these proceedings, you can learn much more about the Computer Science and Information Engineering research of researchers from all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in the mentioned fields. In order to meet the high standards of the Springer AISC series, the organization committee made efforts to do the following things. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, periodic review meetings were held among the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  12. Advances in Computer Science and Information Engineering Volume 1

    CERN Document Server

    Lin, Sally

    2012-01-01

    CSIE2012 is an integrated conference concentrating its focus on Computer Science and Information Engineering. In these proceedings, you can learn much more about the Computer Science and Information Engineering research of researchers from all around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in the mentioned fields. In order to meet the high standards of the Springer AISC series, the organization committee made efforts to do the following things. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, periodic review meetings were held among the reviewers about five times to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  13. New Challenges for Computing in High Energy Physics

    International Nuclear Information System (INIS)

    Santoro, Alberto

    2003-01-01

    In view of the new scientific programs established for the LHC (Large Hadron Collider) era, the way to face the technological challenges in computing was to develop a new concept of GRID computing. We show some examples and, in particular, a proposal for high energy physicists in countries like Brazil. Due to the large amount of data and the need for close collaboration it will be impossible to work in research centers and universities very far from Fermilab or CERN unless a GRID architecture is built. An important effort is being made by the international community to bring their computing infrastructure and networks up to date.

  14. Phase transitions in least-effort communications

    International Nuclear Information System (INIS)

    Prokopenko, Mikhail; Ay, Nihat; Obst, Oliver; Polani, Daniel

    2010-01-01

    We critically examine a model that attempts to explain the emergence of power laws (e.g., Zipf's law) in human language. The model is based on the principle of least effort in communications—specifically, the overall effort is balanced between the speaker effort and listener effort, with some trade-off. It has been shown that an information-theoretic interpretation of this principle is sufficiently rich to explain the emergence of Zipf's law in the vicinity of the transition between referentially useless systems (one signal for all referable objects) and indexical reference systems (one signal per object). The phase transition is defined in the space of communication accuracy (information content) expressed in terms of the trade-off parameter. Our study explicitly solves the continuous optimization problem, subsuming a recent, more specific result obtained within a discrete space. The obtained results contrast Zipf's law found by heuristic search (that attained only local minima) in the vicinity of the transition between referentially useless systems and indexical reference systems, with an inverse-factorial (sub-logarithmic) law found at the transition that corresponds to global minima. The inverse-factorial law is observed to be the most representative frequency distribution among optimal solutions
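
    In Ferrer i Cancho-style least-effort models, the combined cost is typically a convex combination of a speaker cost and a listener cost. A hedged sketch of such an objective (notation assumed here; exact cost definitions and the role of the trade-off parameter vary across papers, including the one above) is:

        % Hedged sketch of a least-effort objective; S = signals,
        % R = referable objects, \lambda = trade-off parameter.
        \Omega(\lambda) = \lambda \, H(S) + (1 - \lambda) \, H(R \mid S),
        \qquad 0 \le \lambda \le 1,

    where H(S) penalizes the speaker's coding effort and H(R|S) the listener's residual ambiguity; per the abstract, Zipf-like distributions appear in the vicinity of the critical trade-off at which the two pressures balance.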

  15. Men's Work Efforts and the Transition to Fatherhood.

    Science.gov (United States)

    Astone, Nan Marie; Dariotis, Jacinda; Sonenstein, Freya; Pleck, Joseph H; Hynes, Kathryn

    2010-03-01

    In this paper we tested three hypotheses: (a) the transition to fatherhood is associated with an increase in work effort; (b) the positive association (if any) between the transition to fatherhood and work effort is greater for fathers who are married at the time of the transition; and (c) the association (if any) is greater for men who make the transition at younger ages. The data are from the National Longitudinal Survey of Youth 1979 Cohort. The transition to fatherhood was associated with an increase in work effort among young unmarried men, but not for married men. Among married men who were on-time fathers, work effort decreased. Among childless men, the marriage transition was associated with increased work effort.

  16. SIGMA without effort

    International Nuclear Information System (INIS)

    Hagedorn, R.; Reinfelds, J.

    1978-01-01

    SIGMA (System for Interactive Graphical Analysis) is an interactive computing language with automatic array handling and graphical facilities. It is designed as a tool for mathematical problem solving. The SIGMA language is simple, almost obvious, yet flexible and powerful. This tutorial introduces the beginner to SIGMA. It is intended to be used at a graphics terminal with access to SIGMA. The user will learn the language in dialogue with the system, in sixteen sessions of about one hour each. The first session already enables the user to compute and display functions of one or two variables. (Auth.)

  17. Dissociating variability and effort as determinants of coordination.

    Directory of Open Access Journals (Sweden)

    Ian O'Sullivan

    2009-04-01

    Full Text Available When coordinating movements, the nervous system often has to decide how to distribute work across a number of redundant effectors. Here, we show that humans solve this problem by trying to minimize both the variability of motor output and the effort involved. In previous studies that investigated the temporal shape of movements, these two selective pressures, despite having very different theoretical implications, could not be distinguished; because noise in the motor system increases with the motor commands, minimization of effort or variability leads to very similar predictions. When multiple effectors with different noise and effort characteristics have to be combined, however, these two cost terms can be dissociated. Here, we measure the importance of variability and effort in coordination by studying how humans share force production between two fingers. To capture variability, we identified the coefficient of variation of the index and little fingers. For effort, we used the sum of squared forces and the sum of squared forces normalized by the maximum strength of each effector. These terms were then used to predict the optimal force distribution for a task in which participants had to produce a target total force of 4-16 N, by pressing onto two isometric transducers using different combinations of fingers. By comparing the predicted distribution across fingers to the actual distribution chosen by participants, we were able to estimate the relative importance of variability and effort at 1:7, with the unnormalized effort being most important. Our results indicate that the nervous system uses multi-effector redundancy to minimize both the variability of the produced output and effort, although effort costs clearly outweighed variability costs.
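
    The decision problem in this finger-sharing task can be written as a small constrained optimization. The toy sketch below splits a target force to minimize a weighted sum of signal-dependent variability and squared-force effort; the coefficients are invented, and only the 1:7 weighting echoes the abstract.

        # Toy force-sharing problem in the spirit of the study above: split a
        # target total force between two fingers to minimize a weighted sum of
        # signal-dependent variability and squared-force effort.
        import numpy as np
        from scipy.optimize import minimize

        TARGET = 8.0               # required total force (N)
        CV = (0.05, 0.12)          # assumed coefficients of variation
        W_VAR, W_EFF = 1.0, 7.0    # relative weights (effort-dominated)

        def cost(f):
            f1, f2 = f
            variability = (CV[0] * f1) ** 2 + (CV[1] * f2) ** 2
            effort = f1 ** 2 + f2 ** 2   # unnormalized effort
            return W_VAR * variability + W_EFF * effort

        res = minimize(cost, x0=[TARGET / 2, TARGET / 2],
                       constraints={"type": "eq",
                                    "fun": lambda f: f[0] + f[1] - TARGET},
                       bounds=[(0, None), (0, None)])
        print("index finger: %.2f N, little finger: %.2f N" % tuple(res.x))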

  18. Positioning Continuing Education Computer Programs for the Corporate Market.

    Science.gov (United States)

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  19. GATE: Improving the computational efficiency

    International Nuclear Information System (INIS)

    Staelens, S.; De Beenhouwer, J.; Kruecker, D.; Maigne, L.; Rannou, F.; Ferrer, L.; D'Asseler, Y.; Buvat, I.; Lemahieu, I.

    2006-01-01

    GATE is a software package dedicated to Monte Carlo simulations in Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). An important disadvantage of those simulations is the fundamental burden of computation time. This manuscript describes three different techniques to improve the efficiency of those simulations. Firstly, the implementation of variance reduction techniques (VRTs), more specifically the incorporation of geometrical importance sampling, is discussed. After this, the newly designed cluster version of the GATE software is described. The experiments have shown that GATE simulations scale very well on a cluster of homogeneous computers. Finally, an elaboration on the deployment of GATE on the Enabling Grids for E-Science in Europe (EGEE) grid will conclude the description of efficiency enhancement efforts. The three aforementioned methods improve the efficiency of GATE to a large extent and make realistic patient-specific overnight Monte Carlo simulations achievable.

  20. How do different components of Effortful Control contribute to children's mathematics achievement?

    Science.gov (United States)

    Sánchez-Pérez, Noelia; Fuentes, Luis J; Pina, Violeta; López-López, Jose A; González-Salinas, Carmen

    2015-01-01

    This work sought to investigate the specific contribution of two different components of Effortful Control (EC), attentional focusing (AF) and inhibitory control, to children's mathematics achievement. The sample was composed of 142 children aged 9-12 years. EC components were measured through the Temperament in Middle Childhood Questionnaire (TMCQ; parent's report); math achievement was measured via teacher's report and through the standard Woodcock-Johnson test. Additionally, the contribution of other cognitive and socio-emotional processes was taken into account. Our results showed that only AF significantly contributed to the variance of children's mathematics achievement; interestingly, mediational models showed that the relationship between effortful attentional self-regulation and mathematics achievement was mediated by academic peer popularity, as well as by intelligence and study skills. Results are discussed in the light of current theories on the role of children's self-regulation abilities in the context of school.

  1. Computational protein design-the next generation tool to expand synthetic biology applications.

    Science.gov (United States)

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

    One powerful approach to engineering synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins that have not only been validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells illustrate both the challenges and opportunities at the intersection of protein design and synthetic biology. We also highlight protein design approaches which, although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.

  2. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of several thousand processors, used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experience gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  3. 24 CFR 990.155 - Addition and deletion of units.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Addition and deletion of units. 990.155 Section 990.155 Housing and Urban Development Regulations Relating to Housing and Urban...; Computation of Eligible Unit Months § 990.155 Addition and deletion of units. (a) Changes in public housing...

  4. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic.

    Science.gov (United States)

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-09-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources to expedite analysis and reporting. Cloud-based computing environments can be set up in a fraction of the time and with a fraction of the effort required by traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic.
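
    As a flavor of the AWS step, a minimal boto3 sketch for launching the worker nodes of such a cluster is shown below. The AMI ID, key pair name, and instance type are placeholders; an image with NONMEM, PsN, and Grid Engine preinstalled is assumed, which stock Amazon images do not provide.

    ```python
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")

    # Launch four compute nodes from a hypothetical pre-built image that
    # already contains NONMEM, PsN, and Grid Engine (placeholder IDs).
    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI
        InstanceType="c5.4xlarge",
        MinCount=4,
        MaxCount=4,
        KeyName="my-keypair",              # placeholder key pair
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "grid-engine-exec"}],
        }],
    )
    for inst in instances:
        print(inst.id)
    ```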

  5. Not all effort is equal: the role of the anterior cingulate cortex in different forms of effort-reward decisions

    Directory of Open Access Journals (Sweden)

    Victoria eHolec

    2014-01-01

    Full Text Available. The rat anterior cingulate cortex (ACC) mediates effort-based decision making when the task requires the physical effort of climbing a ramp. Normal rats will readily climb a barrier leading to high reward whereas rats with ACC lesions will opt instead for an easily obtained small reward. The present study explored whether the role of the ACC in cost-benefit decisions extends beyond climbing by testing its role in ramp climbing as well as two novel cost-benefit decision tasks, one involving the physical effort of lifting weights and the other the emotional cost of overcoming fear (i.e., courage). As expected, rats with extensive ACC lesions tested on a ramp-climbing task were less likely to choose a high-reward/high-effort arm than sham controls. However, during the first few trials, lesioned rats were as likely as controls to initially turn into the high-reward arm but far less likely to actually climb the barrier, suggesting that the role of the ACC is not in deciding which course of action to pursue, but rather in maintaining a course of action in the face of countervailing forces. In the effort-reward decision task involving weight lifting, some lesioned animals behaved like controls while others avoided the high-reward arm. However, the results were not statistically significant, and a follow-up study using incrementally increasing effort failed to show any difference between lesion and control groups. The results suggest that the ACC is not needed for effort-reward decisions involving weight lifting but may affect motor abilities. Finally, a courage task explored the willingness of rats to overcome the fear of crossing an open, exposed arm to obtain a high reward. Both sham and ACC-lesioned animals exhibited equal tendencies to enter the open arm. However, whereas sham animals gradually improved on the task, ACC-lesioned rats did not. Taken together, the results suggest that the role of the ACC in effort-reward decisions may be limited to certain

  6. Independent validation of the MMPI-2-RF Somatic/Cognitive and Validity scales in TBI Litigants tested for effort.

    Science.gov (United States)

    Youngjohn, James R; Wershba, Rebecca; Stevenson, Matthew; Sturgeon, John; Thomas, Michael L

    2011-04-01

    The MMPI-2 Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008) is replacing the MMPI-2 as the most widely used personality test in neuropsychological assessment, but additional validation studies are needed. Our study examines MMPI-2-RF Validity scales and the newly created Somatic/Cognitive scales in a recently reported sample of 82 traumatic brain injury (TBI) litigants who either passed or failed effort tests (Thomas & Youngjohn, 2009). The restructured Validity scales FBS-r (restructured symptom validity), F-r (restructured infrequent responses), and the newly created Fs (infrequent somatic responses) were not significant predictors of TBI severity. FBS-r was significantly related to passing or failing effort tests, and Fs and F-r showed non-significant trends in the same direction. Elevations on the Somatic/Cognitive scales profile (MLS-malaise, GIC-gastrointestinal complaints, HPC-head pain complaints, NUC-neurological complaints, and COG-cognitive complaints) were significant predictors of effort test failure. Additionally, HPC had the anticipated paradoxical inverse relationship with head injury severity. The Somatic/Cognitive scales as a group were better predictors of effort test failure than the RF Validity scales, which was an unexpected finding. MLS arose as the single best predictor of effort test failure of all RF Validity and Somatic/Cognitive scales. Item overlap analysis revealed that all MLS items are included in the original MMPI-2 Hy scale, making MLS essentially a subscale of Hy. This study validates the MMPI-2-RF as an effective tool for use in neuropsychological assessment of TBI litigants.

  7. Artificial Intelligence In Computational Fluid Dynamics

    Science.gov (United States)

    Vogel, Alison Andrews

    1991-01-01

    Paper compares four first-generation artificial-intelligence (AI) software systems for computational fluid dynamics. Includes: Expert Cooling Fan Design System (EXFAN), PAN AIR Knowledge System (PAKS), grid-adaptation program MITOSIS, and Expert Zonal Grid Generation (EZGrid). Focuses on knowledge-based ("expert") software systems. Analyzes intended tasks, kinds of knowledge possessed, magnitude of effort required to codify knowledge, how quickly constructed, performances, and return on investment. On basis of comparison, concludes AI most successful when applied to well-formulated problems solved by classifying or selecting preenumerated solutions. In contrast, application of AI to poorly understood or poorly formulated problems generally results in long development time and large investment of effort, with no guarantee of success.

  8. Edge computing technologies for Internet of Things: a primer

    Directory of Open Access Journals (Sweden)

    Yuan Ai

    2018-04-01

    Full Text Available. With the rapid development of mobile internet and Internet of Things applications, the conventional centralized cloud computing is encountering severe challenges, such as high latency, low Spectral Efficiency (SE), and non-adaptive machine type of communication. Motivated to solve these challenges, a new technology is driving a trend that shifts the function of centralized cloud computing to edge devices of networks. Several edge computing technologies originating from different backgrounds to decrease latency, improve SE, and support the massive machine type of communication have been emerging. This paper comprehensively presents a tutorial on three typical edge computing technologies, namely mobile edge computing, cloudlets, and fog computing. In particular, the standardization efforts, principles, architectures, and applications of these three technologies are summarized and compared. From the viewpoint of radio access network, the differences between mobile edge computing and fog computing are highlighted, and the characteristics of fog computing-based radio access network are discussed. Finally, open issues and future research directions are identified as well. Keywords: Internet of Things (IoT), Mobile edge computing, Cloudlets, Fog computing

  9. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to model and predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, use of dynamic code modification as part of a framework to support adaptive software is outlined.

  10. Optimal Computing Budget Allocation for Particle Swarm Optimization in Stochastic Optimization.

    Science.gov (United States)

    Zhang, Si; Xu, Jie; Lee, Loo Hay; Chew, Ek Peng; Wong, Wai Peng; Chen, Chun-Hung

    2017-04-01

    Particle Swarm Optimization (PSO) is a popular metaheuristic for deterministic optimization. Originating in interpretations of the movement of individuals in a bird flock or fish school, PSO introduces the concepts of personal best and global best to simulate the pattern of searching for food by flocking, and successfully translates these natural phenomena to the optimization of complex functions. Many real-life applications of PSO cope with stochastic problems. To solve a stochastic problem using PSO, a straightforward approach is to equally allocate computational effort among all particles and obtain the same number of samples of fitness values. This is not an efficient use of the computational budget and leaves considerable room for improvement. This paper proposes a seamless integration of the concept of optimal computing budget allocation (OCBA) into PSO to improve the computational efficiency of PSO for stochastic optimization problems. We derive an asymptotically optimal allocation rule to intelligently determine the number of samples for all particles such that the PSO algorithm can efficiently select the personal best and global best when there is stochastic estimation noise in fitness values. We also propose an easy-to-implement sequential procedure. Numerical tests show that our new approach can obtain much better results using the same amount of computational effort.
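
    A minimal sketch of the classical OCBA allocation rule that the abstract builds on, applied to a set of candidate solutions with estimated mean and standard deviation of noisy fitness; the integration with PSO's personal-best and global-best bookkeeping is not reproduced here.

    ```python
    import numpy as np

    def ocba_allocation(means, stds, budget):
        """Classical OCBA rule (minimization): for non-best designs i,
        N_i is proportional to (std_i / delta_i)^2, where delta_i is the
        gap to the best mean; the best design b receives
        N_b = std_b * sqrt(sum_{i != b} (N_i / std_i)^2).
        Ties (delta_i == 0 for some i != b) would need special handling."""
        means = np.asarray(means, dtype=float)
        stds = np.asarray(stds, dtype=float)
        b = int(np.argmin(means))
        nonbest = np.arange(means.size) != b
        delta = means - means[b]
        ratio = np.empty_like(means)
        ratio[nonbest] = (stds[nonbest] / delta[nonbest]) ** 2
        ratio[b] = stds[b] * np.sqrt(np.sum((ratio[nonbest] / stds[nonbest]) ** 2))
        return budget * ratio / ratio.sum()

    # Example: four candidate particles with noisy fitness estimates
    print(ocba_allocation([1.0, 1.2, 1.5, 2.0], [0.3, 0.3, 0.4, 0.2], budget=1000))
    ```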

  11. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  12. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  13. Programming effort analysis of the ELLPACK language

    Science.gov (United States)

    Rice, J. R.

    1978-01-01

    ELLPACK is a problem statement language and system for elliptic partial differential equations which is implemented by a FORTRAN preprocessor. ELLPACK's principal purpose is as a tool for the performance evaluation of software. However, it is used here as an example with which to study the programming effort required for problem solving. It is obvious that problem statement languages can reduce programming effort tremendously; the goal is to quantify this somewhat. This is done by analyzing the lengths and effort (as measured by Halstead's software science technique) of various approaches to solving these problems.
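
    For readers unfamiliar with Halstead's software science measures, a sketch of the effort computation from operator and operand counts follows; tokenizing a program into operators and operands, the hard part in practice, is assumed to be done already.

    ```python
    import math

    def halstead_effort(operators, operands):
        """Halstead software-science metrics from a program's token streams.
        `operators` and `operands` are lists of tokens; producing them from
        source code is assumed to be handled by a separate tokenizer."""
        eta1, eta2 = len(set(operators)), len(set(operands))  # distinct operators/operands
        N1, N2 = len(operators), len(operands)                # total occurrences
        volume = (N1 + N2) * math.log2(eta1 + eta2)           # V = N log2(eta)
        difficulty = (eta1 / 2.0) * (N2 / eta2)               # D = (eta1/2)(N2/eta2)
        return difficulty * volume                            # effort E = D * V

    # Tiny example: tokens of `x = x + 1`
    print(halstead_effort(["=", "+"], ["x", "x", "1"]))
    ```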

  14. Effort and Displeasure in People Who Are Hard of Hearing.

    Science.gov (United States)

    Matthen, Mohan

    2016-01-01

    Listening effort helps explain why people who are hard of hearing are prone to fatigue and social withdrawal. However, a one-factor model that cites only effort due to hardness of hearing is insufficient as there are many who lead happy lives despite their disability. This article explores other contributory factors, in particular motivational arousal and pleasure. The theory of rational motivational arousal predicts that some people forego listening comprehension because they believe it to be impossible and hence worth no effort at all. This is problematic. Why should the listening task be rated this way, given the availability of aids that reduce its difficulty? Two additional factors narrow the explanatory gap. First, we separate the listening task from the benefit derived as a consequence. The latter is temporally more distant, and is discounted as a result. The second factor is displeasure attributed to the listening task, which increases listening cost. Many who are hard of hearing enjoy social interaction. In such cases, the actual activity of listening is a benefit, not a cost. These people also reap the benefits of listening, but do not have to balance these against the displeasure of the task. It is suggested that if motivational harmony can be induced by training in somebody who is hard of hearing, then the obstacle to motivational arousal would be removed. This suggests a modified goal for health care professionals. Do not just teach those who are hard of hearing how to use hearing assistance devices. Teach them how to do so with pleasure and enjoyment.

  15. Creating an effort tracking tool to improve therapeutic cancer clinical trials workload management and budgeting.

    Science.gov (United States)

    James, Pam; Bebee, Patty; Beekman, Linda; Browning, David; Innes, Mathew; Kain, Jeannie; Royce-Westcott, Theresa; Waldinger, Marcy

    2011-11-01

    Quantifying data management and regulatory workload for clinical research is a difficult task that would benefit from a robust tool to assess and allocate effort. As in most clinical research environments, The University of Michigan Comprehensive Cancer Center (UMCCC) Clinical Trials Office (CTO) struggled to effectively allocate data management and regulatory time with frequently inaccurate estimates of how much time was required to complete the specific tasks performed by each role. In a dynamic clinical research environment in which volume and intensity of work ebbs and flows, determining requisite effort to meet study objectives was challenging. In addition, a data-driven understanding of how much staff time was required to complete a clinical trial was desired to ensure accurate trial budget development and effective cost recovery. Accordingly, the UMCCC CTO developed and implemented a Web-based effort-tracking application with the goal of determining the true costs of data management and regulatory staff effort in clinical trials. This tool was developed, implemented, and refined over a 3-year period. This article describes the process improvement and subsequent leveling of workload within data management and regulatory that enhanced the efficiency of UMCCC's clinical trials operation.

  16. A comparison of the additional protocols of the five nuclear weapon states and the ensuing safeguards benefits to international nonproliferation efforts

    Energy Technology Data Exchange (ETDEWEB)

    Uribe, Eva C [Los Alamos National Laboratory; Sandoval, M Analisa [Los Alamos National Laboratory; Sandoval, Marisa N [Los Alamos National Laboratory; Boyer, Brian D [Los Alamos National Laboratory; Leitch, Rosalyn M [Los Alamos National Laboratory

    2009-01-01

    With the 6 January 2009 entry into force of the Additional Protocol by the United States of America, all five declared Nuclear Weapon States that are party to the Nonproliferation Treaty have signed, ratified, and put into force the Additional Protocol. This paper compares the strengths and weaknesses of the five Additional Protocols in force by the five Nuclear Weapon States with respect to the benefits to international nonproliferation aims. It also documents the added safeguards burden that these Additional Protocols place on the five declared Nuclear Weapon States with respect to access to their civilian nuclear programs and the hosting of complementary access activities as part of the Additional Protocol.

  17. Analog Integrated Circuit Design for Spike Time Dependent Encoder and Reservoir in Reservoir Computing Processors

    Science.gov (United States)

    2018-01-01

    ...bridged high-performance computing, nanotechnology, and integrated circuits & systems. Subject terms: neuromorphic computing, neuron design, spike... This multidisciplinary effort encompassed high-performance computing, nanotechnology, integrated circuits, and integrated systems. The project’s architecture was

  18. The Future of Additive Manufacturing in Air Force Acquisition

    Science.gov (United States)

    2017-03-22

    heal, if not cure, the acquisition woes and financial death spiral. Additive Manufacturing as a Partial Solution to Acquisition Woes: “As we... methodologies.” Additive manufacturing starts with a 3D model, derived either from a 3D scanner or created via software such as a Computer Aided Design (CAD

  19. Numerical computations with GPUs

    CERN Document Server

    Kindratenko, Volodymyr

    2014-01-01

    This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including solution of linear equations and FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to

  20. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  1. CY15 Livermore Computing Focus Areas

    Energy Technology Data Exchange (ETDEWEB)

    Connell, Tom M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Cupps, Kim C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); D'Hooge, Trent E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fahey, Tim J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fox, Dave M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Futral, Scott W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gary, Mark R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Goldstone, Robin J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hamilton, Pam G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Heer, Todd M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Long, Jeff W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mark, Rich J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Morrone, Chris J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shoopman, Jerry D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Slavec, Joe A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Smith, David W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Springmeyer, Becky R [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stearman, Marc D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Watson, Py C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-20

    The LC team undertook a survey of primary Center drivers for CY15. Identified key drivers included enhancing user experience and productivity, pre-exascale platform preparation, process improvement, data-centric computing paradigms, and business expansion. The team organized critical supporting efforts into three cross-cutting focus areas: Improving Service Quality; Monitoring, Automation, Delegation and Center Efficiency; and Next Generation Compute and Data Environments. In each area the team detailed high-level challenges and identified discrete actions to address these issues during the calendar year. Identifying the Center’s primary drivers, issues, and plans is intended to serve as a lens focusing LC personnel, resources, and priorities throughout the year.

  2. Computing for magnetic fusion energy research: The next five years

    International Nuclear Information System (INIS)

    Mann, L.; Glasser, A.; Sauthoff, N.

    1991-01-01

    This report considers computing needs in magnetic fusion for the next five years. It is the result of two and a half years of effort by representatives of all aspects of the magnetic fusion community. The report also factors in the results of a survey that was distributed to the laboratories and universities that support fusion. There are four areas of computing support discussed: theory, experiment, engineering, and systems

  3. Motivation and effort in individuals with social anhedonia.

    Science.gov (United States)

    McCarthy, Julie M; Treadway, Michael T; Blanchard, Jack J

    2015-06-01

    It has been proposed that anhedonia may, in part, reflect difficulties in reward processing and effortful decision making. The current study aimed to replicate previous findings of effortful decision making deficits associated with elevated anhedonia and expand upon these findings by investigating whether these decision making deficits are specific to elevated social anhedonia or are also associated with elevated positive schizotypy characteristics. The current study compared controls (n=40) to individuals elevated on social anhedonia (n=30), and individuals elevated on perceptual aberration/magical ideation (n=30) on the Effort Expenditure for Rewards Task (EEfRT). Across groups, participants chose a higher proportion of hard tasks with increasing probability of reward and reward magnitude, demonstrating sensitivity to probability and reward values. Contrary to our expectations, when the probability of reward was most uncertain (50% probability), at low and medium reward values, the social anhedonia group demonstrated more effortful decision making than either individuals high in positive schizotypy or controls. The positive schizotypy group only differed from controls (making less effortful choices than controls) when reward probability was lowest (12%) and the magnitude of reward was the smallest. Our results suggest that social anhedonia is related to intact motivation and effort for monetary rewards, but that individuals with this characteristic display a unique and perhaps inefficient pattern of effort allocation when the probability of reward is most uncertain. Future research is needed to better understand effortful decision making and the processing of reward across a range of individual difference characteristics. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Time preferences, study effort, and academic performance

    NARCIS (Netherlands)

    Non, J.A.; Tempelaar, D.T.

    2014-01-01

    We analyze the relation between time preferences, study effort, and academic performance among first-year Business and Economics students. Time preferences are measured by stated preferences for an immediate payment over larger delayed payments. Data on study efforts are derived from an electronic

  5. Effort problem of chemical pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okrajni, J.; Ciesla, M.; Mutwil, K. [Silesian Technical University, Katowice (Poland)

    1998-12-31

    The paper addresses the assessment of the technical state of chemical pipelines working under mechanical and thermal loading. The effort of the pipelines after a long operating period is analysed. The material, geometrical, and loading conditions of the crack initiation and crack growth process in the chosen object are discussed. Areas of maximal effort are determined. The changes in material structure after the long operating period are described. Mechanisms of crack initiation and crack growth in the pipeline elements are analysed and the mutual relations between chemical and mechanical influences are shown. (orig.) 16 refs.

  6. Integrating and differentiating aspects of self-regulation: effortful control, executive functioning, and links to negative affectivity.

    Science.gov (United States)

    Bridgett, David J; Oddi, Kate B; Laake, Lauren M; Murdock, Kyle W; Bachmann, Melissa N

    2013-02-01

    Subdisciplines within psychology frequently examine self-regulation from different frameworks despite conceptually similar definitions of constructs. In the current study, similarities and differences between effortful control, based on the psychobiological model of temperament (Rothbart, Derryberry, & Posner, 1994), and executive functioning are examined and empirically tested in three studies (n = 509). Structural equation modeling indicated that effortful control and executive functioning are strongly associated and overlapping constructs (Study 1). Additionally, results indicated that effortful control is related to the executive function of updating/monitoring information in working memory, but not inhibition (Studies 2 and 3). Study 3 also demonstrates that better updating/monitoring information in working memory and better effortful control were uniquely linked to lower dispositional negative affect, whereas the executive function of low/poor inhibition was uniquely associated with an increased tendency to express negative affect. Furthermore, dispositional negative affect mediated the links between effortful control and, separately, the executive function of updating/monitoring information in working memory and the tendency to express negative affect. The theoretical implications of these findings are discussed, and a potential framework for guiding future work directed at integrating and differentiating aspects of self-regulation is suggested. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  7. Discovery of technical methanation catalysts based on computational screening

    DEFF Research Database (Denmark)

    Sehested, Jens; Larsen, Kasper Emil; Kustov, Arkadii

    2007-01-01

    Methanation is a classical reaction in heterogeneous catalysis and significant effort has been put into improving the industrially preferred nickel-based catalysts. Recently, a computational screening study showed that nickel-iron alloys should be more active than the pure nickel catalyst and at ...

  8. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment

  9. A Brief Review of Computer-Assisted Approaches to Rational Design of Peptide Vaccines

    Directory of Open Access Journals (Sweden)

    Ashesh Nandy

    2016-05-01

    Full Text Available. The growing incidence of new viral diseases and increasingly frequent viral epidemics have strained therapeutic and preventive measures; the high mutability of viral genes puts additional strain on developmental efforts. Given the high cost and time requirements for new drug development, vaccines remain a viable alternative, but there too the traditional techniques of live-attenuated or inactivated vaccines carry the danger of allergenic and other reactions. Peptide vaccines have, over the last several years, begun to be looked on as more appropriate alternatives, which are economically affordable, require less time for development, and hold the promise of multi-valent dosages. The developments in bioinformatics, proteomics, immunogenomics, structural biology and other sciences have spurred the growth of vaccinomics, where computer-assisted approaches serve to identify suitable peptide targets for eventual development of vaccines. In this mini-review we give a brief overview of some of the recent trends in computer-assisted vaccine development with emphasis on the primary selection procedures of probable peptide candidates for vaccine development.

  10. Consolidation of cloud computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall; Giordano, Domenico

    2017-01-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in resp...

  11. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A. [and others]

    1995-12-31

    In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and assess whether VNIIEF codes are suitable for RBMK-type reactor safety assessment computation. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), covering cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA); these computations were performed in the geometry and with the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system (CPS) control movement in a core.

  12. Mechanical Properties of Additively Manufactured Thick Honeycombs

    Directory of Open Access Journals (Sweden)

    Reza Hedayati

    2016-07-01

    Full Text Available. Honeycombs resemble the structure of a number of natural and biological materials such as cancellous bone, wood, and cork. Thick honeycombs could also be used for energy absorption applications. Moreover, studying the mechanical behavior of honeycombs under in-plane loading could help in understanding the mechanical behavior of more complex 3D tessellated structures such as porous biomaterials. In this paper, we study the mechanical behavior of thick honeycombs made using additive manufacturing techniques that allow for fabrication of honeycombs with arbitrary and precisely controlled thickness. Thick honeycombs with different wall thicknesses were produced from polylactic acid (PLA) using fused deposition modelling, i.e., an additive manufacturing technique. The samples were mechanically tested in-plane under compression to determine their mechanical properties. We also obtained exact analytical solutions for the stiffness matrix of thick hexagonal honeycombs using both Euler-Bernoulli and Timoshenko beam theories. The stiffness matrix was then used to derive analytical relationships that describe the elastic modulus, yield stress, and Poisson's ratio of thick honeycombs. Finite element models were also built for computational analysis of the mechanical behavior of thick honeycombs under compression. The mechanical properties obtained using our analytical relationships were compared with experimental observations and computational results as well as with analytical solutions available in the literature. It was found that the analytical solutions presented here are in good agreement with experimental and computational results even for very thick honeycombs, whereas the analytical solutions available in the literature show a large deviation from experimental observations, computational results, and our analytical solutions.
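
    For comparison, the classical thin-wall result from the literature can be written down directly; it is the kind of formula the authors' thick-wall solutions are measured against. The function below implements the standard Gibson-Ashby Euler-Bernoulli expression, not the paper's Timoshenko-based relationships.

    ```python
    import math

    def hex_honeycomb_modulus(E_s, t, l, h=None, theta_deg=30.0):
        """Effective in-plane Young's modulus of a hexagonal honeycomb from
        the classical thin-wall (Gibson-Ashby) result:
        E1*/Es = (t/l)^3 * cos(theta) / ((h/l + sin(theta)) * sin(theta)^2).
        Valid only for slender walls (t/l << 1)."""
        h = l if h is None else h
        th = math.radians(theta_deg)
        return E_s * (t / l) ** 3 * math.cos(th) / ((h / l + math.sin(th)) * math.sin(th) ** 2)

    # Regular hexagon (h = l, theta = 30 deg) reduces to E1*/Es ~= 2.3 (t/l)^3.
    # Illustrative PLA-like wall modulus of 3.5 GPa, dimensions in mm:
    print(hex_honeycomb_modulus(E_s=3.5e9, t=1.0, l=10.0))
    ```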

  13. 15 CFR 930.114 - Secretarial mediation efforts.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Secretarial mediation efforts. 930.114... MANAGEMENT FEDERAL CONSISTENCY WITH APPROVED COASTAL MANAGEMENT PROGRAMS Secretarial Mediation § 930.114 Secretarial mediation efforts. (a) Following the close of the hearing, the hearing officer shall transmit the...

  14. Effort and Selection Effects of Incentive Contracts

    NARCIS (Netherlands)

    Bouwens, J.F.M.G.; van Lent, L.A.G.M.

    2003-01-01

    We show that the improved effort of employees associated with incentive contracts depends on the properties of the performance measures used in the contract. We also find that the power of incentives in the contract is only indirectly related to any improved employee effort. High powered incentive

  15. Stochastic evolutionary dynamics in minimum-effort coordination games

    Science.gov (United States)

    Li, Kun; Cong, Rui; Wang, Long

    2016-08-01

    The minimum-effort coordination game has recently drawn more attention for the fact that human behavior in this social dilemma is often inconsistent with the predictions of classical game theory. Here, we combine evolutionary game theory and coalescence theory to investigate this game in finite populations. Both analytic results and individual-based simulations show that effort costs play a key role in the evolution of contribution levels, which is in good agreement with those observed experimentally. Besides well-mixed populations, set-structured populations have also been taken into consideration. Therein we find that a large number of sets and a moderate migration rate greatly promote effort levels, especially for high effort costs.
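
    A toy individual-based simulation of the kind the abstract mentions: a well-mixed population updating effort levels by pairwise comparison (Fermi rule) in the minimum-effort game. All parameter values are illustrative and the update protocol is a generic choice, not necessarily the authors' exact scheme.

    ```python
    import math
    import random

    A, C = 1.0, 0.6            # benefit of the group minimum vs. cost of own effort (0 < C < A)
    LEVELS = [1, 2, 3, 4]      # discrete effort levels (illustrative)
    N, BETA, STEPS = 100, 5.0, 20000

    def payoff(own, partner):
        # Standard minimum-effort payoff: A * min(efforts) - C * (own effort)
        return A * min(own, partner) - C * own

    pop = [random.choice(LEVELS) for _ in range(N)]
    for _ in range(STEPS):
        i, j = random.sample(range(N), 2)
        pi = payoff(pop[i], pop[random.randrange(N)])   # each evaluated against a random partner
        pj = payoff(pop[j], pop[random.randrange(N)])
        # Fermi rule: i imitates j with probability increasing in the payoff gap
        if random.random() < 1.0 / (1.0 + math.exp(-BETA * (pj - pi))):
            pop[i] = pop[j]

    print("mean effort:", sum(pop) / N)   # lower C typically sustains higher effort
    ```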

  16. ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers

    Science.gov (United States)

    Torrent, Marc

    2014-03-01

    For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions which allows it to treat systems of any kind. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performance of ABINIT - especially for what concerns standard LDA/GGA ground-state and response-function calculations - several strategies have been followed: A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory. It allows an increase in the number of distributed processes and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem ("Locally Optimal Blocked Conjugate Gradient"), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane-waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (openMP/openACC) or porting some consuming code sections to Graphics Processing Units (GPU). As no simple performance model exists, the complexity of use has increased; the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performance of several process distributions and automatically choose the most favourable one. On the other hand, a big effort has been carried out to analyse the performance of the code on petascale architectures, showing which sections of code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code scalability will be described. They are based on an exploration of new diagonalization

  17. Molecular computing towards a novel computing architecture for complex problem solving

    CERN Document Server

    Chang, Weng-Long

    2014-01-01

    This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main difference between molecular circuits and traditional digital circuits in manipulating the same problem, and you will also learn how to design a molecular algorithm for solving a problem from start to finish. The book starts with an introduction to computational aspects of digital computers and molecular computing, data representation, molecular operations, and number representation in molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes in digital communication, to encode integers of different formats, single precision and double precision floating-point numbers, to implement addition and subtraction of unsigned integers, and to construct logic operations

  18. New or improved computational methods and advanced reactor design

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Takeda, Toshikazu; Ushio, Tadashi

    1997-01-01

    Nuclear computational methods have been studied continuously to date as a fundamental technology supporting nuclear development. At present, research on computational methods based on new theory, and on calculation methods once thought too difficult to put into practice, continues actively, finding new openings thanks to the remarkable improvement in computer capabilities. In Japan, many light water reactors are now in operation, new computational methods are being introduced for nuclear design, and much effort is concentrated on further improving economics and safety. In this paper, some new research results on nuclear computational methods and their application to reactor nuclear design are described to introduce recent trends in reactor nuclear design: 1) advancement of computational methods, 2) core design and management of light water reactors, and 3) nuclear design of fast reactors. (G.K.)

  19. Laser Additive Manufacturing of Magnetic Materials

    Science.gov (United States)

    Mikler, C. V.; Chaudhary, V.; Borkar, T.; Soni, V.; Jaeger, D.; Chen, X.; Contieri, R.; Ramanujan, R. V.; Banerjee, R.

    2017-03-01

    While laser additive manufacturing is becoming increasingly important in the context of next-generation manufacturing technologies, most current research efforts focus on optimizing process parameters for the processing of mature alloys for structural applications (primarily stainless steels, titanium base, and nickel base alloys) from pre-alloyed powder feedstocks to achieve properties superior to conventionally processed counterparts. However, laser additive manufacturing or processing can also be applied to functional materials. This article focuses on the use of directed energy deposition-based additive manufacturing technologies, such as the laser engineered net shaping (LENS™) process, to deposit magnetic alloys. Three case studies are presented: Fe-30 at.%Ni, permalloys of the type Ni-Fe-V and Ni-Fe-Mo, and Fe-Si-B-Cu-Nb (derived from Finemet) alloys. All these alloys have been processed from a blend of elemental powders used as the feedstock, and their resultant microstructures, phase formation, and magnetic properties are discussed in this paper. Although these alloys were produced from a blend of elemental powders, they exhibited relatively uniform microstructures and comparable magnetic properties to those of their conventionally processed counterparts.

  20. Additive manufacturing: state-of-the-art and application framework

    Directory of Open Access Journals (Sweden)

    Vinícius Picanço Rodrigues

    2017-09-01

    Full Text Available. Additive manufacturing encompasses a class of production processes with increasing applications in different areas and supply chains. Due to its flexibility for production in small batches and its versatility of materials and geometries, this technology is recognized as being capable of revolutionizing production processes as well as changing the production strategies that are currently employed. However, there are different technologies under the generic label of additive manufacturing, and materials and application areas with different requirements. Given the growing importance of additive manufacturing as a production process, and also considering the need for better insight into the potential applications to drive research and development efforts, this article proposes an organization of additive manufacturing applications into seven areas. Additionally, the article provides a panorama of the current development stage of this technology, with a review of its major technological variants. The results presented aim to serve as a basis to support initiatives in additive manufacturing in companies, development agencies and research institutions.

  1. TUNL computer facilities

    International Nuclear Information System (INIS)

    Boyd, M.; Edwards, S.E.; Gould, C.R.; Roberson, N.R.; Westerfeldt, C.R.

    1985-01-01

    The XSYS system has been relatively stable during the last year, and most of our efforts have involved routine software maintenance and enhancement of existing XSYS capabilities. Modifications were made in the MBD program GDAP to increase the execution speed in key GDAP routines. A package of routines has been developed to allow communication between the XSYS and the new Wien filter microprocessor. Recently the authors have upgraded their operating system from VMS V3.7 to V4.1. This required numerous modifications to XSYS, mostly in the command procedures. A new reorganized edition of the XSYS manual will be issued shortly. The TUNL High Resolution Laboratory's VAX 11/750 computer has been in operation for its first full year as a replacement for the PRIME 300 computer which was purchased in 1974 and retired nine months ago. The data acquisition system on the VAX has been in use for the past twelve months performing a number of experiments

  2. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  3. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  4. Computational Design of Batteries from Materials to Systems

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Kandler A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Santhanagopalan, Shriram [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yang, Chuanbo [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Graf, Peter A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Usseglio Viretta, Francois L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Li, Qibo [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Finegan, Donal [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Pesaran, Ahmad A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yao, Koffi (Pierre) [Argonne National Laboratory; Abraham, Daniel [Argonne National Laboratory; Dees, Dennis [Argonne National Laboratory; Jansen, Andy [Argonne National Laboratory; Mukherjee, Partha [Texas A&M University; Mistry, Aashutosh [Texas A&M University; Verma, Ankit [Texas A&M University; Lamb, Josh [Sandia National Laboratories; Darcy, Eric [NASA

    2017-09-01

    Computer models are helping to accelerate the design and validation of next-generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting the electrochemical performance and the thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification (including under aging), and predicting the performance of a proposed electrode material recipe a priori using microstructure models.

  5. Recruiting Women into Computer Science and Information Systems

    Science.gov (United States)

    Broad, Steven; McGee, Meredith

    2014-01-01

    While many technical disciplines have reached or are moving toward gender parity in the number of bachelor's degrees in those fields, the percentage of women graduating in computer science remains stubbornly low. Many recent efforts to address this situation have focused on retention of undergraduate majors or graduate students, recruiting…

  6. Status of development of a code for predicting the migration of ground additions - MOGRA

    International Nuclear Information System (INIS)

    Amano, Hikaru; Uchida, Shigeo; Matsuoka, Syungo; Ikeda, Hiroshi; Hayashi, Hiroko; Kurosawa, Naohiro

    2003-01-01

    MOGRA (Migration Of GRound Additions) is a migration prediction code for toxic ground additions, including radioactive materials, in a terrestrial environment. MOGRA consists of computational codes that are applicable to various evaluation target systems and can be used on personal computers. The computational code has a dynamic compartment analysis block at its core, a graphical user interface (GUI) for computation parameter settings and result displays, databases, and so on. The compartments are obtained by classifying various natural environments into groups that exhibit similar properties. These codes are able to create or delete compartments and to set the migration of environmental-load substances between compartments by a simple mouse operation. The system features universality and excellent expandability in the application of computations to various nuclides. (author)
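
    The compartment dynamics such a code solves reduce to a system of linear first-order transfer equations. The following is a minimal sketch of that idea, not MOGRA itself; the compartment names and rate constants are hypothetical.

```python
# Minimal sketch of a dynamic compartment model of the kind MOGRA's core
# analysis block solves: first-order transfer of a substance between
# environmental compartments. All rate constants here are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

# k[i, j] = transfer rate (1/day) from compartment i to compartment j.
# Compartments: 0 = atmosphere, 1 = soil, 2 = river water (illustrative).
k = np.array([[0.0, 0.05, 0.01],
              [0.0, 0.0,  0.002],
              [0.0, 0.0,  0.0]])

def dydt(t, y):
    outflow = k.sum(axis=1) * y   # total loss rate from each compartment
    inflow = k.T @ y              # gains from all other compartments
    return inflow - outflow

y0 = [1.0, 0.0, 0.0]              # unit deposition in the atmosphere
sol = solve_ivp(dydt, (0.0, 365.0), y0)
print(sol.y[:, -1])               # compartment inventories after one year
```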

  7. Quantum Computers: A New Paradigm in Information Technology

    Directory of Open Access Journals (Sweden)

    Mahesh S. Raisinghani

    2001-01-01

    The word 'quantum' comes from the Latin word quantus, meaning 'how much'. Quantum computing is a fundamentally new mode of information processing that can be performed only by harnessing physical phenomena unique to quantum mechanics (especially quantum interference). Paul Benioff of the Argonne National Laboratory first applied quantum theory to computers in 1981, and David Deutsch of Oxford proposed quantum parallel computers in 1985, years before the realization of qubits in 1995. However, it may be well into the 21st century before we see quantum computing used at a commercial level, for a variety of reasons discussed in this paper. The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This paper discusses some of the current advances, applications, and challenges of quantum computing, as well as its impact on corporate computing and implications for management. It shows how quantum computing can be utilized to process and store information, as well as its impact on cryptography for perfectly secure communication, algorithmic searching, factorizing large numbers very rapidly, and simulating quantum-mechanical systems efficiently. A broad interdisciplinary effort will be needed if quantum computers are to fulfill their destiny as the world's fastest computing devices.

  8. Second-order particle-in-cell (PIC) computational method in the one-dimensional variable Eulerian mesh system

    International Nuclear Information System (INIS)

    Pyun, J.J.

    1981-01-01

    As part of an effort to incorporate the variable Eulerian mesh into the second-order PIC computational method, a truncation error analysis was performed to calculate the second-order error terms for the variable Eulerian mesh system. The results show that the maximum mesh size increment/decrement is limited to α(Δr_i)², where Δr_i is the non-dimensional mesh size of the ith cell and α is a constant of order one. The numerical solutions of Burgers' equation by the second-order PIC method in the variable Eulerian mesh system were compared with its exact solution. It was found that the second-order accuracy of the PIC method was maintained under the above condition. Additional problems were analyzed using the second-order PIC method in both variable and uniform Eulerian mesh systems. The results indicate that the second-order PIC method in the variable Eulerian mesh system can provide substantial computational time savings with no loss in accuracy.
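
    The mesh-growth constraint is easy to check programmatically. The sketch below encodes one reading of the condition, |Δr_(i+1) − Δr_i| ≤ α(Δr_i)²; the helper name and the example meshes are illustrative assumptions, not from the paper.

```python
# Sketch of the second-order mesh constraint described above: neighboring
# cell sizes of a variable Eulerian mesh may differ by no more than
# alpha * dr_i**2 (alpha of order one). Meshes below are hypothetical.
def mesh_is_second_order(dr, alpha=1.0):
    return all(abs(dr[i + 1] - dr[i]) <= alpha * dr[i] ** 2
               for i in range(len(dr) - 1))

# Geometric stretching obeys the bound only while the ratio stays near 1:
dr_ok = [0.10 * 1.01 ** i for i in range(20)]    # 1% growth per cell
dr_bad = [0.10 * 1.50 ** i for i in range(20)]   # 50% growth per cell
print(mesh_is_second_order(dr_ok), mesh_is_second_order(dr_bad))
```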

  9. Distance Learning and Cloud Computing: "Just Another Buzzword or a Major E-Learning Breakthrough?"

    Science.gov (United States)

    Romiszowski, Alexander J.

    2012-01-01

    "Cloud computing is a model for the enabling of ubiquitous, convenient, and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and other services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." This…

  10. Computer graphics in piping structural engineering

    International Nuclear Information System (INIS)

    Revesz, Z.

    1985-01-01

    Computer graphics in piping structural engineering is gaining in popularity. The large number of systems and the growing complexity of the load cases and structure models require human assimilation of large amounts of data. An effort has been made to ease the evaluation of numerical data and to visualize as much of it as possible, thus eliminating a source of error and accelerating analysis and reporting. The product of this effort is PAID, the Piping Analysis and Interactive Design software. While developing PAID, interest has been focused on accelerating the work done mainly by PIPESTRESS. Some installed and tested capabilities of PAID are presented in this paper. Examples are given from the graphic output in report form, and the interactive dialogue needed to produce such output is demonstrated. (orig.)

  11. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Kantser, V.

    2007-01-01

    Attempts to build a computer that can operate according to quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review, after some general considerations concerning quantum information science and a set of basic requirements for any quantum computer proposal, we highlight recent developments that point the way to quantum computing on the basis of solid-state nanostructures. One of the major directions of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address semiconductor approaches based on spin-orbit coupling in semiconductor nanostructures. (authors)

  12. Making a difference: Ten case studies of DSM/IRP interactive efforts and related advocacy group activities

    Energy Technology Data Exchange (ETDEWEB)

    English, M.; Schexnayder, S.; Altman, J. [Tennessee Univ., Knoxville, TN (United States). Energy, Environment and Resources Center; Schweitzer, M. [Oak Ridge National Lab., TN (United States)

    1994-03-01

    This report discusses the activities of organizations that seek to promote integrated resource planning and aggressive, cost-effective demand-side management by utilities. The activities of such groups -- here called energy efficiency advocacy groups (EEAGs) -- are examined in ten detailed case studies. Nine of the cases involve some form of interactive effort between investor-owned electric utilities and non-utility parties to develop policies, plans, or programs cooperatively. Many, but not all, of the interactive efforts examined are formal collaboratives. In addition, all ten cases include discussion of other EEAG activities, such as coalition-building, research, participation in statewide energy planning, and intervention in regulatory proceedings.

  13. The effects of freedom of choice in action selection on perceived mental effort and the sense of agency.

    Science.gov (United States)

    Barlas, Zeynep; Hockley, William E; Obhi, Sukhvinder S

    2017-10-01

    Previous research showed that increasing the number of action alternatives enhances the sense of agency (SoA). Here, we investigated whether choice space could affect subjective judgments of mental effort experienced during action selection and examined the link between subjective effort and the SoA. Participants performed freely selected (among two, three, or four options) and instructed actions that produced pleasant or unpleasant tones. We obtained action-effect interval estimates to quantify intentional binding (the perceived compression of the interval between actions and outcomes), along with feeling of control (FoC) ratings. Additionally, participants reported the degree of mental effort they experienced during action selection. We found that both binding and FoC were systematically enhanced with increasing choice level. Outcome valence did not influence binding, while FoC was stronger for pleasant than unpleasant outcomes. Finally, freely chosen actions were associated with low subjective effort and slow responses (i.e., higher reaction times), and instructed actions were associated with high effort and fast responses. Although the conditions that yielded the greatest and least subjective effort also yielded the greatest and least binding and FoC, there was no significant correlation between subjective effort and SoA measures. Overall, our results raise interesting questions about how agency may be influenced by response selection demands (i.e., indexed by speed of responding) and subjective mental effort. Our work also highlights the importance of understanding how subjective mental effort and response speed are related to popular notions of fluency in response selection.

  14. The effect of task demand and incentive on neurophysiological and cardiovascular markers of effort.

    Science.gov (United States)

    Fairclough, Stephen H; Ewing, Kate

    2017-09-01

    According to motivational intensity theory, effort is proportional to the level of task demand provided that success is possible and successful performance is deemed worthwhile. The current study represents a simultaneous manipulation of demand (working memory load) and success importance (financial incentive) to investigate neurophysiological (EEG) and cardiovascular measures of effort. A repeated-measures study was conducted in which 18 participants performed an n-back task under three conditions of demand: easy (1-back), hard (4-back) and very hard (7-back). In addition, participants performed these tasks in the presence of a performance-contingent financial incentive or in a no-incentive (pilot trial) condition. Three bands of EEG activity were quantified: theta (4-7 Hz), lower-alpha (7.5-10 Hz) and upper-alpha (10.5-13 Hz). Fronto-medial activity in the theta band and activity in the upper-alpha band at frontal, central and parietal sites were sensitive to demand and indicated greatest effort when the task was challenging and success was possible. Mean systolic blood pressure and activity in the lower-alpha band at parietal sites were also sensitive to demand but also increased in the incentive condition across all levels of task demand. The results of the study largely support the predictions of motivational intensity theory using neurophysiological markers of effort.
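
    As an illustration of the band quantification step, the sketch below computes power in the three reported EEG bands from a synthetic one-channel signal using Welch's method; the sampling rate and signal are assumptions, not the study's data.

```python
# Minimal sketch of quantifying the EEG bands named in the abstract
# (theta 4-7 Hz, lower-alpha 7.5-10 Hz, upper-alpha 10.5-13 Hz) from a
# single channel via Welch's PSD estimate. The signal here is synthetic.
import numpy as np
from scipy.signal import welch

fs = 256                                     # sampling rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)

f, psd = welch(eeg, fs=fs, nperseg=4 * fs)
df = f[1] - f[0]

def band_power(lo, hi):
    mask = (f >= lo) & (f < hi)
    return psd[mask].sum() * df              # integrate PSD over the band

for name, (lo, hi) in {"theta": (4, 7), "lower-alpha": (7.5, 10),
                       "upper-alpha": (10.5, 13)}.items():
    print(name, band_power(lo, hi))
```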

  15. An application of interactive computer graphics technology to the design of dispersal mechanisms

    Science.gov (United States)

    Richter, B. J.; Welch, B. H.

    1977-01-01

    Interactive computer graphics technology is combined with a general purpose mechanisms computer code to study the operational behavior of three guided bomb dispersal mechanism designs. These studies illustrate the use of computer graphics techniques to discover operational anomalies, to assess the effectiveness of design improvements, to reduce the time and cost of the modeling effort, and to provide the mechanism designer with a visual understanding of the physical operation of such systems.

  16. Introduction to computer data representation

    CERN Document Server

    Fenwick, Peter

    2014-01-01

    Introduction to Computer Data Representation introduces readers to the representation of data within computers. Starting from basic principles of number representation in computers, the book covers the representation of both integer and floating point numbers, and characters or text. It comprehensively explains the main techniques of computer arithmetic and logical manipulation. The book also features chapters covering the less usual topics of basic checksums and 'universal' or variable-length representations for integers, with additional coverage of Gray codes, BCD codes and logarithmic representations.
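
    As a taste of one of those less usual topics, the snippet below converts integers to and from a reflected binary Gray code, in which consecutive values differ in a single bit; this is the standard construction, not code from the book.

```python
# Reflected binary Gray code: to_gray is the classic n XOR (n >> 1)
# trick; from_gray inverts it by XOR-ing together all right shifts.
def to_gray(n: int) -> int:
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

for i in range(8):
    g = to_gray(i)
    assert from_gray(g) == i        # round-trip check
    print(f"{i}: {g:03b}")          # successive codes differ in one bit
```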

  17. Linear-XOR and Additive Checksums Don't Protect Damgard-Merkle Hashes

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Kelsey, John

    2008-01-01

    We consider the security of Damgård-Merkle variants which compute linear-XOR or additive checksums over message blocks, intermediate hash values, or both, and process these checksums in computing the final hash value. We show that these Damgård-Merkle variants gain almost no security...
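
    For concreteness, here is a toy model of the hash class being analyzed: a Merkle-Damgård iteration that also accumulates an additive checksum of the message blocks and processes it in the final compression call. The block size, padding, and stand-in compression function are illustrative assumptions; the paper's result is that checksums of this kind add almost no security.

```python
# Toy sketch of a Merkle-Damgard hash augmented with an additive checksum
# over message blocks, folded into the final compression call. The
# compression function (truncated SHA-256) is a stand-in for illustration.
import hashlib

BLOCK, MOD = 8, 2 ** 64              # 8-byte blocks, checksum mod 2^64

def compress(h, block):
    return hashlib.sha256(h + block).digest()[:8]

def md_with_additive_checksum(msg, iv=b"\x00" * 8):
    msg = msg + b"\x00" * (-len(msg) % BLOCK)   # naive zero padding
    h, checksum = iv, 0
    for i in range(0, len(msg), BLOCK):
        block = msg[i:i + BLOCK]
        checksum = (checksum + int.from_bytes(block, "big")) % MOD
        h = compress(h, block)
    return compress(h, checksum.to_bytes(8, "big"))  # process checksum last

print(md_with_additive_checksum(b"hello world").hex())
```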

  18. Single-Step BLUP with Varying Genotyping Effort in Open-Pollinated Picea glauca

    Directory of Open Access Journals (Sweden)

    Blaise Ratcliffe

    2017-03-01

    Maximization of genetic gain in forest tree breeding programs is contingent on the accuracy of the predicted breeding values and precision of the estimated genetic parameters. We investigated the effect of the combined use of contemporary pedigree information and genomic relatedness estimates on the accuracy of predicted breeding values and precision of estimated genetic parameters, as well as rankings of selection candidates, using single-step genomic evaluation (HBLUP). In this study, two traits with diverse heritabilities [tree height (HT) and wood density (WD)] were assessed at various levels of family genotyping effort (0, 25, 50, 75, and 100%) from a population of white spruce (Picea glauca) consisting of 1694 trees from 214 open-pollinated families, representing 43 provenances in Québec, Canada. The results revealed that HBLUP bivariate analysis is effective in reducing the known bias in heritability estimates of open-pollinated populations, as it exposes hidden relatedness, potential pedigree errors, and inbreeding. The addition of genomic information in the analysis considerably improved the accuracy in breeding value estimates by accounting for both Mendelian sampling and historical coancestry that were not captured by the contemporary pedigree alone. Increasing family genotyping efforts were associated with continuous improvement in model fit, precision of genetic parameters, and breeding value accuracy. Yet, improvements were observed even at minimal genotyping effort, indicating that even modest genotyping effort is effective in improving genetic evaluation. The combined utilization of both pedigree and genomic information may be a cost-effective approach to increase the accuracy of breeding values in forest tree breeding programs where shallow pedigrees and large testing populations are the norm.

  19. Single-Step BLUP with Varying Genotyping Effort in Open-Pollinated Picea glauca.

    Science.gov (United States)

    Ratcliffe, Blaise; El-Dien, Omnia Gamal; Cappa, Eduardo P; Porth, Ilga; Klápště, Jaroslav; Chen, Charles; El-Kassaby, Yousry A

    2017-03-10

    Maximization of genetic gain in forest tree breeding programs is contingent on the accuracy of the predicted breeding values and precision of the estimated genetic parameters. We investigated the effect of the combined use of contemporary pedigree information and genomic relatedness estimates on the accuracy of predicted breeding values and precision of estimated genetic parameters, as well as rankings of selection candidates, using single-step genomic evaluation (HBLUP). In this study, two traits with diverse heritabilities [tree height (HT) and wood density (WD)] were assessed at various levels of family genotyping efforts (0, 25, 50, 75, and 100%) from a population of white spruce (Picea glauca) consisting of 1694 trees from 214 open-pollinated families, representing 43 provenances in Québec, Canada. The results revealed that HBLUP bivariate analysis is effective in reducing the known bias in heritability estimates of open-pollinated populations, as it exposes hidden relatedness, potential pedigree errors, and inbreeding. The addition of genomic information in the analysis considerably improved the accuracy in breeding value estimates by accounting for both Mendelian sampling and historical coancestry that were not captured by the contemporary pedigree alone. Increasing family genotyping efforts were associated with continuous improvement in model fit, precision of genetic parameters, and breeding value accuracy. Yet, improvements were observed even at minimal genotyping effort, indicating that even modest genotyping effort is effective in improving genetic evaluation. The combined utilization of both pedigree and genomic information may be a cost-effective approach to increase the accuracy of breeding values in forest tree breeding programs where shallow pedigrees and large testing populations are the norm.
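
    The single-step idea can be made concrete in a few lines: in the standard formulation (e.g. in the animal breeding literature), the inverse of the combined ("H") relationship matrix adds a genomic correction, G⁻¹ − A₂₂⁻¹, to the genotyped block of the pedigree inverse A⁻¹. The sketch below uses tiny hypothetical matrices, not the authors' data or software.

```python
# Numerical sketch of the single-step ("H-matrix") idea behind HBLUP:
# blend the pedigree relationship matrix A with a genomic matrix G built
# for the genotyped subset, via
#   H_inv = A_inv + [[0, 0], [0, G_inv - A22_inv]]
# The matrices below are toy values; individuals 2 and 3 are genotyped.
import numpy as np

A = np.array([[1.0, 0.0, 0.5, 0.5],
              [0.0, 1.0, 0.5, 0.5],
              [0.5, 0.5, 1.0, 0.5],
              [0.5, 0.5, 0.5, 1.0]])   # pedigree relationships (toy)
geno = [2, 3]                          # indices of genotyped individuals
G = np.array([[1.05, 0.60],
              [0.60, 1.00]])           # genomic relationships (toy)

A_inv = np.linalg.inv(A)
A22_inv = np.linalg.inv(A[np.ix_(geno, geno)])
H_inv = A_inv.copy()
H_inv[np.ix_(geno, geno)] += np.linalg.inv(G) - A22_inv
print(H_inv)
```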

  20. Computing, Information and Communications Technology (CICT) Website

    Science.gov (United States)

    Hardman, John; Tu, Eugene (Technical Monitor)

    2002-01-01

    The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology-focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

  1. Non-additive measures theory and applications

    CERN Document Server

    Narukawa, Yasuo; Sugeno, Michio; 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012)

    2014-01-01

    This book provides a comprehensive and timely report in the area of non-additive measures and integrals. It is based on a panel session on fuzzy measures, fuzzy integrals and aggregation operators held during the 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012) in Girona, Spain, November 21-23, 2012. The book complements the MDAI 2012 proceedings book, published in Lecture Notes in Computer Science (LNCS) in 2012. The individual chapters, written by key researchers in the field, cover fundamental concepts and important definitions (e.g. the Sugeno integral, the definition of entropy for non-additive measures) as well as some important applications (e.g. to economics and game theory) of non-additive measures and integrals. The book addresses students, researchers and practitioners working at the forefront of their field.
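
    As a concrete example of a construction the chapters cover, the sketch below computes the Sugeno integral of a function with respect to a discrete fuzzy (non-additive) measure; the function values and measure are hypothetical.

```python
# Sugeno integral of f with respect to a non-additive measure mu on a
# finite set: max over i of min( f(x_(i)), mu({x_(i), ..., x_(n)}) ),
# with elements sorted so that f(x_(1)) <= ... <= f(x_(n)).
def sugeno_integral(values, mu):
    """values: {element: f(element)}; mu: {frozenset: measure}, assumed
    monotone with mu(empty set) = 0 (mu need not be additive)."""
    items = sorted(values, key=values.get)       # sort by f ascending
    result = 0.0
    for i, x in enumerate(items):
        upper = frozenset(items[i:])             # {x_(i), ..., x_(n)}
        result = max(result, min(values[x], mu[upper]))
    return result

f = {"a": 0.2, "b": 0.9, "c": 0.6}
mu = {frozenset("abc"): 1.0, frozenset("bc"): 0.7,
      frozenset("b"): 0.4, frozenset(): 0.0}
print(sugeno_integral(f, mu))                    # 0.6
```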

  2. SERVIR Support to NSDI Efforts in Mesoamerica, Africa and the Himalayas

    Science.gov (United States)

    Delgado, Francisco

    2014-01-01

    SERVIR is a joint effort between NASA and USAID to build or improve capacities in developing regions to help adaptation to climate change by taking advantage of Earth observation data for decision making. The project began in 2004 in Mesoamerica, partnering with the Central American Commission for Environment and Development (CCAD), the World Bank and CATHALAC. CATHALAC, located in Panama, has served as the regional hub for Mesoamerica since 2005. Two additional regional hubs have been established (in Eastern and Southern Africa, at RCMRD, Kenya, and in the Himalayas, at ICIMOD, Nepal), and two more regional hubs are soon to be launched.

  3. Computer-aided stress analysis system for nuclear plant primary components

    International Nuclear Information System (INIS)

    Murai, Tsutomu; Tokumaru, Yoshio; Yamazaki, Junko.

    1980-06-01

    Producing the stress analysis reports for nuclear plant primary components generally requires a vast quantity of calculation. In Japan, especially, stress analysis reports must be prepared for each plant. At Mitsubishi Heavy Industries, Ltd., we have been making great efforts to rationalize the analysis process for about ten years. As a result of this rationalization, a computer-aided stress analysis system using a graphic display, graphic tablet, data file, etc. has been completed, requiring only minimal hand work. In addition, we have developed a fracture safety analysis system, and we plan to develop an input generator system for 3-dimensional FEM analysis using graphics terminals in the near future. We expect that when this input generator system is completed, it will be possible to handle any case of problem almost instantly. (author)

  4. Computing Stability Effects of Mutations in Human Superoxide Dismutase 1

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2014-01-01

    Protein stability is affected in several diseases and is of substantial interest in efforts to correlate genotypes to phenotypes. Superoxide dismutase 1 (SOD1) is a suitable test case for such correlations due to its abundance, stability, available crystal structures and thermochemical data, and physiological importance. In this work, stability changes of SOD1 mutations were computed with five methods, CUPSAT, I-Mutant2.0, I-Mutant3.0, PoPMuSiC, and SDM, with emphasis on structural sensitivity as a potential issue in structure-based protein calculation. The large correlation between experimental literature data of SOD1 dimers and monomers (r = 0.82) suggests that mutations in separate protein monomers are mostly additive. PoPMuSiC was most accurate (typical MAE ∼ 1 kcal/mol, r ∼ 0.5). The relative performance of the methods was not very structure-dependent, and the more accurate methods also...

  5. Evaluation of Advanced Polymers for Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Rios, Orlando [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carter, William G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kutchko, Cindy [PPG Industries, Pittsburgh, PA (United States); Fenn, David [PPG Industries, Pittsburgh, PA (United States); Olson, Kurt [PPG Industries, Pittsburgh, PA (United States)

    2017-09-08

    The goal of this Manufacturing Demonstration Facility (MDF) technical collaboration project between Oak Ridge National Laboratory (ORNL) and PPG Industries, Inc. (PPG) was to evaluate the feasibility of using conventional coatings chemistry and technology to build up material layer-by-layer. The PPG-ORNL study successfully demonstrated that polymeric coatings formulations may overcome many limitations of common thermoplastics used in additive manufacturing (AM), allow lightweight nozzle design for material deposition, and increase build rate. The materials effort focused on layer-by-layer deposition of coatings with each layer fusing together. The combination of materials and deposition results in an additively manufactured build that has sufficient mechanical properties to bear the load of additional layers, yet is capable of bonding across the z-layers to improve build direction strength. The formulation properties were tuned to enable a novel, high-throughput deposition method that is highly scalable, compatible with high loading of reinforcing fillers, and inherently low-cost.

  6. Effect of Active Videogames on Underserved Children's Classroom Behaviors, Effort, and Fitness.

    Science.gov (United States)

    Gao, Zan; Lee, Jung Eun; Pope, Zachary; Zhang, Dachao

    2016-09-30

    The purpose of this study was to examine the effect of active videogames (AVGs) on underserved minority children's on-task classroom behavior, academic effort, and fitness. A one-group pre- and posttest repeated measures design was used. In Fall 2013, 95 fourth grade children (57 boys, 38 girls; 96% minority) from three classes at an underserved urban elementary school participated in teacher-supervised AVG activities (e.g., Wii Sports, Xbox Just Dance). Specifically, students participated in a 50-minute weekly AVG program at school for 6 weeks. Children's academic effort was evaluated by classroom teachers using a validated scale that assessed activity, attention, conduct, and social/emotional behavior. Moreover, children's classroom behavior was observed immediately before and after each AVG session by trained researchers. Finally, cardiovascular fitness was also measured. A paired t-test was used to assess teacher-rated student effort, while a one-way (gender) analysis of variance (ANOVA) with repeated measures was performed to analyze children's on-task classroom behavior. There was a significant effect on children's effort between the first (mean = 3.24, SD = 0.75) and last week (mean = 3.41, SD = 0.73) assessments, t = 2.42, P = 0.02. In addition, there was a significant effect on classroom behavior, F = 33.103, P < 0.01. In detail, children scored significantly higher on on-task behavior during the post-AVG observation (mean = 81.4, SD = 12.3) than during the pre-AVG observation (mean = 69.8, SD = 14.9). However, no main effect was indicated for gender, F = 0.39, P = 0.54. No significant improvement in cardiovascular fitness was observed, although slight improvements were seen. Offering an AVG program at school could improve underserved minority children's classroom on-task behavior and academic effort. Future studies may include a control group to further confirm the effectiveness of AVG programs.
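
    The paired comparisons reported above are straightforward to reproduce in outline. The sketch below runs paired t-tests on simulated scores drawn to match the reported means and SDs; the simulated data are illustrative only and will not reproduce the exact test statistics.

```python
# Illustrative re-creation of the study's paired comparisons on simulated
# data (means/SDs taken from the abstract; samples drawn independently).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
effort_w1 = rng.normal(3.24, 0.75, 95)        # teacher-rated effort, week 1
effort_w6 = rng.normal(3.41, 0.73, 95)        # teacher-rated effort, week 6
t, p = stats.ttest_rel(effort_w1, effort_w6)  # paired t-test
print(f"effort: paired t = {t:.2f}, p = {p:.3f}")

ontask_pre = rng.normal(69.8, 14.9, 95)       # on-task behavior, pre-AVG
ontask_post = rng.normal(81.4, 12.3, 95)      # on-task behavior, post-AVG
t2, p2 = stats.ttest_rel(ontask_pre, ontask_post)
print(f"on-task: paired t = {t2:.2f}, p = {p2:.3f}")
```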

  7. Heuristics Made Easy: An Effort-Reduction Framework

    Science.gov (United States)

    Shah, Anuj K.; Oppenheimer, Daniel M.

    2008-01-01

    In this article, the authors propose a new framework for understanding and studying heuristics. The authors posit that heuristics primarily serve the purpose of reducing the effort associated with a task. As such, the authors propose that heuristics can be classified according to a small set of effort-reduction principles. The authors use this…

  8. Computational efficiency for the surface renewal method

    Science.gov (United States)

    Kelley, Jason; Higgins, Chad

    2018-04-01

    Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly, and these were tested for sensitivity to the length of the flux averaging period, the ability to measure over a large range of lag timescales, and overall computational efficiency. These algorithms utilize signal processing techniques and algebraic simplifications that demonstrate how simple modifications can dramatically improve computational efficiency. The results here complement efforts by other authors to standardize a robust and accurate computational SR method. The increased speed of computation grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, for applied monitoring, and in novel field deployments.
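
    A typical inner loop that such algorithms must accelerate is the evaluation of structure functions over many time lags. The sketch below shows a plain vectorized version of that computation on a synthetic 10 Hz temperature series; the function and data are generic illustrations, not the authors' algorithms.

```python
# Generic sketch of a computation the SR method repeats for many lags:
# the n-th order temperature structure function
#   S_n(r) = mean( (T(t) - T(t - r))**n )
# evaluated with numpy slicing rather than an explicit sample loop.
import numpy as np

def structure_function(x, lags, order):
    return np.array([np.mean((x[lag:] - x[:-lag]) ** order) for lag in lags])

fs = 10                                              # 10 Hz scalar data
temp = np.cumsum(np.random.randn(60 * fs)) * 0.01    # synthetic series
lags = np.arange(1, 50)                              # lags in samples
s2 = structure_function(temp, lags, 2)               # 2nd order
s3 = structure_function(temp, lags, 3)               # 3rd order
print(s2[:5], s3[:5])
```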

  9. Educational Technology and the Restructuring Movement: Lessons from Research on Computers in Classrooms.

    Science.gov (United States)

    Kell, Diane; And Others

    This paper presents findings from a recently completed study of the use of computers in primary classrooms as one source of evidence concerning the role technology can play in school restructuring efforts. The sites for the study were selected by Apple Computer, Inc. in the spring of 1988 and included 43 classrooms in 10 schools in 6 large, mostly…

  10. Consolidation of cloud computing in ATLAS

    Science.gov (United States)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  11. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used in evaluating analytical chemistry measurement quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features which are not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision); to evaluate the measurement system; and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort.

  12. Spin wave Feynman diagram vertex computation package

    Science.gov (United States)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and to have another means of checking the analytical calculations, we have devised a Feynman diagram vertex computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions for a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model, where non-collinear terms contribute to the vertex interactions.

  13. Sensitivity to mental effort and test-retest reliability of heart rate variability measures in healthy seniors.

    Science.gov (United States)

    Mukherjee, Shalini; Yadav, Rajeev; Yung, Iris; Zajdel, Daniel P; Oken, Barry S

    2011-10-01

    To determine (1) whether heart rate variability (HRV) was a sensitive and reliable measure in mental effort tasks carried out by healthy seniors and (2) whether non-linear approaches to HRV analysis, in addition to traditional time and frequency domain approaches, were useful for studying such effects. Forty healthy seniors performed two visual working memory tasks requiring different levels of mental effort while ECG was recorded. They underwent the same tasks and recordings 2 weeks later. Traditional indices and 13 non-linear indices of HRV, including Poincaré, entropy and detrended fluctuation analysis (DFA), were determined. Time domain indices, especially the mean R-R interval (RRI), frequency domain indices and, among non-linear parameters, Poincaré and DFA were the most reliable indices. Mean RRI, time domain and Poincaré indices were also the most sensitive to different mental effort task loads and had the largest effect size. Overall, linear measures were the most sensitive and reliable indices of mental effort. Among non-linear measures, Poincaré was the most reliable and sensitive, suggesting possible usefulness as an independent marker in cognitive function tasks in healthy seniors. A large number of HRV parameters were both reliable and sensitive indices of mental effort, although the simple linear methods were the most sensitive.
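
    Of the non-linear indices named above, the Poincaré descriptors are simple to compute from an R-R interval series: SD1 measures the spread perpendicular to the identity line of the lag-1 scatter plot, and SD2 the spread along it. The sketch below uses synthetic intervals, not study data.

```python
# Poincare descriptors of HRV from successive R-R intervals:
#   SD1 = std( (RR_{n+1} - RR_n) / sqrt(2) )   short-term variability
#   SD2 = std( (RR_{n+1} + RR_n) / sqrt(2) )   long-term variability
import numpy as np

rng = np.random.default_rng(0)
rr = 800 + rng.normal(0, 40, 300)          # R-R intervals in ms (toy data)

x, y = rr[:-1], rr[1:]                     # Poincare plot: RR_n vs RR_{n+1}
sd1 = np.std((y - x) / np.sqrt(2), ddof=1) # spread across identity line
sd2 = np.std((y + x) / np.sqrt(2), ddof=1) # spread along identity line
print(f"SD1 = {sd1:.1f} ms, SD2 = {sd2:.1f} ms")
```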

  14. The Effect of Age on Listening Effort

    Science.gov (United States)

    Degeest, Sofie; Keppler, Hannah; Corthals, Paul

    2015-01-01

    Purpose: The objective of this study was to investigate the effect of age on listening effort. Method: A dual-task paradigm was used to evaluate listening effort in different conditions of background noise. Sixty adults ranging in age from 20 to 77 years were included. A primary speech-recognition task and a secondary memory task were performed…

  15. Addition of visual noise boosts evoked potential-based brain-computer interface.

    Science.gov (United States)

    Xie, Jun; Xu, Guanghua; Wang, Jing; Zhang, Sicong; Zhang, Feng; Li, Yeping; Han, Chengcheng; Li, Lili

    2014-05-14

    Although noise has a proven beneficial role in brain functions, there have been few attempts to apply the stochastic resonance effect in neural engineering applications, especially in research on brain-computer interfaces (BCIs). In our study, a steady-state motion visual evoked potential (SSMVEP)-based BCI with periodic visual stimulation plus moderate spatiotemporal noise achieved better offline and online performance due to enhancement of the periodic components in brain responses, accompanied by suppression of high harmonics. Offline results behaved with a bell-shaped, resonance-like functionality, and 7-36% online performance improvements were achieved when identical visual noise was adopted for different stimulation frequencies. Using neural encoding modeling, these phenomena can be explained as noise-induced input-output synchronization in human sensory systems, which commonly possess a low-pass property. Our work demonstrates that noise could boost BCIs in addressing human needs.
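
    The stochastic resonance effect invoked here can be demonstrated with a toy threshold detector: a subthreshold periodic input produces output power at the stimulation frequency only for intermediate noise levels, giving the bell-shaped dependence the offline results showed. All parameters below are arbitrary illustrations, not the study's stimuli.

```python
# Toy stochastic resonance demo: a weak (subthreshold) sine passed
# through a hard threshold becomes detectable only with moderate noise.
import numpy as np

fs, f0, thresh = 1000, 5.0, 1.0
t = np.arange(0, 10, 1 / fs)
signal = 0.8 * np.sin(2 * np.pi * f0 * t)      # peak 0.8 < threshold 1.0

for sigma in [0.0, 0.3, 1.0, 3.0]:
    out = (signal + sigma * np.random.randn(t.size)) > thresh
    spectrum = np.abs(np.fft.rfft(out - out.mean()))
    f = np.fft.rfftfreq(t.size, 1 / fs)
    power_at_f0 = spectrum[np.argmin(np.abs(f - f0))]
    print(f"noise sigma={sigma:.1f}: output power at {f0} Hz = {power_at_f0:.1f}")
```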

  16. Computational-physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out.

  17. Enhanced computational infrastructure for data analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; McHarg, B.B.; Meyer, W.H.; Parker, C.T.

    2000-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator, since the DIII-D National Team consists of scientists from nine national laboratories, 19 foreign laboratories, 16 universities, and five industrial partnerships. As a result of this work, DIII-D data is available on a 24x7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a web-based data and code documentation system has been created to aid the novice and expert user alike.

  18. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.; McHarg, B.B.

    1999-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator, since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24x7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a web-based data and code documentation system has been created to aid the novice and expert user alike.

  19. The Influence of Air Void Content on Moisture Damage Susceptibility of Asphalt Mixtures : A Computational Study

    NARCIS (Netherlands)

    Varveri, A.; Avgerinopoulos, S.; Kasbergen, C.; Scarpas, T.; Collop, A.

    2014-01-01

    Because of the difficulties associated with the generation of finite element meshes based on X-ray computed tomography scans and with the extraordinary computational demands in performing three-dimensional (3-D) finite element analyses, past modeling efforts have focused primarily on two-dimensional

  20. Investigation of accuracy and computation time of a hierarchy of growth rate definitions

    International Nuclear Information System (INIS)

    Maudlin, P.J.; Borg, R.C.; Ott, K.O.

    1977-07-01

    A numerical illustration of the hierarchy of four logically different procedures for the calculation of the asymptotic growth of fast breeder fuel is presented. Each hierarchy level is analyzed in terms of accuracy and computational effort. Using the first procedure as reference, the fourth procedure, which incorporates the vector of isotopic breeding worths, w*, requires a minimum amount of effort with a negligible decrease in accuracy.