WorldWideScience

Sample records for additional computational effort

  1. EU grid computing effort takes on malaria

    CERN Multimedia

    Lawrence, Stacy

    2006-01-01

    Malaria is the world's most common parasitic infection, affecting more than 500 million people annually and killing more than 1 million. In order to help combat malaria, CERN has launched a grid computing effort (1 page)

  2. NASA OSMA NDE Program Additive Manufacturing Foundational Effort

    Science.gov (United States)

    Waller, Jess; Walker, James; Burke, Eric; Wells, Douglas; Nichols, Charles

    2016-01-01

    NASA is providing key leadership in an international effort linking NASA and non-NASA resources to speed adoption of additive manufacturing (AM) to meet NASA's mission goals. Participants include industry, NASA's space partners, other government agencies, standards organizations and academia. Nondestructive Evaluation (NDE) is identified as a universal need for all aspects of additive manufacturing.

  3. Efforts to transform computers reach milestone

    CERN Multimedia

    Johnson, G

    2001-01-01

    Scientists in San Jose, California, have performed the most complex calculation ever using a quantum computer - factoring the number 15. In contrast to the switches in conventional computers, which although tiny consist of billions of atoms, quantum computations are carried out by manipulating single atoms. The laws of quantum mechanics which govern these actions in fact mean that multiple computations could be done in parallel, which would drastically cut down the time needed to carry out very complex calculations.

  4. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District are computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV at a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation
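    As a rough illustration of the ratio-based overhead apportionment described above (the report itself gives no code; the project names and hour figures below are hypothetical), overhead hours can be distributed to projects in proportion to each project's share of total direct hours:

```python
# Hypothetical sketch of ratio-based overhead apportionment (not from the CAN 2 report).
direct_hours = {"Project A": 120.0, "Project B": 60.0, "Project C": 20.0}  # hours charged directly
overhead_hours = 50.0  # hours not chargeable to any single project

total_direct = sum(direct_hours.values())
apportioned = {
    project: hours + overhead_hours * (hours / total_direct)
    for project, hours in direct_hours.items()
}

for project, hours in apportioned.items():
    print(f"{project}: {hours:.1f} h (direct + apportioned overhead)")
```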

  5. Defense Additive Manufacturing: DOD Needs to Systematically Track Department-wide 3D Printing Efforts

    Science.gov (United States)

    2015-10-01

    Record excerpt (fragmentary): disjointed cover-page and body fragments from the 2015 GAO report "Defense Additive Manufacturing: DOD Needs to Systematically Track Department-wide 3D Printing Efforts"; no coherent abstract is available in this record.

  6. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part is built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  7. Developing a framework for assessing muscle effort and postures during computer work in the field: The effect of computer activities on neck/shoulder muscle effort and postures

    NARCIS (Netherlands)

    Garza, J.L.B.; Eijckelhof, B.H.W.; Johnson, P.W.; Raina, S.M.; Rynell, P.; Huysmans, M.A.; Dieën, J.H. van; Beek, A.J. van der; Blatter, B.M.; Dennerlein, J.T.

    2012-01-01

    The present study, a part of the PROOF (PRedicting Occupational biomechanics in OFfice workers) study, aimed to determine whether trapezius muscle effort was different across computer activities in a field study of computer workers, and also investigated whether head and shoulder postures were

  8. Developing a framework for assessing muscle effort and postures during computer work in the field: the effect of computer activities on neck/shoulder muscle effort and postures

    NARCIS (Netherlands)

    Bruno-Garza, J.L.; Eijckelhof, B.H.W.; Johnson, P.W.; Raina, S.M.; Rynell, P.; Huijsmans, M.A.; van Dieen, J.H.; van der Beek, A.J.; Blatter, B.M.; Dennerlein, J.T.

    2012-01-01

    The present study, a part of the PROOF (PRedicting Occupational biomechanics in OFfice workers) study, aimed to determine whether trapezius muscle effort was different across computer activities in a field study of computer workers, and also investigated whether head and shoulder postures were

  9. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    Science.gov (United States)

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

    Motivational impairments are a core feature of schizophrenia and although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials where they could choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less than controls for trials of high, but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.
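    For intuition only, the cost-benefit trade-off probed by this kind of effort-expenditure task can be caricatured as an expected-value comparison; the reward magnitudes and effort costs below are illustrative placeholders (only the 12%, 50% and 88% probabilities come from the record):

```python
# Illustrative expected-value comparison for an effort-based choice trial
# (hypothetical reward magnitudes and effort costs; not the task parameters from the study).
def expected_value(reward: float, probability: float, effort_cost: float) -> float:
    """Expected subjective value of an option: probabilistic reward minus effort cost."""
    return probability * reward - effort_cost

for p in (0.12, 0.50, 0.88):  # the three reward probabilities used in the task
    easy = expected_value(reward=1.00, probability=p, effort_cost=0.10)
    hard = expected_value(reward=4.00, probability=p, effort_cost=0.60)
    choice = "hard" if hard > easy else "easy"
    print(f"p={p:.2f}: EV(easy)={easy:.2f}, EV(hard)={hard:.2f} -> choose {choice}")
```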

  10. Enhancing school-based asthma education efforts using computer-based education for children.

    Science.gov (United States)

    Nabors, Laura A; Kockritz, Jennifer L; Ludke, Robert L; Bernstein, Jonathan A

    2012-03-01

    Schools are an important site for delivery of asthma education programs. Computer-based educational programs are a critical component of asthma education programs and may be a particularly important education method in busy school environments. The objective of this brief report is to review and critique computer-based education efforts in schools. The results of our literature review indicated that school-based computer education efforts are related to improved knowledge about asthma and its management. In some studies, improvements in clinical outcomes also occur. Data collection programs need to be built into games that improve knowledge. Many projects do not appear to last for periods greater than 1 year and little information is available about cultural relevance of these programs. Educational games and other programs are effective methods of delivering knowledge about asthma management and control. Research about the long-term effects of this increased knowledge, in regard to behavior change, is needed. Additionally, developing sustainable projects, which are culturally relevant, is a goal for future research.

  11. Department of Energy Efforts to Promote Universal Adherence to the IAEA Additional Protocol

    International Nuclear Information System (INIS)

    Killinger, Mark H.; Hansen, Linda H.; Kovacic, Don N.; VanSickle, Matthew; Apt, Kenneth E.

    2009-01-01

    Entry-into-force of the U.S. Additional Protocol (AP) in January 2009 continues to demonstrate the ongoing commitment by the United States to promote universal adherence to the AP. The AP is a critical tool for improving the International Atomic Energy Agency's (IAEA) capabilities to detect undeclared activities that indicate a clandestine nuclear weapons program. This is because States Parties are required to provide information about, and access to, nuclear fuel cycle activities beyond their traditional safeguards reporting requirements. As part of the U.S. AP Implementation Act and Senate Resolution of Ratification, the Administration is required to report annually to Congress on measures taken to achieve the adoption of the AP in non-nuclear weapon states, as well as assistance to the IAEA to promote the effective implementation of APs in those states. A key U.S. effort in this area is being managed by the International Nuclear Safeguards and Engagement Program (INSEP) of the U.S. Department of Energy (DOE). Through new and existing bilateral cooperation agreements, INSEP has initiated technical assistance projects for AP implementation with selected non-weapon states. States with which INSEP is currently cooperating include Vietnam and Thailand, with Indonesia, Algeria, Morocco, and other countries as possible future collaborators in the area of AP implementation. The INSEP collaborative model begins with a joint assessment with our partners to identify specific needs they may have regarding entering the AP into force and any impediments to successful implementation. An action plan is then developed detailing and prioritizing the necessary joint activities. Such assistance may include: advice on developing legal frameworks and regulatory documents; workshops to promote understanding of AP requirements; training to determine possible declarable activities; assistance in developing a system to collect and submit declarations; performing industry outreach to

  12. Motivational beliefs, student effort, and feedback behaviour in computer-based formative assessment

    NARCIS (Netherlands)

    Timmers, C.F.; Braber-van den Broek, J.; van den Berg, Stéphanie Martine

    2013-01-01

    Feedback can only be effective when students seek feedback and process it. This study examines the relations between students' motivational beliefs, effort invested in a computer-based formative assessment, and feedback behaviour. Feedback behaviour is represented by whether a student seeks feedback

  13. Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort

    Directory of Open Access Journals (Sweden)

    Eliana Vassena

    2017-06-01

    In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.

  14. DOD Financial Management: Additional Efforts Needed to Improve Audit Readiness of Navy Military Pay and Other Related Activities

    Science.gov (United States)

    2015-09-01

    Record excerpt (fragmentary): disjointed figure captions and cover-page fragments from the 2015 GAO report "DOD Financial Management: Additional Efforts Needed to Improve Audit Readiness of Navy Military Pay and Other Related Activities" (e.g., a management representation letter timeline for the April 2013 military payroll examination); no coherent abstract is available in this record.

  15. DARPA-funded efforts in the development of novel brain-computer interface technologies.

    Science.gov (United States)

    Miranda, Robbin A; Casebeer, William D; Hein, Amy M; Judy, Jack W; Krotkov, Eric P; Laabs, Tracy L; Manzo, Justin E; Pankratz, Kent G; Pratt, Gill A; Sanchez, Justin C; Weber, Douglas J; Wheeler, Tracey L; Ling, Geoffrey S F

    2015-04-15

    The Defense Advanced Research Projects Agency (DARPA) has funded innovative scientific research and technology developments in the field of brain-computer interfaces (BCI) since the 1970s. This review highlights some of DARPA's major advances in the field of BCI, particularly those made in recent years. Two broad categories of DARPA programs are presented with respect to the ultimate goals of supporting the nation's warfighters: (1) BCI efforts aimed at restoring neural and/or behavioral function, and (2) BCI efforts aimed at improving human training and performance. The programs discussed are synergistic and complementary to one another, and, moreover, promote interdisciplinary collaborations among researchers, engineers, and clinicians. Finally, this review includes a summary of some of the remaining challenges for the field of BCI, as well as the goals of new DARPA efforts in this domain. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  16. UMRSFFS Additional Mapping Effort

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — Recent developments in digital terrain and geospatial database management technology make it possible to protect this investment for existing and future projects to...

  17. X-ray computed tomography for additive manufacturing: a review

    International Nuclear Information System (INIS)

    Thompson, A; Maskery, I; Leach, R K

    2016-01-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM. (topical review)

  18. X-ray computed tomography for additive manufacturing: a review

    Science.gov (United States)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  19. Technology and Sexuality--What's the Connection? Addressing Youth Sexualities in Efforts to Increase Girls' Participation in Computing

    Science.gov (United States)

    Ashcraft, Catherine

    2015-01-01

    To date, girls and women are significantly underrepresented in computer science and technology. Concerns about this underrepresentation have sparked a wealth of educational efforts to promote girls' participation in computing, but these programs have demonstrated limited impact on reversing current trends. This paper argues that this is, in part,…

  20. Integrated Computational Material Engineering Technologies for Additive Manufacturing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — QuesTek Innovations, a pioneer in Integrated Computational Materials Engineering (ICME) and a Tibbetts Award recipient, is teaming with University of Pittsburgh,...

  1. Additional extensions to the NASCAP computer code, volume 3

    Science.gov (United States)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  2. On the additive splitting procedures and their computer realization

    DEFF Research Database (Denmark)

    Farago, I.; Thomsen, Per Grove; Zlatev, Z.

    2008-01-01

    Two additive splitting procedures are defined and studied in this paper. It is shown that these splitting procedures have good stability properties. Some other splitting procedures, which are traditionally used in mathematical models used in many scientific and engineering fields, are sketched. All...

  3. Human Capital: Additional Actions Needed to Enhance DOD’s Efforts to Address Mental Health Care Stigma

    Science.gov (United States)

    2016-04-01

    Record excerpt (fragmentary): disjointed survey-table fragments (response counts for servicemembers and civilians) and cover-page text from the 2016 GAO report "Human Capital: Additional Actions Needed to Enhance DOD's Efforts to Address Mental Health Care Stigma", which references the 2010 DOD Task Force on the Prevention of Suicide by Members of the Armed Forces; no coherent abstract is available in this record.

  4. Context-dependent memory decay is evidence of effort minimization in motor learning: a computational study.

    Science.gov (United States)

    Takiyama, Ken

    2015-01-01

    Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this transfer of the learning effect can be reproduced by certain theoretical frameworks. Although most theoretical frameworks have assumed that a motor memory trained with a certain movement decays at the same speed during performing the trained movement as non-trained movements, a recent study reported that the motor memory decays faster during performing the trained movement than non-trained movements, i.e., the decay rate of motor memory is movement or context dependent. Although motor learning has been successfully modeled based on an optimization framework, e.g., movement error minimization, the type of optimization that can lead to context-dependent memory decay is unclear. Thus, context-dependent memory decay raises the question of what is optimized in motor learning. To reproduce context-dependent memory decay, I extend a motor primitive framework. Specifically, I introduce motor effort optimization into the framework because some previous studies have reported the existence of effort optimization in motor learning processes and no conventional motor primitive model has yet considered the optimization. Here, I analytically and numerically revealed that context-dependent decay is a result of motor effort optimization. My analyses suggest that context-dependent decay is not merely memory decay but is evidence of motor effort optimization in motor learning.
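    The "error minimization plus memory decay" description above maps onto a standard single-state, trial-by-trial learning rule; the sketch below (with illustrative parameter values, not the paper's motor primitive model) shows how a retention factor that differs between trained and non-trained contexts produces context-dependent decay:

```python
import numpy as np

# Minimal single-state adaptation model: error minimization plus trial-by-trial
# memory decay, with a context-dependent retention factor. Parameter values are
# illustrative only; this is not the paper's motor primitive model.
def simulate(n_trials, target, a_trained, a_other, b, trained_context):
    x = 0.0                                  # current motor memory (adaptation level)
    history = np.zeros(n_trials)
    for t in range(n_trials):
        if trained_context[t]:
            error = target - x               # error-based learning on trained-movement trials
            x = a_trained * x + b * error    # faster decay (smaller retention) in the trained context
        else:
            x = a_other * x                  # slower passive decay in non-trained contexts
        history[t] = x
    return history

rng = np.random.default_rng(0)
context = rng.random(200) < 0.5              # randomly interleave trained / non-trained trials
adaptation = simulate(200, target=1.0, a_trained=0.95, a_other=0.99, b=0.2,
                      trained_context=context)
print(f"final adaptation level: {adaptation[-1]:.3f}")
```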

  5. Context-dependent memory decay is evidence of effort minimization in motor learning: A computational study

    Directory of Open Access Journals (Sweden)

    Ken eTakiyama

    2015-02-01

    Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this transfer of the learning effect can be reproduced by certain theoretical frameworks. Although most theoretical frameworks have assumed that a motor memory trained with a certain movement decays at the same speed during performing the trained movement as non-trained movements, a recent study reported that the motor memory decays faster during performing the trained movement than non-trained movements, i.e., the decay rate of motor memory is movement or context dependent. Although motor learning has been successfully modeled based on an optimization framework, e.g., movement error minimization, the type of optimization that can lead to context-dependent memory decay is unclear. Thus, context-dependent memory decay raises the question of what is optimized in motor learning. To reproduce context-dependent memory decay, I extend a motor primitive framework. Specifically, I introduce motor effort optimization into the framework because some previous studies have reported the existence of effort optimization in motor learning processes and no conventional motor primitive model has yet considered the optimization. Here, I analytically and numerically revealed that context-dependent decay is a result of motor effort optimization. My analyses suggest that context-dependent decay is not merely memory decay but is evidence of motor effort optimization in motor learning.

  6. [Effects of nurses' perception of servant leadership on leader effectiveness, satisfaction and additional effort: focused on the mediating effects of leader trust and value congruence].

    Science.gov (United States)

    Han, Sang Sook; Kim, Nam Eun

    2012-02-01

    This study was done to examine the effects of nurses' perception of servant leadership on leader effectiveness, satisfaction and promoting additional effort. The focus was the mediating effects of leader trust and value congruence. Data were collected from 361 RN-BSN students and nurses participating in nationally attended in-service training programs. Data were analyzed using descriptive statistics and structural analysis with SPSS 17.0 for Windows and Amos 7.0. Direct effects of nurses' perception of servant leadership were negative, but the mediating effects of trust and value congruency were positively correlated with leader effectiveness, satisfaction and additional effort; that is, servant leadership is effective through mediating factors. The study results indicate that if the middle managers of nurses can build leader trust and value congruency between nurses through servant leadership, leader effectiveness, satisfaction and additional effort on the part of the nurses could result in a positive change in the long term.

  7. Artificial Neural Networks for Reducing Computational Effort in Active Truncated Model Testing of Mooring Lines

    DEFF Research Database (Denmark)

    Christiansen, Niels Hørbye; Voie, Per Erlend Torbergsen; Høgsberg, Jan Becker

    2015-01-01

    simultaneously, this method is very demanding in terms of numerical efficiency and computational power. Therefore, this method has not yet proved to be feasible. It has recently been shown how a hybrid method combining classical numerical models and artificial neural networks (ANN) can provide a dramatic...... prior to the experiment and with a properly trained ANN it is no problem to obtain accurate simulations much faster than real time, without any need for large computational capacity. The present study demonstrates how this hybrid method can be applied to the active truncated experiments yielding a system...
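    As a loose illustration of the hybrid idea (a fast ANN surrogate trained offline on data from a slower numerical model, then evaluated in real time), the sketch below fits a small feed-forward network to a made-up nonlinear "mooring line response" function; the response function, sampling ranges, and network size are assumptions for illustration only:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Stand-in for an expensive numerical mooring-line model (hypothetical response function).
def numerical_model(displacement: np.ndarray) -> np.ndarray:
    return 50.0 * displacement + 200.0 * displacement**3  # nonlinear restoring force [kN]

rng = np.random.default_rng(1)
x_train = rng.uniform(-1.0, 1.0, size=(2000, 1))          # fairlead displacement samples [m]
y_train = numerical_model(x_train).ravel()

# Train the ANN surrogate once, prior to the experiment.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(x_train, y_train)

# During the test, the surrogate is evaluated much faster than the numerical model.
x_test = np.array([[0.25], [0.75]])
print(surrogate.predict(x_test))        # approximate line responses for new displacements
```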

  8. Affective medicine. A review of affective computing efforts in medical informatics.

    Science.gov (United States)

    Luneski, A; Konstantinidis, E; Bamidis, P D

    2010-01-01

    Affective computing (AC) is concerned with emotional interactions performed with and through computers. It is defined as "computing that relates to, arises from, or deliberately influences emotions". AC enables investigation and understanding of the relation between human emotions and health as well as application of assistive and useful technologies in the medical domain. Objectives: 1) to review the general state of the art in AC and its applications in medicine, and 2) to establish synergies between the research communities of AC and medical informatics. Aspects related to the human affective state as a determinant of human health are discussed, coupled with an illustration of significant AC research and related literature output. Moreover, affective communication channels are described and their range of application fields is explored through illustrative examples. The presented conferences, European research projects and research publications illustrate the recent increase of interest in the AC area by the medical community. Tele-home healthcare, AmI, ubiquitous monitoring, e-learning and virtual communities with emotionally expressive characters for elderly or impaired people are a few areas where the potential of AC has been realized and applications have emerged. A number of gaps can potentially be overcome through the synergy of AC and medical informatics. The application of AC technologies parallels the advancement of the existing state of the art and the introduction of new methods. The amount of work and projects reviewed in this paper attests to an ambitious and optimistic synergetic future for the field of affective medicine.

  9. Robotic disaster recovery efforts with ad-hoc deployable cloud computing

    Science.gov (United States)

    Straub, Jeremy; Marsh, Ronald; Mohammad, Atif F.

    2013-06-01

    Autonomous operation of search and rescue (SaR) robots is an ill-posed problem, which is complicated by the dynamic disaster recovery environment. In a typical SaR response scenario, responder robots will require different levels of processing capabilities during various parts of the response effort and will need to utilize multiple algorithms. Placing all of these capabilities onboard the robot is a suboptimal solution that precludes algorithm-specific performance optimization and results in mediocre performance. An architecture for an ad-hoc, deployable cloud environment suitable for use in a disaster response scenario is presented. Under this model, each service provider is optimized for the task and maintains a database of situation-relevant information. This service-oriented architecture (SOA 3.0) compliant framework also serves as an example of the efficient use of SOA 3.0 in an actual cloud application.

  10. Community effort endorsing multiscale modelling, multiscale data science and multiscale computing for systems medicine.

    Science.gov (United States)

    Zanin, Massimiliano; Chorbev, Ivan; Stres, Blaz; Stalidzans, Egils; Vera, Julio; Tieri, Paolo; Castiglione, Filippo; Groen, Derek; Zheng, Huiru; Baumbach, Jan; Schmid, Johannes A; Basilio, José; Klimek, Peter; Debeljak, Nataša; Rozman, Damjana; Schmidt, Harald H H W

    2017-12-05

    Systems medicine holds many promises, but has so far provided only a limited number of proofs of principle. To address this road block, possible barriers and challenges of translating systems medicine into clinical practice need to be identified and addressed. The members of the European Cooperation in Science and Technology (COST) Action CA15120 Open Multiscale Systems Medicine (OpenMultiMed) wish to engage the scientific community of systems medicine and multiscale modelling, data science and computing, to provide their feedback in a structured manner. This will result in follow-up white papers and open access resources to accelerate the clinical translation of systems medicine. © The Author 2017. Published by Oxford University Press.

  11. Computational and experimental studies of hydrodynamic instabilities and turbulent mixing (Review of VNIIEF efforts)

    International Nuclear Information System (INIS)

    Andronov, V.A.; Zhidov, I.G.; Meskov, E.E.; Nevmerzhitskii, N.V.; Nikiforov, V.V.; Razin, A.N.; Rogatchev, V.G.; Tolshmyakov, A.I.; Yanilkin, Yu.V.

    1995-02-01

    This report describes an extensive program of investigations conducted at Arzamas-16 in Russia over the past several decades. The focus of the work is on material interface instability and the mixing of two materials. Part 1 of the report discusses analytical and computational studies of hydrodynamic instabilities and turbulent mixing. The EGAK codes are described and results are illustrated for several types of unstable flow. Semiempirical turbulence transport equations are derived for the mixing of two materials, and their capabilities are illustrated for several examples. Part 2 discusses the experimental studies that have been performed to investigate instabilities and turbulent mixing. Shock-tube and jelly techniques are described in considerable detail. Results are presented for many circumstances and configurations

  12. A Methodology to Reduce the Computational Effort in the Evaluation of the Lightning Performance of Distribution Networks

    Directory of Open Access Journals (Sweden)

    Ilaria Bendato

    2016-11-01

    The estimation of the lightning performance of a power distribution network is of great importance to design its protection system against lightning. An accurate evaluation of the number of lightning events that can create dangerous overvoltages requires a huge computational effort, as it implies the adoption of a Monte Carlo procedure. Such a procedure consists of generating many different random lightning events and calculating the corresponding overvoltages. The paper proposes a methodology to deal with the problem in two computationally efficient ways: (i) finding out the minimum number of Monte Carlo runs that lead to reliable results; and (ii) setting up a procedure that bypasses the lightning field-to-line coupling problem for each Monte Carlo run. The proposed approach is shown to provide results consistent with existing approaches while exhibiting superior Central Processing Unit (CPU) time performances.
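    The "minimum number of runs" question in point (i) is typically handled by monitoring the statistical convergence of the estimate as events are generated; the sketch below illustrates that idea with an entirely made-up event model and overvoltage criterion (the log-normal peak-current distribution, simplified Rusck-like coupling formula, and withstand threshold are assumptions, not the paper's method):

```python
import numpy as np

# Illustrative Monte Carlo estimate of the fraction of lightning events that cause
# dangerous induced overvoltages, stopped once the estimate is statistically reliable.
# The event model, coupling formula, and threshold are placeholders, not the paper's method.
rng = np.random.default_rng(42)
line_height_m = 10.0        # assumed conductor height
threshold_kv = 150.0        # assumed insulation withstand level

def random_event():
    peak_current_ka = rng.lognormal(mean=np.log(31.0), sigma=0.55)  # assumed first-stroke distribution
    distance_m = rng.uniform(10.0, 500.0)                           # assumed stroke-to-line distance
    return peak_current_ka, distance_m

def induced_overvoltage_kv(peak_current_ka, distance_m):
    # Simplified Rusck-like estimate of the maximum induced voltage on an overhead line.
    return 38.8 * peak_current_ka * line_height_m / distance_m

dangerous, n = 0, 0
while n < 200_000:
    n += 1
    i_peak, d = random_event()
    if induced_overvoltage_kv(i_peak, d) > threshold_kv:
        dangerous += 1
    p = dangerous / n
    if n > 1000 and dangerous > 0:
        rel_err = np.sqrt((1.0 - p) / (n * p))   # relative standard error of the proportion
        if rel_err < 0.02:                       # "minimum number of runs" reached
            break

print(f"stopped after {n} runs; estimated dangerous-event fraction = {p:.4f}")
```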

  13. Spirometry-Assisted High Resolution Chest Computed Tomography in Children: Is it Worth the Effort?

    Science.gov (United States)

    Otjen, Jeffrey Parke; Swanson, Jonathan Ogden; Oron, Assaf; DiBlasi, Robert M; Swortzel, Tim; van Well, Jade Adriana Marie; Gommers, Eva Anna Elisabeth; Rosenfeld, Margaret

    Image quality of high resolution chest computed tomographies (HRCTs) depends on adequate breath holds at end inspiration and end expiration. We hypothesized that implementation of spirometry-assisted breath holds in children undergoing HRCTs would improve image quality over that obtained with voluntary breath holds by decreasing motion artifact and atelectasis. This is a retrospective case-control study of HRCTs obtained at a tertiary care children's hospital before and after implementation of a spirometry-assisted CT protocol, in which children ≥8 years of age are first trained in supine slow vital capacity maneuvers and then repeat the maneuvers in the CT scanner, coached by a respiratory therapist. Spirometry-assisted CT scans (cases) were matched by age, gender and diagnosis (cystic fibrosis vs other) to CT scans obtained with voluntary breath holds in the 6 years before implementation of the spirometry assistance protocol (controls), and evaluated by 2 blinded pediatric radiologists. Among both cases and controls (N = 50 each), 10 carried the diagnosis of cystic fibrosis and 40 had other diagnoses. Mean age was 12.9 years (range: 7.5-20.1) among cases and 13.0 (7.1-19.7) among controls. Mean (SD) inspiratory image density among cases was -852 (37) Hounsfield units (HU) and -828 (43) among controls (p = 0.006). Mean (SD) expiratory image density was -629 (95) HU among cases and -688 (83) HU among controls (p = 0.002). Mean (SD) change in image density between inspiratory and expiratory images was +222 (85) HU among cases and +140 (76) HU among controls (p 0.80). Atelectasis was present on inspiratory images in 8 cases and 9 controls and on expiratory images in 9 cases and 10 controls (p > 0.80). Spirometry-assisted CTs had a significantly greater difference in lung density between inspiratory and expiratory scans than those performed with voluntary breath holds, likely improving the ability to detect air trapping. No appreciable difference in image quality

  14. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    Science.gov (United States)

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  15. "But it doesn't come naturally": how effort expenditure shapes the benefit of growth mindset on women's sense of intellectual belonging in computing

    Science.gov (United States)

    Stout, Jane G.; Blaney, Jennifer M.

    2017-10-01

    Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.

  16. Computational Fluid Dynamics and Additive Manufacturing to Diagnose and Treat Cardiovascular Disease.

    Science.gov (United States)

    Randles, Amanda; Frakes, David H; Leopold, Jane A

    2017-11-01

    Noninvasive engineering models are now being used for diagnosing and planning the treatment of cardiovascular disease. Techniques in computational modeling and additive manufacturing have matured concurrently, and results from simulations can inform and enable the design and optimization of therapeutic devices and treatment strategies. The emerging synergy between large-scale simulations and 3D printing is having a two-fold benefit: first, 3D printing can be used to validate the complex simulations, and second, the flow models can be used to improve treatment planning for cardiovascular disease. In this review, we summarize and discuss recent methods and findings for leveraging advances in both additive manufacturing and patient-specific computational modeling, with an emphasis on new directions in these fields and remaining open questions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Adsorption of molecular additive onto lead halide perovskite surfaces: A computational study on Lewis base thiophene additive passivation

    Science.gov (United States)

    Zhang, Lei; Yu, Fengxi; Chen, Lihong; Li, Jingfa

    2018-06-01

    Organic additives, such as the Lewis base thiophene, have been successfully applied to passivate halide perovskite surfaces, improving the stability and properties of perovskite devices based on CH3NH3PbI3. Yet, the detailed nanostructure of the perovskite surface passivated by additives and the mechanisms of such passivation are not well understood. This study presents a nanoscopic view on the interfacial structure of an additive/perovskite interface, consisting of a Lewis base thiophene molecular additive and a lead halide perovskite surface substrate, providing insights on the mechanisms by which molecular additives can passivate the halide perovskite surfaces and enhance the perovskite-based device performance. A molecular dynamics study of the interactions between water molecules and the perovskite surfaces passivated by the investigated additive reveals the effectiveness of employing the molecular additives to improve the stability of the halide perovskite materials. The additive/perovskite surface system is further probed via molecular engineering of the perovskite surfaces. This study reveals the nanoscopic structure-property relationships of the halide perovskite surface passivated by molecular additives, which helps the fundamental understanding of the surface/interface engineering strategies for the development of halide perovskite based devices.

  18. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    Science.gov (United States)

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO
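    As background for the prediction-and-prediction-error principle these models build on (the snippet below is a generic delta-rule sketch, not the PRO or HER model itself, and all parameter values are illustrative), a region tracking expected reward and expected effort could update its predictions trial by trial as follows:

```python
# Generic prediction-error (delta-rule) updates for expected reward and expected effort.
# This illustrates the prediction / prediction-error principle only; it is not the
# PRO or HER model, and all values are hypothetical.
alpha = 0.2                      # learning rate
expected_reward, expected_effort = 0.0, 0.0

trials = [                       # (experienced reward, experienced effort) per trial
    (1.0, 0.8), (1.0, 0.7), (0.0, 0.9), (1.0, 0.8), (1.0, 0.6),
]

for reward, effort in trials:
    reward_pe = reward - expected_reward      # reward prediction error
    effort_pe = effort - expected_effort      # effort prediction error
    expected_reward += alpha * reward_pe
    expected_effort += alpha * effort_pe
    print(f"reward PE={reward_pe:+.2f}, effort PE={effort_pe:+.2f}, "
          f"E[reward]={expected_reward:.2f}, E[effort]={expected_effort:.2f}")
```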

  19. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    Directory of Open Access Journals (Sweden)

    Andreas Gansäuer

    2013-08-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically.

  20. Development of computer code SIMPSEX for simulation of FBR fuel reprocessing flowsheets: II. additional benchmarking results

    International Nuclear Information System (INIS)

    Shekhar Kumar; Koganti, S.B.

    2003-07-01

    Benchmarking and application of a computer code SIMPSEX for high plutonium FBR flowsheets was reported recently in an earlier report (IGC-234). Improvements and recompilation of the code (Version 4.01, March 2003) required re-validation with the existing benchmarks as well as additional benchmark flowsheets. Improvements in the high Pu region (Pu Aq >30 g/L) resulted in better results in the 75% Pu flowsheet benchmark. Below 30 g/L Pu Aq concentration, results were identical to those from the earlier version (SIMPSEX Version 3, code compiled in 1999). In addition, 13 published flowsheets were taken as additional benchmarks. Eleven of these flowsheets have a wide range of feed concentrations and a few of them are β-γ active runs with FBR fuels having a wide distribution of burnup and Pu ratios. A published total partitioning flowsheet using externally generated U(IV) was also simulated using SIMPSEX. SIMPSEX predictions were compared with listed predictions from conventional SEPHIS, PUMA, PUNE and PUBG. SIMPSEX results were found to be comparable to, and better than, the results from the above-listed codes. In addition, recently reported UREX demo results along with AMUSE simulations are also compared with SIMPSEX predictions. Results of benchmarking SIMPSEX with these 14 benchmark flowsheets are discussed in this report. (author)

  1. PAH growth initiated by propargyl addition: Mechanism development and computational kinetics

    KAUST Repository

    Raj, Abhijeet Dhayal

    2014-04-24

    Polycyclic aromatic hydrocarbon (PAH) growth is known to be the principal pathway to soot formation during fuel combustion, as such, a physical understanding of the PAH growth mechanism is needed to effectively assess, predict, and control soot formation in flames. Although the hydrogen abstraction C2H2 addition (HACA) mechanism is believed to be the main contributor to PAH growth, it has been shown to under-predict some of the experimental data on PAHs and soot concentrations in flames. This article presents a submechanism of PAH growth that is initiated by propargyl (C3H3) addition onto naphthalene (A2) and the naphthyl radical. C3H3 has been chosen since it is known to be a precursor of benzene in combustion and has appreciable concentrations in flames. This mechanism has been developed up to the formation of pyrene (A4), and the temperature-dependent kinetics of each elementary reaction has been determined using density functional theory (DFT) computations at the B3LYP/6-311++G(d,p) level of theory and transition state theory (TST). H-abstraction, H-addition, H-migration, β-scission, and intramolecular addition reactions have been taken into account. The energy barriers of the two main pathways (H-abstraction and H-addition) were found to be relatively small if not negative, whereas the energy barriers of the other pathways were in the range of (6-89 kcal·mol-1). The rates reported in this study may be extrapolated to larger PAH molecules that have a zigzag site similar to that in naphthalene, and the mechanism presented herein may be used as a complement to the HACA mechanism to improve prediction of PAH and soot formation. © 2014 American Chemical Society.
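    For readers unfamiliar with how TST turns a computed barrier into a temperature-dependent rate constant, the Eyring expression below shows the basic arithmetic; the barrier height and temperature are arbitrary example values, not results from the study:

```python
import math

# Eyring (transition state theory) rate constant from a free-energy barrier.
# The 15 kcal/mol barrier and 1500 K temperature are example values only.
KB = 1.380649e-23      # Boltzmann constant, J/K
H = 6.62607015e-34     # Planck constant, J*s
R = 8.314462618        # gas constant, J/(mol*K)

def tst_rate_constant(delta_g_kcal_per_mol: float, temperature_k: float) -> float:
    """Unimolecular TST rate constant k(T) = (kB*T/h) * exp(-dG'/(R*T)) in 1/s."""
    delta_g = delta_g_kcal_per_mol * 4184.0          # convert kcal/mol -> J/mol
    return (KB * temperature_k / H) * math.exp(-delta_g / (R * temperature_k))

print(f"k(1500 K) = {tst_rate_constant(15.0, 1500.0):.3e} s^-1")
```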

  2. Addition computed tomography with stable xenon; Special reference to ischemic cerebrovascular diseases

    Energy Technology Data Exchange (ETDEWEB)

    Touho, Hajime; Karasawa, Jun; Shishido, Hisashi; Yamada, Keisuke; Shibamoto, Keiji [Osaka Neurological Inst., Toyonaka (Japan)

    1990-09-01

    Stable xenon (Xe^s) is used as a contrast agent because it freely diffuses to cerebral tissues through the blood-brain barrier. In this study, 2 axial levels for Xe^s enhancement analysis were selected from a baseline series of computed tomographic (CT) scans, and 6 serial CT scans were obtained every 20 seconds for each scan level during the 240-second inhalation period of 30% Xe^s in 10 volunteer controls and in 52 patients with ischemic cerebrovascular diseases (ICVD). The serial CT scans were added and averaged in each pixel. This was used to make a new CT picture (addition CT scans). The CT scans before the Xe^s inhalation, the scan at the end of the Xe^s inhalation, and the addition CT scan were compared to see whether gray matter and ischemic areas could be differentiated from white matter. The addition CT scans could differentiate the three structures very well in both the acute and chronic stages of ICVD. This technique is thought to be a very simple and useful method to detect the small infarcted areas and low perfusion areas that cannot be visualized on precontrast CT scans. (author).
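    The per-pixel "addition CT" described above is simply an average over the serial enhanced scans; a minimal sketch (with a synthetic image stack standing in for real CT data) looks like this:

```python
import numpy as np

# Minimal sketch of building an "addition CT" image: average the serial
# xenon-enhanced scans pixel by pixel. Synthetic data stands in for real scans.
rng = np.random.default_rng(7)
serial_scans = rng.normal(loc=35.0, scale=5.0, size=(6, 512, 512))  # 6 scans of HU-like values

addition_ct = serial_scans.mean(axis=0)   # per-pixel average over the 6 serial scans

# Averaging reduces per-pixel noise by roughly sqrt(6), which helps separate
# gray matter, white matter, and ischemic areas with small density differences.
print(addition_ct.shape, addition_ct.std())
```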

  3. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    Science.gov (United States)

    Goodwin, Bruce

    2015-03-01

    This presentation discusses the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  4. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Energy Technology Data Exchange (ETDEWEB)

    King, W. E., E-mail: weking@llnl.gov [Physical and Life Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A. [Engineering Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Kamath, C. [Computation Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Rubenchik, A. M. [NIF and Photon Sciences Directorate, Lawrence Livermore National Laboratory, Livermore, California 94550 (United States)

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  5. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    International Nuclear Information System (INIS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-01-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process

  6. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    Science.gov (United States)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  7. [Analysis and evaluation of the visual effort in remote-control public traffic operators working with computer-based equipments].

    Science.gov (United States)

    Gullà, F; Zambelli, P; Bergamaschi, A; Piccoli, B

    2007-01-01

    The aim of this study is the objective evaluation, by means of electronic equipment, of the visual effort in 6 public traffic controllers (4 male, 2 female, mean age 29.6). The electronic equipment quantifies the observation distance and the observation time within each controller's occupational visual field. These parameters are obtained by emitting ultrasound at 40 kHz from an emission sensor (placed by the VDT screen) and detecting it with a receiving sensor (placed on the operator's head). Since the speed of sound in air is known and constant (about 340 m/s), the travel time of the ultrasound (US) is used to calculate the distance between the emitting and the receiving sensor. The results show that the visual acuity required is of average level, while the accommodation and convergence effort varies from average to intense (depending on the visual characteristics of the operator considered), ranging between 26.41% and 43.92% of the accommodation and convergence available to each operator. The time actually spent in "near observation within the c.v.p." (Tscr) ranged from 2 h 54 min to 4 h 05 min.
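    As a worked illustration of the time-of-flight relation the abstract describes (distance = speed of sound x one-way travel time), the following Python sketch may help; it is not the authors' acquisition software, and the function name and sample value are hypothetical.

      # Illustrative sketch of one-way ultrasonic time-of-flight ranging:
      # a 40 kHz pulse emitted at the VDT screen is picked up by a receiver
      # on the operator's head, and the known speed of sound in air converts
      # the measured travel time into an observation distance.
      SPEED_OF_SOUND_M_S = 340.0  # approximate speed of sound in air

      def observation_distance(travel_time_s):
          """Distance (m) between emitter and receiver from one-way travel time (s)."""
          return SPEED_OF_SOUND_M_S * travel_time_s

      # Example: a travel time of 1.5 ms corresponds to about 0.51 m.
      print(observation_distance(0.0015))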

  8. A comparison of the additional protocols of the five nuclear weapon states and the ensuing safeguards benefits to international nonproliferation efforts

    Energy Technology Data Exchange (ETDEWEB)

    Uribe, Eva C [Los Alamos National Laboratory; Sandoval, M Analisa [Los Alamos National Laboratory; Sandoval, Marisa N [Los Alamos National Laboratory; Boyer, Brian D [Los Alamos National Laboratory; Leitch, Rosalyn M [Los Alamos National Laboratory

    2009-01-01

    With the 6 January 2009 entry into force of the Additional Protocol by the United States of America, all five declared Nuclear Weapon States that are party to the Nonproliferation Treaty have signed, ratified, and brought into force the Additional Protocol. This paper compares the strengths and weaknesses of the five Additional Protocols in force in the five Nuclear Weapon States with respect to the benefits for international nonproliferation aims. It also documents the added safeguards burden that these Additional Protocols place on the five declared Nuclear Weapon States with respect to access to their civilian nuclear programs and the hosting of complementary access activities under the Additional Protocol.

  9. Using additive manufacturing in accuracy evaluation of reconstructions from computed tomography.

    Science.gov (United States)

    Smith, Erin J; Anstey, Joseph A; Venne, Gabriel; Ellis, Randy E

    2013-05-01

    Bone models derived from patient imaging and fabricated using additive manufacturing technology have many potential uses including surgical planning, training, and research. This study evaluated the accuracy of bone surface reconstruction of two diarthrodial joints, the hip and shoulder, from computed tomography. Image segmentation of the tomographic series was used to develop a three-dimensional virtual model, which was fabricated using fused deposition modelling. Laser scanning was used to compare cadaver bones, printed models, and intermediate segmentations. The overall bone reconstruction process had a reproducibility of 0.3 ± 0.4 mm. Production of the model had an accuracy of 0.1 ± 0.1 mm, while the segmentation had an accuracy of 0.3 ± 0.4 mm, indicating that segmentation accuracy was the key factor in reconstruction. Generally, the shape of the articular surfaces was reproduced accurately, with poorer accuracy near the periphery of the articular surfaces, particularly in regions with periosteum covering and where osteophytes were apparent.

  10. Effortful echolalia.

    Science.gov (United States)

    Hadano, K; Nakamura, H; Hamanaka, T

    1998-02-01

    We report three cases of effortful echolalia in patients with cerebral infarction. The clinical picture of the speech disturbance is associated with Type 1 Transcortical Motor Aphasia (TCMA, Goldstein, 1915). The patients always spoke nonfluently, with loss of speech initiative, dysarthria, dysprosody, agrammatism, and increased effort, and were unable to repeat sentences longer than those containing four or six words. In conversation, they first repeated a few words spoken to them, and then produced self-initiated speech. The initial repetition as well as the subsequent self-initiated speech, which were realized equally laboriously, can be regarded as mitigated echolalia (Pick, 1924). They were always aware of their own echolalia and tried to control it, without effect. These cases demonstrate that neither the ability to repeat nor fluent speech is always necessary for echolalia. The possibility that a lesion in the left medial frontal lobe, including the supplementary motor area, plays an important role in effortful echolalia is discussed.

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  12. Training auscultatory skills: computer simulated heart sounds or additional bedside training? A randomized trial on third-year medical students

    Science.gov (United States)

    2010-01-01

    Background The present study compares the value of additional use of computer-simulated heart sounds, versus conventional bedside auscultation training, on the cardiac auscultation skills of 3rd year medical students at Oslo University Medical School. Methods In addition to their usual curriculum courses, groups of seven students each were randomized to receive four hours of additional auscultation training, either employing a computer simulator system or adding more conventional bedside training. Cardiac auscultation skills were afterwards tested using live patients. Each student gave a written description of the auscultation findings in four selected patients, and was awarded 0-10 points for each patient. Differences between the two study groups were evaluated using Student's t-test. Results At the auscultation test no significant difference in mean score was found between the students who had used additional computer-based sound simulation and those who had received additional bedside training. Conclusions Students at an early stage of their cardiology training demonstrated equal performance of cardiac auscultation whether they had received an additional short auscultation course based on computer-simulated training, or had had additional bedside training. PMID:20082701

  13. Computer programs in BASIC language for graphite furnace atomic absorption using the method of additions. Part 1. Operating instructions

    International Nuclear Information System (INIS)

    Boyle, W.G. Jr.; Ryan, D.P.

    1979-01-01

    These instructions describe how to use BASIC language programs to process data from atomic absorption spectrophotometers using the graphite furnace and the method of additions calibration technique. The instructions cover loading the programs, responding to computer prompts, choosing among various options for processing the data, performing operations with an automatic sampler, and producing reports. How the programs interact with each other is also explained. Examples of computer/operator dialogue are presented for typical cases. In addition, a concise set of operating instructions is included as an appendix
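    For readers unfamiliar with the calibration technique named here, the Python sketch below shows the arithmetic of a method-of-additions (standard additions) determination: fit the measured signal against the added analyte concentration and read the unknown off the x-intercept. It is not the original BASIC code, and the measurement values are hypothetical.

      # Hedged sketch of a standard-additions calculation; values are illustrative.
      import numpy as np

      added_conc = np.array([0.0, 5.0, 10.0, 20.0])        # analyte added, ng/mL
      absorbance = np.array([0.110, 0.161, 0.210, 0.315])  # measured absorbance

      # Fit absorbance = slope * added + intercept; the unknown concentration in
      # the sample is the magnitude of the x-intercept, i.e. intercept / slope.
      slope, intercept = np.polyfit(added_conc, absorbance, 1)
      sample_conc = intercept / slope
      print(f"estimated sample concentration: {sample_conc:.1f} ng/mL")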

  14. Low-cost addition-subtraction sequences for the final exponentiation computation in pairings

    DEFF Research Database (Denmark)

    Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis

    2014-01-01

    In this paper, we address the problem of finding low cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
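    The setting described here, where a doubling step is much cheaper than a non-doubling one, is the classical motivation for signed-digit exponent recodings. The Python sketch below shows a standard non-adjacent-form (NAF) recoding and the resulting operation counts; it illustrates the general idea only, not the specific sequences constructed in the paper, and the exponent is hypothetical.

      # Compute the non-adjacent form of an exponent and count the doublings
      # (cheap) versus additions/subtractions (expensive) of a left-to-right
      # double-and-add/subtract evaluation.
      def naf(n):
          """Return the non-adjacent form of n, least-significant digit first."""
          digits = []
          while n > 0:
              if n % 2:
                  d = 2 - (n % 4)   # +1 or -1, chosen so the next digit is 0
                  n -= d
              else:
                  d = 0
              digits.append(d)
              n //= 2
          return digits

      e = 443                        # hypothetical exponent (binary 110111011)
      rep = naf(e)
      doublings = len(rep) - 1
      add_sub = sum(1 for d in rep if d != 0) - 1
      print(f"NAF digits (msb first): {rep[::-1]}")
      print(f"cost: {doublings} doublings + {add_sub} additions/subtractions")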

  15. Effort-reward imbalance and one-year change in neck-shoulder and upper extremity pain among call center computer operators.

    Science.gov (United States)

    Krause, Niklas; Burgel, Barbara; Rempel, David

    2010-01-01

    The literature on psychosocial job factors and musculoskeletal pain is inconclusive in part due to insufficient control for confounding by biomechanical factors. The aim of this study was to investigate prospectively the independent effects of effort-reward imbalance (ERI) at work on regional musculoskeletal pain of the neck and upper extremities of call center operators after controlling for (i) duration of computer use both at work and at home, (ii) ergonomic workstation design, (iii) physical activities during leisure time, and (iv) other individual worker characteristics. This was a one-year prospective study among 165 call center operators who participated in a randomized ergonomic intervention trial that has been described previously. Over an approximate four-week period, we measured ERI and 28 potential confounders via a questionnaire at baseline. Regional upper-body pain and computer use was measured by weekly surveys for up to 12 months following the implementation of ergonomic interventions. Regional pain change scores were calculated as the difference between average weekly pain scores pre- and post-intervention. A significant relationship was found between high average ERI ratios and one-year increases in right upper-extremity pain after adjustment for pre-intervention regional mean pain score, current and past physical workload, ergonomic workstation design, and anthropometric, sociodemographic, and behavioral risk factors. No significant associations were found with change in neck-shoulder or left upper-extremity pain. This study suggests that ERI predicts regional upper-extremity pain in computer operators working ≥20 hours per week. Control for physical workload and ergonomic workstation design was essential for identifying ERI as a risk factor.

  16. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    Science.gov (United States)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called "Patient Card". Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  17. Corrections and additions to CONTEMPT-LT computer codes for containment analysis

    International Nuclear Information System (INIS)

    Eerikaeinen, Lauri.

    1980-01-01

    The report presents a new version of the CONTEMPT-LT/26 containment code. The corrections and additions are applicable also to other CONTEMPT-LT versions. The thermodynamic background of the corrections is briefly described and, in addition, some essential points which should be taken into account in the analysis of a pressure suppression containment are pointed out. The results obtained with the corrected version have been compared to those calculated by the original program, and to the measured data in the Marviken containment experiment No 10. Finally, it is indicated that for reliable pressure suppression analysis a wide-ranging condensation model for air-steam mixtures is necessary. (author)

  18. Computational prediction of the fatigue behavior of additively manufactured porous metallic biomaterials

    NARCIS (Netherlands)

    Hedayati, R.; Hosseini-Toudeshky, H; Sadighi, M.; Mohammadi-Aghdam, M; Zadpoor, A.A.

    2016-01-01

    The mechanical behavior of additively manufactured porous biomaterials has recently received increasing attention. While there is a relatively large body of data available on the static mechanical properties of such biomaterials, their fatigue behavior is not yet well-understood. That is partly

  19. Addition of visual noise boosts evoked potential-based brain-computer interface.

    Science.gov (United States)

    Xie, Jun; Xu, Guanghua; Wang, Jing; Zhang, Sicong; Zhang, Feng; Li, Yeping; Han, Chengcheng; Li, Lili

    2014-05-14

    Although noise has a proven beneficial role in brain functions, there have been no attempts to exploit the stochastic resonance effect in neural engineering applications, especially in research on brain-computer interfaces (BCIs). In our study, a steady-state motion visual evoked potential (SSMVEP)-based BCI with periodic visual stimulation plus moderate spatiotemporal noise achieved better offline and online performance due to enhancement of periodic components in brain responses, which was accompanied by suppression of high harmonics. Offline results exhibited a bell-shaped, resonance-like behavior, and 7-36% online performance improvements were achieved when identical visual noise was adopted for different stimulation frequencies. Using neural encoding modeling, these phenomena can be explained as noise-induced input-output synchronization in human sensory systems, which commonly possess a low-pass property. Our work demonstrates that noise can boost BCIs in addressing human needs.

  20. Efficient method for computing the maximum-likelihood quantum state from measurements with additive Gaussian noise.

    Science.gov (United States)

    Smolin, John A; Gambetta, Jay M; Smith, Graeme

    2012-02-17

    We provide an efficient method for computing the maximum-likelihood mixed quantum state (with density matrix ρ) given a set of measurement outcomes in a complete orthonormal operator basis subject to Gaussian noise. Our method works by first changing basis, yielding a candidate density matrix μ which may have nonphysical (negative) eigenvalues, and then finding the nearest physical state under the 2-norm. Our algorithm takes at worst O(d^4) for the basis change plus O(d^3) for finding ρ, where d is the dimension of the quantum state. In the special case where the measurement basis is strings of Pauli operators, the basis change takes only O(d^3) as well. The workhorse of the algorithm is a new linear-time method for finding the closest probability distribution (in Euclidean distance) to a set of real numbers summing to one.
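    The 2-norm projection step described above amounts to truncating negative eigenvalues and redistributing the removed weight so that the trace stays one. The Python sketch below illustrates that eigenvalue-truncation idea on a hypothetical 2x2 input; it is a simplified reading of the abstract, not the authors' published implementation.

      import numpy as np

      def closest_density_matrix(mu):
          """Nearest (2-norm) unit-trace positive matrix to a Hermitian, trace-one mu."""
          vals, vecs = np.linalg.eigh(mu)   # eigenvalues in ascending order
          lam = vals[::-1].copy()           # work in descending order
          acc, i = 0.0, len(lam)
          # Zero out negative eigenvalues from the smallest up, accumulating their
          # weight, until spreading that weight over the survivors keeps them >= 0.
          while i > 0 and lam[i - 1] + acc / i < 0:
              acc += lam[i - 1]
              lam[i - 1] = 0.0
              i -= 1
          lam[:i] += acc / i
          lam = lam[::-1]                   # restore ascending order to match vecs
          return (vecs * lam) @ vecs.conj().T

      # Hypothetical noisy estimate: Hermitian and trace one, but not positive.
      mu = np.array([[0.6, 0.55], [0.55, 0.4]])
      rho = closest_density_matrix(mu)
      print(np.linalg.eigvalsh(rho), np.trace(rho))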

  1. Factor Xa generation by computational modeling: an additional discriminator to thrombin generation evaluation.

    Directory of Open Access Journals (Sweden)

    Kathleen E Brummel-Ziedins

    Full Text Available Factor Xa (fXa) is a critical enzyme in blood coagulation that is responsible for the initiation and propagation of thrombin generation. Previously we have shown that analysis of computationally generated thrombin profiles is a tool to investigate hemostasis in various populations. In this study, we evaluate the potential of computationally derived time courses of fXa generation as another approach for investigating thrombotic risk. Utilizing the case (n = 473) and control (n = 426) population from the Leiden Thrombophilia Study and each individual's plasma protein factor composition for fII, fV, fVII, fVIII, fIX, fX, antithrombin and tissue factor pathway inhibitor, tissue factor-initiated total active fXa generation was assessed using a mathematical model. FXa generation was evaluated by the area under the curve (AUC), the maximum rate (MaxR) and level (MaxL), and the times to reach these, TMaxR and TMaxL, respectively. FXa generation was analyzed in the entire populations and in defined subgroups (by sex, age, body mass index, oral contraceptive use). The maximum rates and levels of fXa generation occur over a 10- to 12-fold range in both cases and controls. This variation is larger than that observed with thrombin (3-6 fold) in the same population. The greatest risk association was obtained using either MaxR or MaxL of fXa generation, with an ∼2.2-fold increased risk for individuals exceeding the 90th percentile. This risk was similar to that of thrombin generation (MaxR OR 2.6). Grouping defined by oral contraceptive (OC) use in the control population showed the biggest differences in fXa generation: a >60% increase in MaxR upon OC use. FXa generation can distinguish between a subset of individuals characterized by overlapping thrombin generation profiles. Analysis of fXa generation is a phenotypic characteristic which may prove to be a more sensitive discriminator than thrombin generation among all individuals.
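    For readers unfamiliar with the summary metrics named above, the Python sketch below shows one way to extract AUC, MaxR, MaxL, TMaxR and TMaxL from a generation time course; the curve used here is a synthetic stand-in, not output of the authors' coagulation model.

      import numpy as np

      # Hypothetical fXa generation time course (time in s, concentration in nM),
      # used only to illustrate how the summary metrics are computed.
      t = np.linspace(0.0, 1200.0, 601)
      fxa = 1.5 * np.exp(-((t - 400.0) / 150.0) ** 2)

      auc = np.trapz(fxa, t)                         # AUC, nM*s
      rate = np.gradient(fxa, t)                     # instantaneous rate, nM/s
      max_r, t_max_r = rate.max(), t[rate.argmax()]  # MaxR and TMaxR
      max_l, t_max_l = fxa.max(), t[fxa.argmax()]    # MaxL and TMaxL

      print(f"AUC={auc:.0f} nM*s, MaxR={max_r:.4f} nM/s at {t_max_r:.0f} s, "
            f"MaxL={max_l:.2f} nM at {t_max_l:.0f} s")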

  2. Calculating additional shielding requirements in diagnostics X-ray departments by computer

    International Nuclear Information System (INIS)

    Rahimi, A.

    2004-01-01

    This report provides an extension of an existing method for the calculation of the barrier thickness required to reduce the three types of radiation exposure emitted from the source, the primary, secondary and leakage radiation, to a specified weekly design limit (MPD). Since each of these three types of radiation are of different beam quality, having different shielding requirements, NCRP 49 has provided means to calculate the necessary protective barrier thickness for each type of radiation individually. Additionally, barrier requirements specified using the techniques stated at NCRP 49, show enormous variations among users. Part of the variations is due to different assumptions made regarding the use of the examined room and the characteristics of adjoining space. Many of the differences result from the difficulty of accurately relating information from the calculations to graphs and tables involved in the calculation process specified by this report. Moreover, the latest technological developments such as mammography are not addressed and attenuation data for three-phase generators, that are most widely used today, is not provided. The design of shielding barriers in diagnostic X-ray departments generally follow the ALARA principle. That means that, in practice, the exposure levels are kept 'as low as reasonably achievable', taking into account economical and technical factors. Additionally, the calculation of barrier requirements includes many uncertainties (e.g. the workload, the actual kVp used etc.). (author)
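    As a rough illustration of the kind of barrier calculation such a program automates, the sketch below evaluates the standard point-source relation for the required barrier transmission, B = P * d^2 / (W * U * T), and converts it to a thickness via tenth-value layers. All numbers, and the single-TVL simplification, are hypothetical; NCRP 49 itself works from measured attenuation data for each radiation quality and type.

      import math

      # Hypothetical design inputs (not from the report):
      P = 0.1e-3   # weekly design limit behind the barrier (dose units per week)
      d = 3.0      # source-to-occupied-area distance, m
      W = 1000.0   # workload per week (same dose units at 1 m)
      U = 1.0      # use factor for this barrier
      T = 1.0      # occupancy factor of the adjoining space

      B = P * d**2 / (W * U * T)       # required barrier transmission factor
      TVL_MM = 0.9                     # illustrative tenth-value layer, mm of lead
      n_tvl = -math.log10(B)           # number of tenth-value layers needed
      print(f"B = {B:.2e}, about {n_tvl:.1f} TVLs, about {n_tvl * TVL_MM:.1f} mm Pb")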

  3. Design and fabrication of a sleep apnea device using computer-aided design/additive manufacture technologies.

    Science.gov (United States)

    Al Mortadi, Noor; Eggbeer, Dominic; Lewis, Jeffrey; Williams, Robert J

    2013-04-01

    The aim of this study was to analyze the latest innovations in additive manufacture techniques and uniquely apply them to dentistry, to build a sleep apnea device requiring rotating hinges. Laser scanning was used to capture the three-dimensional topography of an upper and lower dental cast. The data sets were imported into an appropriate computer-aided design software environment, which was used to design a sleep apnea device. This design was then exported as a stereolithography file and transferred for three-dimensional printing by an additive manufacture machine. The results not only revealed that the novel computer-based technique presented provides new design opportunities but also highlighted limitations that must be addressed before the techniques can become clinically viable.

  4. Computer programs in BASIC language for graphite furnace atomic absorption using the method of additions. Part 2. Documentation

    International Nuclear Information System (INIS)

    Boyle, W.G. Jr.; Ryan, D.P.

    1979-08-01

    There are four computer programs, written in the BASIC language, used for taking and processing data from an atomic absorption spectrophotometer using the graphite furnace and the method of additions for calibration. The programs chain to each other and are divided into logical sections that have been flow-charted. The chaining sequences, general features, structure, order of subroutines and functions, and the storage of data are discussed. In addition, variables are listed and defined, and a complete listing of each program with a symbol occurrence table is provided

  5. Computations on the primary photoreaction of Br2 with CO2: stepwise vs concerted addition of Br atoms.

    Science.gov (United States)

    Xu, Kewei; Korter, Timothy M; Braiman, Mark S

    2015-04-09

    It was proposed previously that Br2-sensitized photolysis of liquid CO2 proceeds through a metastable primary photoproduct, CO2Br2. Possible mechanisms for such a photoreaction are explored here computationally. First, it is shown that the CO2Br radical is not stable in any geometry. This rules out a free-radical mechanism, for example, photochemical splitting of Br2 followed by stepwise addition of Br atoms to CO2, which in turn accounts for the lack of previously observed Br2 + CO2 photochemistry in gas phases. A possible alternative mechanism in the liquid phase is formation of a weakly bound CO2:Br2 complex, followed by concerted photoaddition of Br2. This hypothesis is suggested by the previously published spectroscopic detection of a binary CO2:Br2 complex in the supersonically cooled gas phase. We compute a global binding-energy minimum of -6.2 kJ mol⁻¹ for such complexes, in a linear geometry. Two additional local minima were computed for perpendicular (C2v) and nearly parallel asymmetric planar geometries, both with binding energies near -5.4 kJ mol⁻¹. In these two latter geometries, C-Br and O-Br bond distances are simultaneously in the range of 3.5-3.8 Å, that is, perhaps suitable for a concerted photoaddition under the temperature and pressure conditions where Br2 + CO2 photochemistry has been observed.

  6. The Effort Paradox: Effort Is Both Costly and Valued.

    Science.gov (United States)

    Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y

    2018-04-01

    According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Designing and manufacturing an auricular prosthesis using computed tomography, 3-dimensional photographic imaging, and additive manufacturing: a clinical report.

    Science.gov (United States)

    Liacouras, Peter; Garnes, Jonathan; Roman, Norberto; Petrich, Anton; Grant, Gerald T

    2011-02-01

    The method of fabricating an auricular prosthesis by digitally positioning a mirror image of the soft tissue, then designing and using rapid prototyping to produce the mold, can reduce the steps and time needed to create a prosthesis by the traditional approach of sculpting either wax or clay. The purpose of this clinical report is to illustrate how the use of 3-dimensional (3-D) photography, computer technology, and additive manufacturing can extensively reduce many of the preliminary procedures currently used to create an auricular prosthesis. Copyright © 2011 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  8. Computer aided analysis of additional chromosome aberrations in Philadelphia chromosome positive acute lymphoblastic leukaemia using a simplified computer readable cytogenetic notation

    Directory of Open Access Journals (Sweden)

    Mohr Brigitte

    2003-01-01

    Full Text Available Abstract Background The analysis of complex cytogenetic databases of distinct leukaemia entities may help to detect rare recurring chromosome aberrations, minimal common regions of gains and losses, and also hot spots of genomic rearrangements. The patterns of the karyotype alterations may provide insights into the genetic pathways of disease progression. Results We developed a simplified computer readable cytogenetic notation (SCCN) by which chromosome findings are normalised at a resolution of 400 bands. Lost or gained chromosomes or chromosome segments are specified in detail, and ranges of chromosome breakpoint assignments are recorded. Software modules were written to summarise the recorded chromosome changes with regard to the respective chromosome involvement. To assess the degree of karyotype alterations the ploidy levels and numbers of numerical and structural changes were recorded separately, and summarised in a complex karyotype aberration score (CKAS). The SCCN and CKAS were used to analyse the extent and the spectrum of additional chromosome aberrations in 94 patients with Philadelphia chromosome positive (Ph-positive) acute lymphoblastic leukaemia (ALL) and secondary chromosome anomalies. Dosage changes of chromosomal material represented 92.1% of all additional events. Recurring regions of chromosome losses were identified. Structural rearrangements affecting (peri)centromeric chromosome regions were recorded in 24.6% of the cases. Conclusions SCCN and CKAS provide unifying elements between karyotypes and computer processable data formats. They proved to be useful in the investigation of additional chromosome aberrations in Ph-positive ALL, and may represent a step towards full automation of the analysis of large and complex karyotype databases.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  11. The Effort to Reduce a Muscle Fatigue Through Gymnastics Relaxation and Ergonomic Approach for Computer Users in Central Building State University of Medan

    Science.gov (United States)

    Gultom, Syamsul; Darma Sitepu, Indra; Hasibuan, Nurman

    2018-03-01

    Fatigue due to long and continuous computer usage can lead to problems associated with decreased performance and work motivation. Specific targets in the first phase of this research have been achieved, namely: (1) identifying complaints of workers using computers, with the Bourdon Wiersma test kit, and (2) finding a suitable relaxation and work-posture design to reduce muscle fatigue in computer-based workers. The study follows a research and development method, which aims to produce new products or refine existing ones. The final products are a prototype back-holder, a monitor filter, a relaxation exercise routine, and a manual explaining how to perform the exercise in front of the computer to lower the fatigue level of computer users in Unimed's Administration Center. In the first phase, observations and interviews were conducted to identify the level of fatigue of the computer-using employees at Unimed's Administration Center with the Bourdon Wiersma test, with the following results: (1) the average speed of respondents in BAUK, BAAK and BAPSI after working, with an interpretation value of 8.4 and WS 13, was in a good enough category; (2) the average accuracy of respondents in BAUK, BAAK and BAPSI after working, with an interpretation value of 5.5 and WS 8, was in the doubtful category, showing that computer users at the Unimed Administration Center experienced significant tiredness; (3) the consistency of the fatigue measurements of computer users in Unimed's Administration Center after working, with an interpretation value of 5.5 and WS 8, was also in the doubtful category, which means that computer users in the Unimed Administration Center suffered extreme fatigue. In phase II, based on the results of the first phase of this research, the researcher offers

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  15. Age-related incidence of pulmonary embolism and additional pathologic findings detected by computed tomography pulmonary angiography

    Energy Technology Data Exchange (ETDEWEB)

    Groth, M., E-mail: groth.michael@googlemail.com [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Henes, F.O., E-mail: f.henes@uke.uni-hamburg.de [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Mayer, U., E-mail: mayer@uke.uni-hamburg.de [Emergency Department, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Regier, M., E-mail: m.regier@uke.uni-hamburg.de [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Adam, G., E-mail: g.adam@uke.uni-hamburg.de [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Begemann, P.G.C., E-mail: p.begemann@me.com [Center for Radiology and Endoscopy, Department of Diagnostic and Interventional Radiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany)

    2012-08-15

    Objective: To compare the incidence of pulmonary embolism (PE) and additional pathologic findings (APF) detected by computed tomography pulmonary angiography (CTPA) according to different age-groups. Materials and methods: 1353 consecutive CTPA cases for suspected PE were retrospectively reviewed. Patients were divided into seven age groups: ≤29, 30-39, 40-49, 50-59, 60-69, 70-79 and ≥80 years. Differences between the groups were tested using Fisher's exact or chi-square test. A p-value < 0.0024 indicated statistical significance when Bonferroni correction was used. Results: Incidence rates of PE ranged from 11.4% to 25.4% in different age groups. The three main APF were pleural effusion, pneumonia and pulmonary nodules. No significant difference was found between the incidences of PE in different age groups. Furthermore, APF in different age groups revealed no significant differences (all p-values > 0.0024). Conclusion: The incidences of PE and APF detected by CTPA reveal no significant differences between various age groups.
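    The 0.0024 threshold quoted above follows from a Bonferroni correction over the C(7, 2) = 21 pairwise comparisons among the seven age groups (0.05 / 21 is roughly 0.0024). The Python sketch below reproduces that arithmetic and runs one illustrative pairwise Fisher's exact test; the 2x2 counts are hypothetical, not data from the study.

      from math import comb
      from scipy.stats import fisher_exact

      n_groups = 7
      n_comparisons = comb(n_groups, 2)           # 21 pairwise comparisons
      alpha_corrected = 0.05 / n_comparisons      # ~0.0024, as in the abstract
      print(f"{n_comparisons} comparisons, corrected alpha = {alpha_corrected:.4f}")

      # Hypothetical PE-positive / PE-negative counts for two age groups.
      table = [[20, 140],
               [25, 130]]
      odds_ratio, p_value = fisher_exact(table)
      print(f"Fisher exact p = {p_value:.3f} (significant only if < {alpha_corrected:.4f})")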

  16. Developing computer systems to support emergency operations: Standardization efforts by the Department of Energy and implementation at the DOE Savannah River Site

    International Nuclear Information System (INIS)

    DeBusk, R.E.; Fulton, G.J.; O'Dell, J.J.

    1990-01-01

    This paper describes the development of standards for emergency operations computer systems for the US Department of Energy (DOE). The proposed DOE computer standards prescribe the necessary power and simplicity to meet the expanding needs of emergency managers. Standards include networked UNIX workstations based on the client-server model and software that presents information graphically using icons and windowing technology. The DOE standards are based on those of the computer industry, although the proposed standards implement the latest technology to ensure a solid base for future growth. A case study of how these proposed standards are being implemented is also presented. The Savannah River Site (SRS), a DOE facility near Aiken, South Carolina, is automating a manual information system proven over years of development. This system is generalized as a model that can apply to most, if not all, Emergency Operations Centers. This model can provide timely and validated information to emergency managers. By automating this proven system, the system is made easier to use. As experience in the case study demonstrates, computers are only an effective information tool when used as part of a proven process

  17. Literality and Cognitive Effort

    DEFF Research Database (Denmark)

    Lacruz, Isabel; Carl, Michael; Yamada, Masaru

    2018-01-01

    We introduce a notion of pause-word ratio computed using ranges of pause lengths rather than lower cutoffs for pause lengths. Standard pause-word ratios are indicators of cognitive effort during different translation modalities.The pause range version allows for the study of how different types...... remoteness. We use data from the CRITT TPR database, comparing translation and post-editing from English to Japanese and from English to Spanish, and study the interaction of pause-word ratio for short pauses ranging between 300 and 500ms with syntactic remoteness, measured by the CrossS feature, semantic...... remoteness, measured by HTra, and syntactic and semantic remoteness, measured by Literality....
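    As an illustration of the range-based pause-word ratio introduced above, the Python sketch below counts only pauses whose duration falls inside a given window (here 300-500 ms) and divides by the number of produced words; the pause durations and word count are hypothetical, and the function is not part of the CRITT TPR database tooling.

      def pause_word_ratio(pauses_ms, n_words, low=300, high=500):
          """Pauses with low <= duration < high (ms) per produced word."""
          in_range = sum(1 for p in pauses_ms if low <= p < high)
          return in_range / n_words

      # Hypothetical inter-keystroke pause durations (ms) for a short segment.
      pauses = [120, 340, 80, 450, 990, 310, 60, 2400, 505]
      print(pause_word_ratio(pauses, n_words=5))  # -> 0.6 with these data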

  18. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  20. Ab initio quasi-particle approximation bandgaps of silicon nanowires calculated at density functional theory/local density approximation computational effort

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, M., E-mail: ribeiro.jr@oorbit.com.br [Office of Operational Research for Business Intelligence and Technology, Principal Office, Buffalo, Wyoming 82834 (United States)

    2015-06-21

    Ab initio calculations of hydrogen-passivated Si nanowires were performed using density functional theory within LDA-1/2, to account for the excited states properties. A range of diameters was calculated to draw conclusions about the ability of the method to correctly describe the main trends of bandgap, quantum confinement, and self-energy corrections versus the diameter of the nanowire. Bandgaps are predicted with excellent accuracy if compared with other theoretical results like GW, and with the experiment as well, but with a low computational cost.

  1. Ab initio quasi-particle approximation bandgaps of silicon nanowires calculated at density functional theory/local density approximation computational effort

    International Nuclear Information System (INIS)

    Ribeiro, M.

    2015-01-01

    Ab initio calculations of hydrogen-passivated Si nanowires were performed using density functional theory within LDA-1/2, to account for the excited states properties. A range of diameters was calculated to draw conclusions about the ability of the method to correctly describe the main trends of bandgap, quantum confinement, and self-energy corrections versus the diameter of the nanowire. Bandgaps are predicted with excellent accuracy if compared with other theoretical results like GW, and with the experiment as well, but with a low computational cost

  2. A synergistic effort among geoscience, physics, computer science and mathematics at Hunter College of CUNY as a Catalyst for educating Earth scientists.

    Science.gov (United States)

    Salmun, H.; Buonaiuto, F. S.

    2016-12-01

    The Catalyst Scholarship Program at Hunter College of The City University of New York (CUNY) was established with a four-year award from the National Science Foundation (NSF) to fund scholarships for academically talented but financially disadvantaged students majoring in four disciplines of science, technology, engineering and mathematics (STEM). Led by Earth scientists the Program awarded scholarships to students in their junior or senior years majoring in computer science, geosciences, mathematics and physics to create two cohorts of students that spent a total of four semesters in an interdisciplinary community. The program included mentoring of undergraduate students by faculty and graduate students (peer-mentoring), a sequence of three semesters of a one-credit seminar course and opportunities to engage in research activities, research seminars and other enriching academic experiences. Faculty and peer-mentoring were integrated into all parts of the scholarship activities. The one-credit seminar course, although designed to expose scholars to the diversity STEM disciplines and to highlight research options and careers in these disciplines, was thematically focused on geoscience, specifically on ocean and atmospheric science. The program resulted in increased retention rates relative to institutional averages. In this presentation we will discuss the process of establishing the program, from the original plans to its implementation, as well as the impact of this multidisciplinary approach to geoscience education at our institution and beyond. An overview of accomplishments, lessons learned and potential for best practices will be presented.

  3. Development of additional module to neutron-physic and thermal-hydraulic computer codes for coolant acoustical characteristics calculation

    Energy Technology Data Exchange (ETDEWEB)

    Proskuryakov, K.N.; Bogomazov, D.N.; Poliakov, N. [Moscow Power Engineering Institute (Technical University), Moscow (Russian Federation)

    2007-07-01

    A new special module for coolant acoustical characteristics calculation, to be used with neutron-physics and thermal-hydraulic computer codes, has been worked out. The Russian computer code Rainbow has been selected for joint use with the developed module. This code system provides the possibility of EFOCP (Eigen Frequencies of Oscillations of the Coolant Pressure) calculations in any coolant acoustical elements of the primary circuits of NPPs. EFOCP values have been calculated for transient and for stationary operation. The calculated results for nominal operation were compared with measured EFOCP values. For example, this comparison was provided for the system 'pressurizer + surge line' of a WWER-1000 reactor. The calculated result 0.58 Hz practically coincides with the result of measurement (0.6 Hz). The EFOCP variations in transients are also shown. The presented results are intended to be useful for NPP vibration-acoustical certification. There are no serious difficulties in using this module with other computer codes.

  4. Development of additional module to neutron-physic and thermal-hydraulic computer codes for coolant acoustical characteristics calculation

    International Nuclear Information System (INIS)

    Proskuryakov, K.N.; Bogomazov, D.N.; Poliakov, N.

    2007-01-01

    A new special module for coolant acoustical characteristics calculation, to be used with neutron-physics and thermal-hydraulic computer codes, has been worked out. The Russian computer code Rainbow has been selected for joint use with the developed module. This code system provides the possibility of EFOCP (Eigen Frequencies of Oscillations of the Coolant Pressure) calculations in any coolant acoustical elements of the primary circuits of NPPs. EFOCP values have been calculated for transient and for stationary operation. The calculated results for nominal operation were compared with measured EFOCP values. For example, this comparison was provided for the system 'pressurizer + surge line' of a WWER-1000 reactor. The calculated result 0.58 Hz practically coincides with the result of measurement (0.6 Hz). The EFOCP variations in transients are also shown. The presented results are intended to be useful for NPP vibration-acoustical certification. There are no serious difficulties in using this module with other computer codes

  5. Effort Estimation in BPMS Migration

    OpenAIRE

    Drews, Christopher; Lantow, Birger

    2018-01-01

    Usually Business Process Management Systems (BPMS) are highly integrated in the IT of organizations and are at the core of their business. Thus, migrating from one BPMS solution to another is not a common task. However, there are forces that are pushing organizations to perform this step, e.g. maintenance costs of legacy BPMS or the need for additional functionality. Before the actual migration, the risk and the effort must be evaluated. This work provides a framework for effort estimation re...

  6. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  7. On some additional recollections, and the absence thereof, about the early history of computer simulations in statistical mechanics

    International Nuclear Information System (INIS)

    Wood, W.W.

    1995-01-01

    This lecture is an extension and correction of a previous lecture given by the author ten years ago at ''Corso 97'' in Varenna. Here again he emphasizes that his early work was exclusively with applications of the Metropolis Monte Carlo method. His only connection with the early work on the molecular dynamics method was in collaboration with Alder and Wainwright in their joint effort to reconcile the results of the Monte Carlo and molecular dynamics methods for hard spheres. Here he amplifies a point suggested by a question asked by Professor Ciccotti: Namely, when was it discovered that the Metropolis method consists in the generation of a realization of a Markov chain, for which there was a large body of mathematical theory that made the justification of the method quite a simple matter?

  8. Prescribed computer games in addition to occlusion versus standard occlusion treatment for childhood amblyopia: a pilot randomised controlled trial.

    Science.gov (United States)

    Tailor, Vijay K; Glaze, Selina; Khandelwal, Payal; Davis, Alison; Adams, Gillian G W; Xing, Wen; Bunce, Catey; Dahlmann-Noor, Annegret

    2015-01-01

    Amblyopia ("lazy eye") is the commonest vision deficit in children. If not fully corrected by glasses, amblyopia is treated by patching or blurring the better-seeing eye. Compliance with patching is often poor. Computer-based activities are increasingly topical, both as an adjunct to standard treatment and as a platform for novel treatments. Acceptability by families has not been explored, and feasibility of a randomised controlled trial (RCT) using computer games in terms of recruitment and treatment acceptability is uncertain. We carried out a pilot RCT to test whether computer-based activities are acceptable and accessible to families and to test trial methods such as recruitment and retention rates, randomisation, trial-specific data collection tools and analysis. The trial had three arms: standard near activity advice, Eye Five, a package developed for children with amblyopia, and an off-the-shelf handheld games console with pre-installed games. We enrolled 60 children age 3-8 years with moderate or severe amblyopia after completion of optical treatment. This trial was registered as UKCRN-ID 11074. Pre-screening of 3600 medical notes identified 189 potentially eligible children, of whom 60 remained eligible after optical treatment, and were enrolled between April 2012 and March 2013. One participant was randomised twice and withdrawn from the study. Of the 58 remaining, 37 were boys. The mean (SD) age was 4.6 (1.7) years. Thirty-seven had moderate and 21 severe amblyopia. Three participants were withdrawn at week 6, and in total, four were lost to follow-up at week 12. Most children and parents/carers found the study procedures, i.e. occlusion treatment, usage of the allocated near activity and completion of a study diary, easy. The prescribed cumulative dose of near activity was 84 h at 12 weeks. Reported near activity usage numbers were close to prescribed numbers in moderate amblyopes (94 % of prescribed) but markedly less in severe amblyopes (64

  9. Polylactides in additive biomanufacturing.

    Science.gov (United States)

    Poh, Patrina S P; Chhaya, Mohit P; Wunner, Felix M; De-Juan-Pardo, Elena M; Schilling, Arndt F; Schantz, Jan-Thorsten; van Griensven, Martijn; Hutmacher, Dietmar W

    2016-12-15

    New advanced manufacturing technologies under the alias of additive biomanufacturing allow the design and fabrication of a range of products from pre-operative models, cutting guides and medical devices to scaffolds. The process of printing in 3 dimensions of cells, extracellular matrix (ECM) and biomaterials (bioinks, powders, etc.) to generate in vitro and/or in vivo tissue analogue structures has been termed bioprinting. To further advance in additive biomanufacturing, there are many aspects that we can learn from the wider additive manufacturing (AM) industry, which have progressed tremendously since its introduction into the manufacturing sector. First, this review gives an overview of additive manufacturing and both industry and academia efforts in addressing specific challenges in the AM technologies to drive toward AM-enabled industrial revolution. After which, considerations of poly(lactides) as a biomaterial in additive biomanufacturing are discussed. Challenges in wider additive biomanufacturing field are discussed in terms of (a) biomaterials; (b) computer-aided design, engineering and manufacturing; (c) AM and additive biomanufacturing printers hardware; and (d) system integration. Finally, the outlook for additive biomanufacturing was discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Comparability of the performance of in-line computer vision for geometrical verification of parts, produced by Additive Manufacturing

    DEFF Research Database (Denmark)

    Pedersen, David B.; Hansen, Hans N.

    2014-01-01

    The field of Additive Manufacturing is growing at an accelerated rate, as prototyping is set aside in favor of direct manufacturing of components for industry and consumers. A consequence of mass customization and component complexity is an adverse geometrical verification challenge. Mass...

  11. Improved operating scenarios of the DIII-D tokamak as a result of the addition of UNIX computer systems

    International Nuclear Information System (INIS)

    Henline, P.A.

    1995-10-01

    The increased use of UNIX-based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper will describe some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements, and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems has dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control, due to the more thorough MHD calculations done between shots, and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analyses that engender improved operating abilities will be described.

  12. Experimental and computational approaches to evaluate the environmental mitigation effect in narrow spaces by noble metal chemical addition (NMCA)

    International Nuclear Information System (INIS)

    Shimizu, Ryosuke; Ota, Nobuyuki; Nagase, Makoto; Aizawa, Motohiro; Ishida, Kazushige; Wada, Yoichi

    2014-01-01

    The environmental mitigation effect of NMCA in a narrow space was evaluated by experimental and computational approaches. In the experiment, at 8 MPa and 553 K, a T-tube whose branched line had a narrow space was prepared, and Zr electrodes were set in the branched line at certain intervals: 1, 3, 5, 7, 9, 11, 15 and 29 cm from the opening section of the branched line. The electrochemical corrosion potential (ECP) at the tip of the branched narrow space varied in response to the water chemistry in the main line, which was at a right angle to the branched line. Computational fluid dynamics (CFD) analysis reproduced the experimental results. It was also confirmed by CFD analysis that the ingress of water from the main line into the narrow space was accelerated by cavity flow and thermal convection. In a CFD analysis of a thermal sleeve under actual plant conditions, which had a narrow space, the concentration of dissolved oxygen at the tip of the thermal sleeve reached 250 ppb within 300 s, the same concentration as in the main line. Noble metal deposition on the surface of the thermal sleeve was evaluated by a mass-transfer model. Noble metal deposition was largest near the opening section of the branched line and gradually decreased toward the tip section. In light of the consumption of dissolved oxygen in the branched line, noble metal deposition in the thermal sleeve was sufficient to reduce the ECP. It was expected that NMCA could mitigate the corrosion environment in the thermal sleeve. (author)

  13. Computer modeling of inhibition of α-radiolysis of water by H2 addition (NPC 2012 conference)

    International Nuclear Information System (INIS)

    Lertnaisat, Phantira; Katsumura, Yosuke; Mukai, Satoru; Umehara, Ryuji; Shimizu, Yuichi; Suzuki, Masaru

    2012-09-01

    It is known that α-radiolysis of water produces H2 gas continuously. The addition of H2 to water inhibits water decomposition, i.e. H2 evolution. In order to suppress water decomposition, 25 cc H2 STP/kg-H2O is added to the coolant water in PWRs. However, the exact inhibition mechanism has not yet been made clear. In this project, the chemical kinetic simulation program FACSIMILE was used to reproduce the suppression of α-radiolysis of water by H2 addition. Using three important factors, namely the decomposition yields (G-values), the reaction set and rate constants, and the dose rate, it is found that without hydrogen addition the simulation shows an almost linear increase of the molecular products H2, H2O2 and O2. As additional hydrogen is added to the system, this linear increase is shifted to longer time periods. Above a certain concentration, the linear increase is completely suppressed and the molecular products reach a steady state at early times and at much lower concentrations. The minimum concentration of H2 which completely suppresses the decomposition of water is called the Critical Hydrogen Concentration (CHC) and is a dose-rate-dependent value. The CHC is also found to depend on the reaction set and rate constants: the simulated CHC at room temperature and a dose rate of 1 kGy/s, using the reaction sets and rate constants from Ershov et al. and from the AECL report of 2009, is 165 μM and 146 μM, respectively. From the change in the behaviour of the molecular products after reaching the CHC, a possible mechanism is proposed. First, OH radicals are formed via the reactions H + H2O2 → OH + H2O and e-(aq) + H2O2 → OH + OH-. Then OH, which would normally react with H2O2 to produce HO2, instead reacts with the added H2, producing H to continue the chain reaction. The relation of the chain reaction to the suppression of
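
    A minimal sketch of the kind of kinetic calculation described above is given below. It uses a drastically reduced reaction set, with illustrative G-values and rate constants and a simple radiolytic source term; the species, numbers and reactions are assumptions for illustration only, not the actual FACSIMILE input of the study.

      """Toy kinetic model of water radiolysis with added H2.

      Radiolytic production terms (G-value times dose rate) feed a small set
      of ordinary differential equations built from an illustrative reaction
      set. All G-values and rate constants are placeholders.
      """
      from scipy.integrate import solve_ivp

      DOSE_RATE = 1.0e3                       # Gy/s, cf. the 1 kGy/s case above
      G = {"H2": 1.0e-7, "H2O2": 1.0e-7,      # mol/J, illustrative magnitudes
           "OH": 0.5e-7, "H": 0.5e-7}
      K1 = 4.0e7                              # OH + H2   -> H  + H2O   (L/mol/s)
      K2 = 4.0e7                              # H  + H2O2 -> OH + H2O
      K3 = 3.0e7                              # OH + H2O2 -> HO2 + H2O  (HO2 not tracked further)

      def rhs(t, y):
          h2, h2o2, oh, h = y
          prod = {s: G[s] * DOSE_RATE for s in G}   # radiolytic source terms, mol/(L*s)
          r1 = K1 * oh * h2
          r2 = K2 * h * h2o2
          r3 = K3 * oh * h2o2
          return [prod["H2"] - r1,                  # d[H2]/dt
                  prod["H2O2"] - r2 - r3,           # d[H2O2]/dt
                  prod["OH"] - r1 - r3 + r2,        # d[OH]/dt
                  prod["H"] + r1 - r2]              # d[H]/dt

      def simulate(added_h2_molar):
          y0 = [added_h2_molar, 0.0, 0.0, 0.0]
          return solve_ivp(rhs, (0.0, 10.0), y0, method="LSODA")

      for h2_0 in (0.0, 165e-6):                    # no addition vs. a CHC-level addition
          sol = simulate(h2_0)
          print(f"[H2]0 = {h2_0:.0e} M -> [H2O2](10 s) = {sol.y[1, -1]:.3e} M")

    With the added H2, the OH + H2 and H + H2O2 steps form the suppressing chain sketched in the abstract, so the toy model's H2O2 concentration levels off instead of growing linearly.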

  14. ICRP new recommendations. Committee 2's efforts

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    The International Commission on Radiological Protection (ICRP) may release new primary radiation protection recommendations in 2007. Committee 2 has under way reviews of the dosimetric and biokinetic models and associated data used in calculating dose coefficients for intakes of radionuclides and exposures to external radiation fields. This paper outlines the work plans of Committee 2 during the current term, 2005-2009, in anticipation of the new primary recommendations. The two task groups of Committee 2 responsible for the computation of dose coefficients, INDOS and DOCAL, are reviewing the models and data used in the computations. INDOS is reviewing the lung model and the biokinetic models that describe the behavior of radionuclides in the body. DOCAL is reviewing its computational formulations with the objective of harmonizing them with those of nuclear medicine, and is developing new computational phantoms representing the adult male and female reference individuals of ICRP Publication 89. In addition, DOCAL will issue a publication on nuclear decay data to replace ICRP Publication 38. While the current efforts are focused on updating the dose coefficients for occupational intakes of radionuclides, plans are being formulated to address dose coefficients for external radiation fields, including consideration of the high-energy fields associated with accelerators and space travel, and the updating of dose coefficients for members of the public. (author)

  15. Effort in Multitasking: Local and Global Assessment of Effort.

    Science.gov (United States)

    Kiesel, Andrea; Dignath, David

    2017-01-01

    When performing multiple tasks in succession, self-organization of task order might be superior to externally controlled task schedules, because self-organization allows optimizing processing modes and thus reduces switch costs, and it increases commitment to task goals. However, self-organization is an additional executive control process that is not required if task order is externally specified, and as such it is considered time-consuming and effortful. To compare self-organized and externally controlled task scheduling, we suggest assessing global subjective and objective measures of effort in addition to local performance measures. In our new experimental approach, we combined characteristics of dual-tasking settings and task-switching settings and compared local and global measures of effort in a condition with free choice of task sequence and a condition with cued task sequence. In a multitasking environment, participants chose the task order while the task requirement of the not-yet-performed task remained the same. This task preview allowed participants to work on the previously non-chosen items in parallel and resulted in faster responses and fewer errors in task-switch trials than in task-repetition trials. The free-choice group profited more from this task preview than the cued group when considering local performance measures. Nevertheless, the free-choice group invested more effort than the cued group when considering global measures. Thus, self-organization in task scheduling seems to be effortful even in conditions in which it is beneficial for task processing. In a second experiment, we reduced the possibility of task preview for the not-yet-performed tasks in order to hinder efficient self-organization. Here, neither local nor global measures revealed substantial differences between the free-choice and the cued task sequence condition. Based on the results of both experiments, we suggest that global assessment of effort in addition to

  16. Additional value of computer assisted semen analysis (CASA) compared to conventional motility assessments in pig artificial insemination.

    Science.gov (United States)

    Broekhuijse, M L W J; Soštarić, E; Feitsma, H; Gadella, B M

    2011-11-01

    In order to obtain a more standardised semen motility evaluation, Varkens KI Nederland has introduced a computer assisted semen analysis (CASA) system in all of its pig AI laboratories. The repeatability of CASA was enhanced by standardising: 1) an optimal sample temperature (39 °C); 2) an optimal dilution factor; 3) optimal mixing of semen and dilution buffer, by using mechanical mixing; 4) the slide chamber depth; and, together with the previous points, 5) the optimal training of technicians working with the CASA system; and 6) the use of a standard operating procedure (SOP). Once laboratory technicians were trained in using this SOP, they achieved a low coefficient of variation with CASA. CASA results are preferable because accurate continuous motility data are generated, rather than the 10% motility increments obtained when motility is estimated by laboratory technicians. The higher variability of sperm motility found with CASA and the continuous motility values allow better analysis of the relationship between semen motility characteristics and fertilising capacity. The benefits of standardised CASA for AI are discussed, both with respect to estimating the correct dilution factor of the ejaculate for the production of artificial insemination (AI) doses (critical for reducing the number of sperm per AI dose) and with respect to obtaining more reliable fertility data from these AI doses in return. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. runjags: An R Package Providing Interface Utilities, Model Templates, Parallel Computing Methods and Additional Distributions for MCMC Models in JAGS

    Directory of Open Access Journals (Sweden)

    Matthew J. Denwood

    2016-07-01

    The runjags package provides a set of interface functions to facilitate running Markov chain Monte Carlo models in JAGS from within R. Automated calculation of appropriate convergence and sample length diagnostics, user-friendly access to commonly used graphical outputs and summary statistics, and parallelized methods of running JAGS are provided. Template model specifications can be generated using a standard lme4-style formula interface to assist users less familiar with the BUGS syntax. Automated simulation study functions are implemented to facilitate model performance assessment, as well as drop-k type cross-validation studies, using high performance computing clusters such as those provided by parallel. A module extension for JAGS is also included within runjags, providing the Pareto family of distributions and a series of minimally informative priors including the DuMouchel and half-Cauchy priors. This paper outlines the primary functions of this package and gives an illustration of a simulation study to assess the sensitivity of two equivalent model formulations to different prior distributions.

  18. Additional benefit of 18F-fluorodeoxyglucose integrated positron emission tomography/computed tomography in the staging of oesophageal cancer

    International Nuclear Information System (INIS)

    Gillies, R.S.; Middleton, M.R.; Maynard, N.D.; Bradley, K.M.; Gleeson, F.V.

    2011-01-01

    18F-fluorodeoxyglucose positron emission tomography (FDG PET) has been shown to improve the accuracy of staging in oesophageal cancer. We assessed the benefit of PET/CT over conventional staging and determined if tumour histology had any significant impact on PET/CT findings. A retrospective cohort study, reviewing the results from 200 consecutive patients considered suitable for radical treatment, undergoing routine PET/CT staging, comparing the results from CT and endoscopic ultrasound, as well as multi-disciplinary team records. Adenocarcinoma and squamous cell carcinoma were compared for maximum Standardised Uptake Value (SUVmax), involvement of local lymph nodes and distant metastases. PET/CT provided additional information in 37 patients (18.5%) and directly altered management in 34 (17%): 22 (11%) were upstaged; 15 (7.5%) were downstaged, 12 of whom (6%) received radical treatment. There were 11 false negatives (5.5%) and 1 false positive (0.5%). SUVmax was significantly lower for adenocarcinoma than squamous cell carcinoma (median 9.1 versus 13.5, p = 0.003). Staging with PET/CT offers additional benefit over conventional imaging and should form part of routine staging for oesophageal cancer. Adenocarcinoma and squamous cell carcinoma display significantly different FDG-avidity. (orig.)

  19. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    Science.gov (United States)

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of Smiles Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
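
    A short sketch of the categorical model statistics quoted above (κ, sensitivity, specificity and positive predicted value), computed from a binary confusion matrix; the example counts are made-up placeholders, not the study's data.

      """Binary classification statistics of the kind reported above."""

      def classification_stats(tp: int, fp: int, tn: int, fn: int) -> dict:
          n = tp + fp + tn + fn
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          ppv = tp / (tp + fp)
          p_observed = (tp + tn) / n
          # Chance agreement for Cohen's kappa from the marginal proportions.
          p_chance = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / (n * n)
          kappa = (p_observed - p_chance) / (1.0 - p_chance)
          return {"kappa": kappa, "sensitivity": sensitivity,
                  "specificity": specificity, "ppv": ppv}

      if __name__ == "__main__":
          # Hypothetical counts for a stable/unstable metabolic-stability call.
          stats = classification_stats(tp=450, fp=250, tn=2400, fn=340)
          for name, value in stats.items():
              print(f"{name:12s} {value:.2f}")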

  20. An additional reference axis improves femoral rotation alignment in image-free computer navigation assisted total knee arthroplasty.

    Science.gov (United States)

    Inui, Hiroshi; Taketomi, Shuji; Nakamura, Kensuke; Sanada, Takaki; Tanaka, Sakae; Nakagawa, Takumi

    2013-05-01

    Few studies have demonstrated improved accuracy of rotational alignment using image-free navigation systems, mainly due to inconsistent registration of anatomical landmarks. We have used an image-free navigation system for total knee arthroplasty which adopts the average algorithm between two reference axes (the transepicondylar axis and the axis perpendicular to the Whiteside axis) for femoral component rotation control. We hypothesized that addition of another axis (the condylar twisting axis measured on a preoperative radiograph) would improve the accuracy. One group using the average algorithm (double-axis group) was compared with another group in which the additional axis was used to confirm the accuracy of the average algorithm (triple-axis group). Femoral components were implanted more accurately with respect to rotational alignment in the triple-axis group (ideal: triple-axis group 100%, double-axis group 82%, P<0.05). Copyright © 2013 Elsevier Inc. All rights reserved.
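
    A minimal geometric sketch of averaging two femoral reference axes into a single rotational reference, in the spirit of the "average algorithm" described above; the vectors and the normalize-and-average rule are illustrative assumptions, not the navigation vendor's implementation.

      """Averaging two reference axes into one rotational reference (illustrative)."""
      import numpy as np

      def unit(v: np.ndarray) -> np.ndarray:
          return v / np.linalg.norm(v)

      def average_axis(*axes: np.ndarray) -> np.ndarray:
          """Return the normalized mean direction of the given axes."""
          return unit(np.mean([unit(a) for a in axes], axis=0))

      # Hypothetical axes in an arbitrary femoral coordinate frame.
      tea = unit(np.array([1.0, 0.05, 0.0]))                # transepicondylar axis
      whiteside_perp = unit(np.array([1.0, -0.08, 0.02]))   # perpendicular to Whiteside line

      two_axis_ref = average_axis(tea, whiteside_perp)
      print("two-axis reference direction:", np.round(two_axis_ref, 3))

      # A third, radiographic axis (condylar twisting axis) can be used to
      # cross-check the averaged reference before the femoral cut.
      condylar_twist = unit(np.array([1.0, -0.02, 0.01]))
      angle = np.degrees(np.arccos(np.clip(np.dot(two_axis_ref, condylar_twist), -1.0, 1.0)))
      print(f"deviation from condylar twisting axis: {angle:.1f} deg")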

  1. Efforts by Atomic Energy Society of Japan to improve the public understanding on nuclear power. With an additional review on the present status of nuclear engineering education at universities

    International Nuclear Information System (INIS)

    Nishina, K.; Kudo, K.; Ishigure, K.; Miyazaki, K.; Kimura, I.; Madarame, H.

    1996-01-01

    On a variety of recent public occasions crucial for the progress of the Japanese nuclear fuel cycle, the public has expressed incredulous and reserved attitudes toward further expansion of nuclear power utilization. Typical examples are (1) local town votes on whether to accept a proposed nuclear power plant, which ended with a conclusion against the proposal, and (2) local dissatisfaction expressed against a proposed deep-underground research facility intended to produce cold-simulation data on the behavior of high-level waste nuclides. Realizing that the dissemination of systematic and correct information is indispensable for gaining public understanding of the importance of energy resources and nuclear power, the Educational Committee of the Atomic Energy Society of Japan (AESJ) has initiated various public relations activities since 1994. In the following we sketch these activities, namely: (1) reviews conducted of high school textbooks; (2) a request submitted to the Government for revisions of high-school textbooks and the governmental guidelines defining these textbooks; and (3) preparation of a source book on nuclear energy and radiation. In addition, (4) a review of the present status of nuclear engineering education at universities across the country, with and without nuclear engineering programs, is given. (author)

  2. Estimation of inspection effort

    International Nuclear Information System (INIS)

    Mullen, M.F.; Wincek, M.A.

    1979-06-01

    An overview of IAEA inspection activities is presented, and the problem of evaluating the effectiveness of an inspection is discussed. Two models are described - an effort model and an effectiveness model. The effort model breaks the IAEA's inspection effort into components; the amount of effort required for each component is estimated; and the total effort is determined by summing the effort for each component. The effectiveness model quantifies the effectiveness of inspections in terms of probabilities of detection and quantities of material to be detected, if diverted over a specific period. The method is applied to a 200 metric ton per year low-enriched uranium fuel fabrication facility. A description of the model plant is presented, a safeguards approach is outlined, and sampling plans are calculated. The required inspection effort is estimated and the results are compared to IAEA estimates. Some other applications of the method are discussed briefly. Examples are presented which demonstrate how the method might be useful in formulating guidelines for inspection planning and in establishing technical criteria for safeguards implementation
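
    A minimal numerical sketch of the component-summation idea in the effort model described above; the component names, person-hour figures and inspection frequency are invented placeholders, not IAEA or study estimates.

      """Effort-model sketch: total inspection effort as a sum of components."""

      effort_components = {            # person-hours per inspection, hypothetical
          "records_audit": 16.0,
          "item_counting": 24.0,
          "NDA_measurements": 40.0,
          "sample_taking": 12.0,
          "surveillance_review": 8.0,
      }

      inspections_per_year = 4         # hypothetical frequency for the facility

      per_inspection = sum(effort_components.values())
      annual_effort = per_inspection * inspections_per_year

      for name, hours in effort_components.items():
          print(f"{name:22s} {hours:6.1f} h")
      print(f"{'total per inspection':22s} {per_inspection:6.1f} h")
      print(f"{'total per year':22s} {annual_effort:6.1f} h")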

  3. Effort Estimation in BPMS Migration

    Directory of Open Access Journals (Sweden)

    Christopher Drews

    2018-04-01

    Usually, Business Process Management Systems (BPMS) are highly integrated into the IT of organizations and are at the core of their business. Thus, migrating from one BPMS solution to another is not a common task. However, there are forces that are pushing organizations to perform this step, e.g. maintenance costs of a legacy BPMS or the need for additional functionality. Before the actual migration, the risk and the effort must be evaluated. This work provides a framework for effort estimation regarding the technical aspects of BPMS migration. The framework provides questions for BPMS comparison and an effort evaluation schema. The applicability of the framework is evaluated based on a simplified BPMS migration scenario.
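
    A small sketch of how an effort evaluation schema of this kind might be scored; the comparison questions, weights and ratings below are hypothetical illustrations, not the framework's actual content.

      """Hypothetical scoring of a BPMS-migration effort evaluation schema."""

      questions = [
          # (comparison question, weight, difficulty rating 0 = trivial .. 4 = very hard)
          ("Are process models exchangeable (e.g. BPMN XML export/import)?", 3, 1),
          ("Do service/task connectors have equivalents in the target BPMS?", 2, 3),
          ("Can historical process instances be migrated or archived?",      2, 2),
          ("Are user/role models compatible?",                               1, 2),
          ("Do custom forms and UI extensions need reimplementation?",       3, 4),
      ]

      score = sum(weight * rating for _, weight, rating in questions)
      max_score = sum(weight * 4 for _, weight, _ in questions)
      print(f"effort indicator: {score}/{max_score} "
            f"({100 * score / max_score:.0f}% of worst case)")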

  4. The effect of sleep loss on next day effort.

    Science.gov (United States)

    Engle-Friedman, Mindy; Riela, Suzanne; Golan, Rama; Ventuneac, Ana M; Davis, Christine M; Jefferson, Angela D; Major, Donna

    2003-06-01

    The study had two primary objectives. The first was to determine whether sleep loss results in a preference for tasks demanding minimal effort. The second was to evaluate the quality of performance when participants, under conditions of sleep loss, have control over task demands. In experiment 1, using a repeated-measures design, 50 undergraduate college students were evaluated, following one night of no sleep loss and one night of sleep loss. The Math Effort Task (MET) presented addition problems via computer. Participants were able to select additions at one of five levels of difficulty. Less-demanding problems were selected and more additions were solved correctly when the participants were subject to sleep loss. In experiment 2, 58 undergraduate college students were randomly assigned to a no sleep deprivation or a sleep deprivation condition. Sleep-deprived participants selected less-demanding problems on the MET. Percentage correct on the MET was equivalent for both the non-sleep-deprived and sleep-deprived groups. On a task selection question, the sleep-deprived participants also selected significantly less-demanding non-academic tasks. Increased sleepiness, fatigue, and reaction time were associated with the selection of less difficult tasks. Both groups of participants reported equivalent effort expenditures; sleep-deprived participants did not perceive a reduction in effort. These studies demonstrate that sleep loss results in the choice of low-effort behavior that helps maintain accurate responding.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  6. Telecommunications: Additional Federal Efforts Could Help Advance Digital Television Transition

    Science.gov (United States)

    2002-11-01

    The transition to broadcast digital television (DTV) will provide new television services and the improved picture quality of 'high definition television'. It will also allow some portions of the radiofrequency spectrum used for broadcasting to be returned for public safety and commercial uses. The Congress set December 2006 as the target date for completing the DTV transition and turning off the analog broadcast signals. However, this date can be extended if fewer than 85 percent of households in a market are able to receive the digital signals. GAO (General Accounting Office) was asked to assess issues related to the DTV transition.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. Relationship between lung-to-heart uptake ratio of technetium-99m-tetrofosmin during exercise myocardial single photon emission computed tomographic imaging and the number of diseased coronary arteries in patients with effort angina pectoris without myocardial infarction

    International Nuclear Information System (INIS)

    Okajima, Toshiya; Ueshima, Kenji; Nishiyama, Osamu; Ogawa, Muneyoshi; Ohuchi, Mami; Saitoh, Masahiko; Hiramori, Katsuhiko

    2004-01-01

    Increased lung uptake of thallium-201 in exercise myocardial perfusion imaging is a reliable marker of multivessel disease in patients with ischemic heart disease. This study investigated whether the lung-to-heart uptake ratio with technetium-99m (99mTc)-tetrofosmin also provides valuable information to detect patients with multivessel disease. Fifty-three consecutive patients (35 men, 18 women, mean age 66±11 years; single-vessel disease: 29, double-vessel disease: 16, triple-vessel disease: 8) with stable effort angina pectoris without prior myocardial infarction and 17 control subjects (12 men, 5 women, mean age 62±9 years) underwent exercise myocardial perfusion imaging with 99mTc-tetrofosmin and coronary angiography between January 2000 and December 2002. The lung-to-heart uptake ratio was calculated on an anterior projection before reconstruction of the exercise single photon emission computed tomographic images. The mean lung-to-heart uptake ratio was 0.34±0.04, 0.38±0.07, 0.41±0.05, and 0.46±0.09 in patients with normal coronary arteries, single-vessel disease, double-vessel disease, and triple-vessel disease, respectively; a significantly higher lung-to-heart uptake ratio was associated with more diseased vessels. The lung-to-heart uptake ratio of 99mTc-tetrofosmin can provide clinically useful information to detect multivessel disease in patients with ischemic heart disease. (author)
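
    A small illustrative sketch of computing a lung-to-heart uptake ratio from mean counts in two regions of interest on the anterior projection; the synthetic image, ROI placement and count values are assumptions for illustration, not the study's protocol.

      """Lung-to-heart uptake ratio from an anterior planar projection (illustrative)."""
      import numpy as np

      def mean_counts(image: np.ndarray, roi: tuple) -> float:
          """roi = (row_start, row_stop, col_start, col_stop)."""
          r0, r1, c0, c1 = roi
          return float(image[r0:r1, c0:c1].mean())

      # Synthetic anterior-projection image (counts), for demonstration only.
      rng = np.random.default_rng(0)
      image = rng.poisson(lam=40, size=(128, 128)).astype(float)
      image[60:90, 50:80] += 60          # fake myocardial uptake
      image[20:50, 20:60] += 10          # fake lung background

      lung_roi = (25, 45, 25, 55)        # hypothetical lung-field ROI
      heart_roi = (65, 85, 55, 75)       # hypothetical myocardial ROI

      ratio = mean_counts(image, lung_roi) / mean_counts(image, heart_roi)
      print(f"lung-to-heart uptake ratio: {ratio:.2f}")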

  11. The role of additional computed tomography in the decision-making process on the secondary prevention in patients after systemic cerebral thrombolysis

    Directory of Open Access Journals (Sweden)

    Sobolewski P

    2015-12-01

    Piotr Sobolewski,1 Grzegorz Kozera,2 Wiktor Szczuchniak,1 Walenty M Nyka2 1Department of Neurology and Stroke Unit of Holy Spirit Specialist Hospital in Sandomierz, Sandomierz, Poland; 2Department of Neurology, Medical University of Gdańsk, Gdańsk, Poland. Introduction: Patients with ischemic stroke undergoing intravenous (iv) thrombolysis are routinely controlled with computed tomography on the second day to assess stroke evolution and hemorrhagic transformation (HT). However, the benefits of an additional computed tomography (aCT) performed over the next days after iv-thrombolysis have not been determined. Methods: We retrospectively screened 287 Caucasian patients with ischemic stroke who were consecutively treated with iv-thrombolysis from 2008 to 2012. The results of computed tomography performed on the second (control computed tomography) and seventh (aCT) day after iv-thrombolysis were compared in 274 patients (95.5%); 13 subjects (4.5%) who died before the seventh day from admission were excluded from the analysis. Results: aCT revealed a higher incidence of HT than control computed tomography (14.2% vs 6.6%; P=0.003). Patients with HT on aCT showed a higher median National Institutes of Health Stroke Scale score on admission than those without HT (13.0 vs 10.0; P=0.01) and a higher presence of ischemic changes >1/3 of the middle cerebral artery territory (66.7% vs 35.2%; P<0.01). Correlations existed between the presence of HT on aCT and the National Institutes of Health Stroke Scale score on admission (rpbi 0.15; P<0.01) and the ischemic changes >1/3 of the middle cerebral artery territory (phi=0.03), and the presence of HT on aCT was associated with 3-month mortality (phi=0.03). Conclusion: aCT after iv-thrombolysis enables higher detection of HT, which is related to higher 3-month mortality. Thus, patients with severe middle cerebral artery infarction may benefit from aCT in the decision-making process on secondary prophylaxis. Keywords: ischemic stroke, iv

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  13. SIGMA without effort

    International Nuclear Information System (INIS)

    Hagedorn, R.; Reinfelds, J.

    1978-01-01

    SIGMA (System for Interactive Graphical Analysis) is an interactive computing language with automatic array handling and graphical facilities. It is designed as a tool for mathematical problem solving. The SIGMA language is simple, almost obvious, yet flexible and powerful. This tutorial introduces the beginner to SIGMA. It is meant to be used at a graphics terminal with access to SIGMA. The user will learn the language in dialogue with the system in sixteen sessions of about one hour each. The first session already enables the user to compute and display functions of one or two variables. (Auth.)

  14. Dataset of calcified plaque condition in the stenotic coronary artery lesion obtained using multidetector computed tomography to indicate the addition of rotational atherectomy during percutaneous coronary intervention

    Directory of Open Access Journals (Sweden)

    Yasushi Akutsu

    2016-06-01

    Our data show the regional coronary artery calcium scores (lesion CAC) on multidetector computed tomography (MDCT) and the cross-sectional imaging on MDCT angiography (CTA) in the target lesion of patients with stable angina pectoris who were scheduled for percutaneous coronary intervention (PCI). CAC and CTA data were measured using a 128-slice scanner (Somatom Definition AS+; Siemens Medical Solutions, Forchheim, Germany) before PCI. CAC was measured in a non-contrast-enhanced scan, quantified using the Calcium Score module of the SYNAPSE VINCENT software (Fujifilm Co., Tokyo, Japan), and expressed in Agatston units. CTA was then continued with a contrast-enhanced, ECG-gated scan to measure the severity of the calcified plaque condition. We present both CAC and CTA data to be used as a benchmark when considering the addition of rotational atherectomy during PCI for severely calcified plaque lesions.
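
    A simplified sketch of Agatston-style calcium scoring on one CT slice, as referenced above; the 130 HU threshold and density weighting follow the commonly published Agatston recipe, while the image handling and lesion-grouping step are simplified assumptions rather than the SYNAPSE VINCENT implementation.

      """Simplified Agatston-style calcium scoring for a single CT slice."""
      import numpy as np
      from scipy import ndimage

      HU_THRESHOLD = 130.0

      def density_weight(peak_hu: float) -> int:
          if peak_hu >= 400:
              return 4
          if peak_hu >= 300:
              return 3
          if peak_hu >= 200:
              return 2
          return 1

      def agatston_slice_score(slice_hu: np.ndarray, pixel_area_mm2: float,
                               min_area_mm2: float = 1.0) -> float:
          mask = slice_hu >= HU_THRESHOLD
          labels, n_lesions = ndimage.label(mask)      # group voxels into lesions
          score = 0.0
          for lesion in range(1, n_lesions + 1):
              lesion_mask = labels == lesion
              area = lesion_mask.sum() * pixel_area_mm2
              if area < min_area_mm2:                  # ignore tiny noise specks
                  continue
              peak = slice_hu[lesion_mask].max()
              score += area * density_weight(peak)     # area x density weight per lesion
          return score

      # Synthetic slice with one dense calcified plaque, for demonstration only.
      slice_hu = np.full((64, 64), 40.0)
      slice_hu[30:34, 30:35] = 420.0
      print(f"slice Agatston score: {agatston_slice_score(slice_hu, pixel_area_mm2=0.25):.1f}")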

  15. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. A computational study of the addition of ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone.

    Science.gov (United States)

    Aniagyei, Albert; Tia, Richard; Adei, Evans

    2016-01-01

    The periselectivity and chemoselectivity of the addition of transition metal oxides of the type ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone have been explored at the M06 and B3LYP/LACVP* levels of theory. The activation barriers and reaction energies for the stepwise and concerted addition pathways involving multiple spin states have been computed. In the reaction of ReO3L (L = Cl(-), OCH3, CH3 and Cp) with ethenone, the concerted [2 + 2] addition of the metal oxide across the C=C and C=O double bonds to form either metalla-2-oxetane-3-one or metalla-2,4-dioxolane is the most kinetically favored, over the formation of metalla-2,5-dioxolane-3-one from the direct [3 + 2] addition pathway. The trends in activation and reaction energies for the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are Cp Cp Cp Cp Cp. The direct [2 + 2] addition pathways leading to the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are thermodynamically the most favored for the ligands L = OCH3 and Cl(-). The differences between the calculated [2 + 2] activation barriers for the addition of the metal oxide LReO3 across the C=C and C=O functionalities of ethenone are small, except for the cases of L = Cl(-) and OCH3. The rearrangement of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, even though feasible, is unfavorable due to the high activation energies of its rate-determining steps. For the rearrangement of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, the trend in activation barriers is found to follow the order OCH3 Cp. The trend in the activation energies for the most favorable [2 + 2] addition pathways for the LReO3-ethenone system is CH3 > CH3O(-) > Cl(-) > Cp. For the analogous ethylene-LReO3 system, the trend in activation and reaction energies for the most favorable [3 + 2] addition pathway is CH3 > CH3O(-) > Cl(-) > Cp [10]. Even though the most favored pathway in the ethylene-LReO3 system is

  20. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts monitoring the services and infrastructure, as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  1. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  3. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1982-09-01

    Report III, Volume 1 contains those specifications numbered A through J, as follows: General Specifications (A); Specifications for Pressure Vessels (C); Specifications for Tanks (D); Specifications for Exchangers (E); Specifications for Fired Heaters (F); Specifications for Pumps and Drivers (G); and Specifications for Instrumentation (J). The standard specifications of Bechtel Petroleum Incorporated have been amended as necessary to reflect the specific requirements of the Breckinridge Project, and the more stringent specifications of Ashland Synthetic Fuels, Inc. These standard specifications are available to the Initial Effort (Phase Zero) work performed by all contractors and subcontractors. Report III, Volume 1 also contains the unique specifications prepared for Plants 8, 15, and 27. These specifications will be substantially reviewed during Phase I of the project, and modified as necessary for use during the engineering, procurement, and construction of this project.

  4. Mapping telemedicine efforts

    DEFF Research Database (Denmark)

    Kierkegaard, Patrick

    2015-01-01

    Objectives: The aim of this study is to survey telemedicine services currently in operation across Denmark. The study specifically seeks to answer the following questions: What initiatives are deployed within the different regions? What are the motivations behind the projects? What technologies are being utilized? What medical disciplines are being addressed using telemedicine systems? Methods: All data was surveyed from the "Telemedicinsk Landkort", a newly created database designed to provide a comprehensive and systematic overview of all telemedicine technologies in Denmark. Results: The results of this study suggest that a growing number of telemedicine initiatives are currently in operation across Denmark but that considerable variations existed in terms of regional efforts, as the number of operational telemedicine projects varied from region to region. Conclusions: The results...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation and its components are now deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  6. Effects of cement organic additives on the adsorption of uranyl ions on calcium silicate hydrate phases: experimental determination and computational molecular modelling

    International Nuclear Information System (INIS)

    Androniuk, Iuliia

    2017-01-01

    Cementitious materials are extensively used in the design and construction of radioactive waste repositories. One of the ways to enhance their performance is to introduce organic admixtures into the cement structure. However, the presence of organics in the pore water may affect radionuclide mobility: organic molecules can form water-soluble complexes and compete for sorption sites. This work was designed to gain a detailed understanding of the mechanisms of such interactions at the molecular level. The model system has three components. First, pure C-S-H phases with different Ca/Si ratios were chosen as a cement model. Secondly, gluconate (a simple, well-described molecule) was selected as a good starting organic additive model to probe the interaction mechanisms on the molecular scale; a more complex system involving a poly-carboxylate super-plasticizer (PCE) was also tested. The third component, U(VI), is representative of the actinide radionuclide series. The development of a description of the effects of organics for radioactive waste disposal applications was the primary objective of this work. The study of binary systems provides reference data for the investigation of more complex ternary systems (C-S-H/organic/U(VI)). The interactions are studied by means of both experimental and computational molecular modelling techniques. Data on sorption and desorption kinetics and isotherms for additives and for U(VI) on C-S-H are acquired in this work. In parallel, atomistic models are developed for the interfaces of interest. Structural, energetic, and dynamic aspects of the sorption processes on the cement surface are quantitatively modeled by molecular dynamics techniques. (author)

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  8. ECG-Based Detection of Early Myocardial Ischemia in a Computational Model: Impact of Additional Electrodes, Optimal Placement, and a New Feature for ST Deviation.

    Science.gov (United States)

    Loewe, Axel; Schulze, Walther H W; Jiang, Yuan; Wilhelms, Mathias; Luik, Armin; Dössel, Olaf; Seemann, Gunnar

    2015-01-01

    In case of chest pain, immediate diagnosis of myocardial ischemia is required in order to respond with an appropriate treatment. The diagnostic capability of the electrocardiogram (ECG), however, is strongly limited for ischemic events that do not lead to ST elevation. This computational study investigates the potential of different electrode setups in detecting early ischemia at 10 minutes after onset: standard 3-channel and 12-lead ECG as well as body surface potential maps (BSPMs). Further, it was assessed whether an additional ECG electrode with optimized position, or the right-sided Wilson leads, can improve the sensitivity of the standard 12-lead ECG. To this end, a simulation study was performed for 765 different locations and sizes of ischemia in the left ventricle. Improvements from adding a single, subject-specifically optimized electrode were similar to those of the BSPM: a 2-11% increase in detection rate depending on the desired specificity. Adding right-sided Wilson leads had a negligible effect. Absence of ST deviation could not be related to specific locations of the ischemic region or to its transmurality. As an alternative to the ST time integral as a feature of ST deviation, the K point deviation was introduced: the baseline deviation at the minimum of the ST-segment envelope signal, which increased the 12-lead detection rate by 7% for a reasonable threshold.
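
    An illustrative sketch of a K-point-deviation-style feature, under one plausible reading of the definition above: per-lead deviations from the isoelectric baseline are computed, the lower envelope across leads is formed within the ST window, and the deviation is read at the sample where that envelope is lowest. The sampling rate, window boundaries, baseline estimation and envelope convention are assumptions, not the authors' exact implementation.

      """Sketch of a K-point-deviation-style feature for multi-lead ECG (illustrative)."""
      import numpy as np

      def k_point_deviation(leads: np.ndarray, baseline_idx: slice, st_idx: slice) -> float:
          """leads: array of shape (n_leads, n_samples) in millivolts."""
          baseline = leads[:, baseline_idx].mean(axis=1, keepdims=True)  # per-lead PQ baseline
          deviation = leads - baseline                                   # baseline-corrected signal
          envelope = deviation[:, st_idx].min(axis=0)                    # lower envelope over leads
          k_sample = int(np.argmin(envelope))                            # minimum of the envelope
          return float(envelope[k_sample])

      # Synthetic 12-lead beat (500 Hz, 600 ms), for demonstration only.
      fs, n = 500, 300
      rng = np.random.default_rng(1)
      leads = 0.02 * rng.standard_normal((12, n))
      leads[3, 150:220] -= 0.08          # mild ST depression in one lead

      pq_window = slice(40, 70)          # assumed isoelectric segment
      st_window = slice(150, 220)        # assumed ST segment
      print(f"K point deviation: {k_point_deviation(leads, pq_window, st_window) * 1000:.0f} µV")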

  9. Swedish nuclear waste efforts

    International Nuclear Information System (INIS)

    Rydberg, J.

    1981-09-01

    After the introduction of a law prohibiting the start-up of any new nuclear power plant until the utility had shown that the waste produced by the plant could be taken care of in an absolutely safe way, the Swedish nuclear utilities in December 1976 embarked on the Nuclear Fuel Safety Project, which in November 1977 presented a first report, Handling of Spent Nuclear Fuel and Final Storage of Vitrified Waste (KBS-I), and in November 1978 a second report, Handling and Final Storage of Unreprocessed Spent Nuclear Fuel (KBS II). These summary reports were supported by 120 technical reports prepared by 450 experts. The project engaged 70 private and governmental institutions at a total cost of US $15 million. The KBS-I and KBS-II reports are summarized in this document, as are continued waste research efforts carried out by KBS, SKBF, PRAV, ASEA and other Swedish organizations. The KBS reports describe all steps (except reprocessing) in the handling chain, from the removal of spent fuel elements from a reactor until their radioactive waste products are finally disposed of, in canisters, in an underground granite depository. The KBS concept relies on engineered multibarrier systems in combination with final storage in thoroughly investigated stable geologic formations. This report also briefly describes other activities carried out by the nuclear industry, namely, the construction of a central storage facility for spent fuel elements (to be in operation by 1985), a repository for reactor waste (to be in operation by 1988), and an intermediate storage facility for vitrified high-level waste (to be in operation by 1990). The R and D activities are updated to September 1981

  10. Worldwide effort against smoking.

    Science.gov (United States)

    1986-07-01

    The 39th World Health Assembly, which met in May 1986, recognized the escalating health problem of smoking-related diseases and affirmed that tobacco smoking and its use in other forms are incompatible with the attainment of "Health for All by the Year 2000." If properly implemented, antismoking campaigns can decrease the prevalence of smoking. Nations as a whole must work toward changing smoking habits, and governments must support these efforts by officially stating their stand against smoking. Over 60 countries have introduced legislation affecting smoking. The variety of policies range from adopting a health education program designed to increase peoples' awareness of its dangers to increasing taxes to deter smoking by increasing tobacco prices. Each country must adopt an antismoking campaign which works most effectively within the cultural parameters of the society. Other smoking policies include: printed warnings on cigarette packages; health messages via radio, television, mobile teams, pamphlets, health workers, clinic walls, and newspapers; prohibition of smoking in public areas and transportation; prohibition of all advertisement of cigarettes and tobacco; and the establishment of upper limits of tar and nicotine content in cigarettes. The tobacco industry spends about $2000 million annually on worldwide advertising. According to the World Health Organization (WHO), controlling this overabundance of tobacco advertisements is a major priority in preventing the spread of smoking. Cigarette and tobacco advertising can be controlled to varying degrees, e.g., over a dozen countries have enacted a total ban on advertising on television or radio, a mandatory health warning must accompany advertisements in other countries, and tobacco companies often are prohibited from sponsoring sports events. Imposing a substantial tax on cigarettes is one of the most effective means to deter smoking. However, raising taxes and banning advertisements is not enough because

  11. Analysis Efforts Supporting NSTX Upgrades

    International Nuclear Information System (INIS)

    Zhang, H.; Titus, P.; Rogoff, P.; Zolfaghari, A.; Mangra, D.; Smith, M.

    2010-01-01

    The National Spherical Torus Experiment (NSTX) is a low-aspect-ratio, spherical torus (ST) configuration device located at Princeton Plasma Physics Laboratory (PPPL). This device is presently being upgraded to enhance its physics capabilities by doubling the TF field to 1 Tesla and increasing the plasma current to 2 mega-amperes. The upgrades include a replacement of the centerstack and the addition of a second neutral beam. The upgrade analyses have two missions. The first is to support the design of new components, principally the centerstack; the second is to qualify existing NSTX components for the higher loads, which will increase by a factor of four. Cost efficiency was a design goal both for the qualification of new equipment and for the reanalysis of existing components. Showing that older components can sustain the increased loads has been a challenging effort, in which designs had to be developed that would limit loading on weaker components and would minimize the extent of modifications needed. Two areas representing this effort have been chosen for more detailed description: analysis of the current distribution in the new TF inner legs, and analysis of the out-of-plane support of the existing TF outer legs.

  12. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    The project cogeneration plant supplies electric power, process steam and treated boiler feedwater for use by the project plants. The plant consists of multiple turbine generators and steam generators connected to a common main steam header. The major plant systems required to produce steam, electrical power and treated feedwater are discussed individually. The systems are: steam, steam generator, steam generator fuel, condensate and feedwater deaeration, condensate and blowdown collection, cooling water, boiler feedwater treatment, coal handling, ash handling (fly ash and bottom ash), electrical, and control system. The plant description is based on the Phase Zero design basis established for Plant 31 in July of 1980 and the steam/condensate balance as presented on Drawing 31-E-B-1. Updating of the steam requirements as more refined process information has become available has generated some changes in the steam balance. Boiler operation with these updated requirements is reflected on Drawing 31-D-B-1A. The major impact of the update is that, with less 600 psig steam generated within the process units, more extraction steam is required from the turbine generators to close the 600 psig steam balance. Since the 900 psig steam generation from the boilers was fixed at 1,200,000 lb/hr, the additional extraction steam required to close the 600 psig steam balance decreases the quantity of electrical power available from the turbine generators. In the next phase of engineering work, the production of 600 psig steam will be augmented by increasing convection bank steam generation in the Plant 3 fired heaters by 140,000 to 150,000 lb/hr. This modification will allow full rated power generation from the turbine generators.
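
    The power penalty described above can be illustrated with a rough back-of-the-envelope balance: any shortfall in process-generated 600 psig steam must be made up by turbine extraction, and the generation lost is roughly the extra extraction flow times the enthalpy drop that steam no longer delivers to the turbine. The enthalpy values below are illustrative placeholders, not project design data.

      # Rough 600 psig steam-balance sketch; enthalpies are illustrative placeholders.
      H_EXTRACTION = 1380.0   # assumed enthalpy of 600 psig extraction steam, Btu/lb
      H_EXHAUST = 1050.0      # assumed enthalpy at the turbine exhaust, Btu/lb
      BTU_PER_KWH = 3412.14

      def lost_power_kw(extra_extraction_lb_per_hr: float) -> float:
          """Generation lost because steam is extracted at 600 psig instead of
          expanding all the way to the turbine exhaust."""
          lost_btu_per_hr = extra_extraction_lb_per_hr * (H_EXTRACTION - H_EXHAUST)
          return lost_btu_per_hr / BTU_PER_KWH

      # e.g. a 140,000 lb/hr shortfall in process-generated 600 psig steam
      print(f"Approximate generation lost: {lost_power_kw(140_000.0):,.0f} kW")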

  13. The influence of the speed of the down-ward leader channel in computation of additional charge for protection against direct lightning strike by charge transfer system in 'ultra-corona' mode

    International Nuclear Information System (INIS)

    Talevski, V.

    2012-01-01

    In this paper, the additional charge required for protection against direct lightning strikes by a charge transfer system with a point electrode operating in 'ultra-corona' mode is computed. The influence of the voltage increase over a very small time interval is computed and taken into consideration in the computation of the additional space charge on the object used for protection. The thundercloud model is taken into consideration with all of its electrical charge centres at their corresponding heights above ground. Plots are presented of the speed of the downward leader from the cloud versus the additional space charge that needs to be placed on top of the object protected against direct lightning. Plots are also presented of the additional space charge for different horizontal distances and heights of the protected object. (Authors)

  14. Egg origin determination efforts

    International Nuclear Information System (INIS)

    Horvath, A.; Futo, I.; Vodila, G.; Palcsu, L.

    2012-01-01

    Complete text of publication follows. In co-operation with the Poultry Product Board, egg and drinking water samples were received in order to investigate whether the country of origin of an egg can be determined from its stable isotope composition, with the aim of protecting the market for Hungarian eggs against mislabelled foreign ones. The scientific background is that the drinking water of egg-laying hens is assumed to reflect the composition of regional precipitation, and it is also an input in the process of egg formation. In the first sampling, altogether 23 sets of egg and drinking water samples were received from different production sites covering the whole area of Hungary. The egg white samples were vacuum distilled and the water was frozen out with liquid nitrogen at -196 deg C; the process was monitored by two vacuum gauges. The water frozen out, together with the drinking water samples, was measured by a Thermo Finnigan Delta PLUS XP isotope ratio mass spectrometer using a GasBench II peripheral unit equipped with a GC autosampler. Additionally, elemental analyses of egg shells were performed by energy-dispersive X-ray fluorescence for series of Hungarian, Czech and Polish egg samples. The drinking waters fit well to the Global Meteoric Water Line, indicating their precipitation origin. It was found that the water in egg white becomes enriched compared to the drinking water (Δ18O = -4.9 ± 1.0 per thousand and ΔD = -21.8 ± 6.4 per thousand); however, this shift is independent of the type of hen, since the mean shifts in the eggs of Tetra and Hy-line hens are similar within error bars. For more depleted drinking water, the shift of the egg white was higher than for more enriched ones. This can be due to the contribution of the isotopic composition of the nutriment. The water isotope composition of the Hungarian eggs investigated was δ18O = -4.8 to -7.3 per thousand and δD = -46.0 to -70.7 per thousand, therefore egg
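
    As a rough illustration of the screening logic described above, the sketch below checks how far a water sample lies from the Global Meteoric Water Line (δD ≈ 8·δ18O + 10) and computes the shift of egg-white water relative to the drinking water; the sample values are hypothetical and are not the measured Hungarian data.

      # Illustrative isotope screening sketch; sample values are hypothetical.
      def gmwl_offset(d18o: float, dd: float) -> float:
          """Deviation (per mil) of a water sample from the Global Meteoric
          Water Line, dD = 8 * d18O + 10."""
          return dd - (8.0 * d18o + 10.0)

      def egg_white_shift(egg: tuple, water: tuple) -> tuple:
          """Shift of egg-white water relative to drinking water as
          (delta-18O difference, delta-D difference), in per mil."""
          return egg[0] - water[0], egg[1] - water[1]

      drinking_water = (-9.5, -66.0)   # hypothetical (d18O, dD) in per mil
      egg_white = (-4.8, -46.0)        # hypothetical (d18O, dD) in per mil

      print(f"Offset from GMWL: {gmwl_offset(*drinking_water):+.1f} per mil")
      shift_18o, shift_d = egg_white_shift(egg_white, drinking_water)
      print(f"Egg-white shift: d18O {shift_18o:+.1f} per mil, dD {shift_d:+.1f} per mil")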

  15. Visual cues and listening effort: individual variability.

    Science.gov (United States)

    Picou, Erin M; Ricketts, Todd A; Hornsby, Benjamin W Y

    2011-10-01

    To investigate the effect of visual cues on listening effort as well as whether predictive variables such as working memory capacity (WMC) and lipreading ability affect the magnitude of listening effort. Twenty participants with normal hearing were tested using a paired-associates recall task in 2 conditions (quiet and noise) and 2 presentation modalities (audio only [AO] and auditory-visual [AV]). Signal-to-noise ratios were adjusted to provide matched speech recognition across audio-only and AV noise conditions. Also measured were subjective perceptions of listening effort and 2 predictive variables: (a) lipreading ability and (b) WMC. Objective and subjective results indicated that listening effort increased in the presence of noise, but on average the addition of visual cues did not significantly affect the magnitude of listening effort. Although there was substantial individual variability, on average participants who were better lipreaders or had larger WMCs demonstrated reduced listening effort in noise in AV conditions. Overall, the results support the hypothesis that integrating auditory and visual cues requires cognitive resources in some participants. The data indicate that low lipreading ability or low WMC is associated with relatively effortful integration of auditory and visual information in noise.

  16. Standardization efforts in IP telephony

    Science.gov (United States)

    Sengodan, Senthil; Bansal, Raj

    1999-11-01

    The recent interest in IP telephony has led to a tremendous increase of standardization activities in the area. The three main standards bodies in the area of IP telephony are the International Telecommunication Union's (ITU-T) Study Group (SG) 16, the Internet Engineering Task Force (IETF) and the European Telecommunication Standards Institute's (ETSI) TIPHON project. In addition, forums such as the International Multimedia Teleconferencing Consortium (IMTC), the Intelligent Network Forum (INF), the International Softswitch Consortium (ISC), the Electronic Computer Telephony Forum (ECTF), and MIT's Internet Telephony Consortium (ITC) are looking into various other aspects that aim at the growth of this industry. This paper describes the main tasks (completed and in progress) undertaken by these organizations. In describing such work, an overview of the underlying technology is also provided.

  17. Environmental Determinants of Lexical Processing Effort

    OpenAIRE

    McDonald, Scott

    2000-01-01

    Institute for Adaptive and Neural Computation. A central concern of psycholinguistic research is explaining the relative ease or difficulty involved in processing words. In this thesis, we explore the connection between lexical processing effort and measurable properties of the linguistic environment. Distributional information (information about a word's contexts of use) is easily extracted from large language corpora in the form of co-occurrence statistics. We claim that su...
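
    As a minimal illustration of extracting such distributional information, the sketch below counts co-occurrences within a symmetric window; the toy corpus and window size are arbitrary choices for illustration and are not taken from the thesis.

      # Minimal co-occurrence counting sketch; corpus and window size are arbitrary.
      from collections import Counter, defaultdict

      def cooccurrence_counts(sentences, window=2):
          """Count how often each context word appears within +/- `window`
          positions of each target word."""
          counts = defaultdict(Counter)
          for sentence in sentences:
              tokens = sentence.lower().split()
              for i, target in enumerate(tokens):
                  lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                  for j in range(lo, hi):
                      if j != i:
                          counts[target][tokens[j]] += 1
          return counts

      corpus = ["the doctor examined the patient", "the nurse helped the doctor"]
      print(cooccurrence_counts(corpus)["doctor"].most_common(3))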

  18. Supercomputer and cluster performance modeling and analysis efforts:2004-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold (Hal) Edward; Stevenson, Joel O.; Benner, Robert E., Jr.; Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  19. Cognitive effort: A neuroeconomic approach

    Science.gov (United States)

    Braver, Todd S.

    2015-01-01

    Cognitive effort has been implicated in numerous theories regarding normal and aberrant behavior and the physiological response to engagement with demanding tasks. Yet, despite broad interest, no unifying, operational definition of cognitive effort itself has been proposed. Here, we argue that the most intuitive and epistemologically valuable treatment is in terms of effort-based decision-making, and advocate a neuroeconomics-focused research strategy. We first outline psychological and neuroscientific theories of cognitive effort. Then we describe the benefits of a neuroeconomic research strategy, highlighting how it affords greater inferential traction than do traditional markers of cognitive effort, including self-reports and physiologic markers of autonomic arousal. Finally, we sketch a future series of studies that can leverage the full potential of the neuroeconomic approach toward understanding the cognitive and neural mechanisms that give rise to phenomenal, subjective cognitive effort. PMID:25673005

  20. Hydrogen economy: a little bit more effort

    International Nuclear Information System (INIS)

    Pauron, M.

    2008-01-01

    In a few years, the use of hydrogen in the economy has become a credible possibility. Today, billions of euros are invested in the hydrogen industry, which is strengthened by technological advances in fuel cell development and by an increasing optimism. However, additional research efforts and more financing will be necessary to make the dream of a hydrogen-based economy a reality.

  1. The value of indicated computed tomography scan of the chest and abdomen in addition to the conventional radiologic work-up for blunt trauma patients.

    NARCIS (Netherlands)

    Deunk, J.; Dekker, H.M.; Brink, M.; Vugt, R. van; Edwards, M.J.R.; Vugt, A.B. van

    2007-01-01

    BACKGROUND: Multidetector computed tomography (CT) is more sensitive and specific in detecting traumatic injuries than conventional radiology is. However, still little is known about the diagnostic value and the therapeutic impact of indicated thoraco-abdominal CT scan when it is performed in

  2. Comparative assessment of world research efforts on magnetic confinement fusion

    International Nuclear Information System (INIS)

    McKenney, B.L.; McGrain, M.; Rutherford, P.H.

    1990-02-01

    This report presents a comparative assessment of the world's four major research efforts on magnetic confinement fusion, including a comparison of the capabilities of the Soviet Union, the European Community (Western Europe), Japan, and the United States. A comparative evaluation is provided in six areas, including tokamak confinement, alternate confinement approaches, plasma technology and engineering, and fusion computations. The panel members are actively involved in fusion-related research and have extensive experience in previous assessments and reviews of the world's four major fusion programs. Although the world's four major fusion efforts are roughly comparable in overall capabilities, two conclusions of this report are inescapable. First, the Soviet fusion effort is presently the weakest of the four programs in most areas of the assessment. Second, if present trends continue, the United States, once unambiguously the world leader in fusion research, will soon lose its position of leadership to the West European and Japanese fusion programs. Indeed, before the middle 1990s, the upgraded large-tokamak facilities, JT-60U (Japan) and JET (Western Europe), are likely to explore plasma conditions and operating regimes well beyond the capabilities of the TFTR tokamak (United States). In addition, if present trends continue in the areas of fusion nuclear technology and materials and plasma technology development, the capabilities of Japan and Western Europe in these areas (both with regard to test facilities and fusion-specific industrial capabilities) will surpass those of the United States by a substantial margin before the middle 1990s.

  3. Multidisciplinary Efforts Driving Translational Theranostics

    Science.gov (United States)

    Hu, Tony Y.

    2014-01-01

    This themed issue summarizes significant efforts aimed at using “biological language” to discern between “friends” and “foes” in the context of theranostics for true clinical application. It is expected that the success of theranostics depends on multidisciplinary efforts, combined to expedite our understanding of host responses to “customized” theranostic agents and formulating individualized therapies. PMID:25285169

  4. Learning Environment and Student Effort

    Science.gov (United States)

    Hopland, Arnt O.; Nyhus, Ole Henning

    2016-01-01

    Purpose: The purpose of this paper is to explore the relationship between satisfaction with learning environment and student effort, both in class and with homework assignments. Design/methodology/approach: The authors use data from a nationwide and compulsory survey to analyze the relationship between learning environment and student effort. The…

  5. Respiratory effort from the photoplethysmogram.

    Science.gov (United States)

    Addison, Paul S

    2017-03-01

    The potential for a simple, non-invasive measure of respiratory effort based on the pulse oximeter signal - the photoplethysmogram or 'pleth' - was investigated in a pilot study. Several parameters were developed based on a variety of manifestations of respiratory effort in the signal, including modulation changes in amplitude, baseline, frequency and pulse transit times, as well as distinct baseline signal shifts. Thirteen candidate parameters were investigated using data from healthy volunteers. Each volunteer underwent a series of controlled respiratory effort maneuvers at various set flow resistances and respiratory rates. Six oximeter probes were tested at various body sites. In all, over three thousand pleth-based effort-airway pressure (EP) curves were generated across the various airway constrictions, respiratory efforts, respiratory rates, subjects, probe sites, and candidate parameters considered. Regression analysis was performed to determine the existence of positive monotonic relationships between the respiratory effort parameters and the resulting airway pressures. Six of the candidate parameters investigated exhibited a distinct positive relationship; these included a parameter derived using a pulse oximeter probe and an ECG (P2E-Effort), another using two pulse oximeter probes placed at different peripheral body sites (P2-Effort), and baseline shifts in heart rate (BL-HR-Effort). In conclusion, a clear monotonic relationship was found between several pleth-based parameters and imposed respiratory loadings at the mouth across a range of respiratory rates and flow constrictions. The results suggest that the pleth may provide a measure of changing upper airway dynamics indicative of the effort to breathe. Copyright © 2017 The Author. Published by Elsevier Ltd. All rights reserved.
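
    The monotonicity screening described above can be sketched with a rank correlation between a candidate parameter and the imposed airway pressure across maneuvers. The data below are synthetic and the actual EP-curve analysis in the study is more involved; this only shows the shape of such a check.

      # Rank-correlation screening sketch; data are synthetic, not study data.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      airway_pressure = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])   # imposed loads, cmH2O
      # Synthetic candidate parameter: grows with effort, plus measurement noise.
      pleth_parameter = 0.02 * airway_pressure + rng.normal(0.0, 0.01, airway_pressure.size)

      rho, p_value = spearmanr(airway_pressure, pleth_parameter)
      print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
      print("positive monotonic" if rho > 0 and p_value < 0.05 else "no clear relationship")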

  6. Additive manufacturing.

    Science.gov (United States)

    Mumith, A; Thomas, M; Shah, Z; Coathup, M; Blunn, G

    2018-04-01

    Increasing innovation in rapid prototyping (RP) and additive manufacturing (AM), also known as 3D printing, is bringing about major changes in translational surgical research. This review describes the current position in the use of additive manufacturing in orthopaedic surgery. Cite this article: Bone Joint J 2018;100-B:455-60.

  7. Overview of NASA/OAST efforts related to manufacturing technology

    Science.gov (United States)

    Saunders, N. T.

    1976-01-01

    An overview of some of NASA's current efforts related to manufacturing technology and some possible directions for the future are presented. The topics discussed are: computer-aided design, composite structures, and turbine engine components.

  8. Effort rights-based management

    DEFF Research Database (Denmark)

    Squires, Dale; Maunder, Mark; Allen, Robin

    2017-01-01

    Effort rights-based fisheries management (RBM) is less widely used than catch rights, whether for groups or individuals. Because RBM on catch or effort necessarily requires a total allowable catch (TAC) or total allowable effort (TAE), RBM is discussed in conjunction with issues in assessing fish...... populations and providing TACs or TAEs. Both approaches have advantages and disadvantages, and there are trade-offs between the two approaches. In a narrow economic sense, catch rights are superior because of the type of incentives created, but once the costs of research to improve stock assessments...

  9. Global Data Grid Efforts for ATLAS

    CERN Multimedia

    Gardner, R.

    2001-01-01

    Over the past two years computational data grids have emerged as a promising new technology for large scale, data-intensive computing required by the LHC experiments, as outlined by the recent "Hoffman" review panel that addressed the LHC computing challenge. The problem essentially is to seamlessly link physicists to petabyte-scale data and computing resources, distributed worldwide, and connected by high-bandwidth research networks. Several new collaborative initiatives in Europe, the United States, and Asia have formed to address the problem. These projects are of great interest to ATLAS physicists and software developers since their objective is to offer tools that can be integrated into the core ATLAS application framework for distributed event reconstruction, Monte Carlo simulation, and data analysis, making it possible for individuals and groups of physicists to share information, data, and computing resources in new ways and at scales not previously attempted. In addition, much of the distributed IT...

  10. Pedigree-based estimation of covariance between dominance deviations and additive genetic effects in closed rabbit lines considering inbreeding and using a computationally simpler equivalent model.

    Science.gov (United States)

    Fernández, E N; Legarra, A; Martínez, R; Sánchez, J P; Baselga, M

    2017-06-01

    Inbreeding generates covariances between additive and dominance effects (breeding values and dominance deviations). In this work, we developed and applied a model for the estimation of dominance and additive genetic variances and their covariance, a model that we call "full dominance," from pedigree and phenotypic data. Estimates from this model, such as those presented here, are very scarce in both livestock and wild genetics. First, we estimated pedigree-based condensed probabilities of identity using recursion. Second, we developed an equivalent linear model in which variance components can be estimated using standard algorithms such as REML or Gibbs sampling and existing software. Third, we present a new method to refer the estimated variance components to meaningful parameters in a particular population, i.e., final partially inbred generations as opposed to outbred base populations. We applied these developments to three closed rabbit lines (A, V and H) selected for number weaned at the Polytechnic University of Valencia. Pedigrees and phenotypes are complete and span 43, 39 and 14 generations, respectively. Estimates of broad-sense heritability are 0.07, 0.07 and 0.05 at the base versus 0.07, 0.07 and 0.09 in the final generations. Narrow-sense heritability estimates are 0.06, 0.06 and 0.02 at the base versus 0.04, 0.04 and 0.01 in the final generations. There is also a reduction in the genotypic variance due to the negative additive-dominance correlation. Thus, the contribution of dominance variation is fairly large, increases with inbreeding and (over)compensates for the loss in additive variation. In addition, estimates of the additive-dominance correlation are -0.37, -0.31 and 0.00, in agreement with the few published estimates and theoretical considerations. © 2017 Blackwell Verlag GmbH.
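
    A flavour of the pedigree recursion can be given with the standard tabular method for the additive relationship matrix, from which inbreeding coefficients follow as F_i = A_ii - 1. This sketch covers only the additive part; the condensed identity coefficients needed for the dominance terms in the paper require a more elaborate recursion.

      # Tabular-method sketch: additive relationship matrix A and inbreeding
      # coefficients F from a pedigree (additive part only).
      def additive_relationship(pedigree):
          """pedigree: list of (sire, dam) index pairs with parents listed
          before their offspring; None means unknown parent."""
          n = len(pedigree)
          A = [[0.0] * n for _ in range(n)]
          for i, (s, d) in enumerate(pedigree):
              A[i][i] = 1.0 + (0.5 * A[s][d] if s is not None and d is not None else 0.0)
              for j in range(i):
                  a_js = A[j][s] if s is not None else 0.0
                  a_jd = A[j][d] if d is not None else 0.0
                  A[i][j] = A[j][i] = 0.5 * (a_js + a_jd)
          F = [A[i][i] - 1.0 for i in range(n)]
          return A, F

      # 0 and 1 are unrelated founders, 2 and 3 are full sibs, 4 is their offspring.
      _, F = additive_relationship([(None, None), (None, None), (0, 1), (0, 1), (2, 3)])
      print([round(f, 3) for f in F])   # the full-sib mating gives F = 0.25 for animal 4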

  11. Food additives

    Science.gov (United States)


  12. Pandemic Influenza: Domestic Preparedness Efforts

    National Research Council Canada - National Science Library

    Lister, Sarah A

    2005-01-01

    .... Though influenza pandemics occur with some regularity, and the United States has been involved in specific planning efforts since the early 1990s, the H5N1 situation has created a sense of urgency...

  13. Additivity of Factor Effects in Reading Tasks Is Still a Challenge for Computational Models: Reply to Ziegler, Perry, and Zorzi (2009)

    Science.gov (United States)

    Besner, Derek; O'Malley, Shannon

    2009-01-01

    J. C. Ziegler, C. Perry, and M. Zorzi (2009) have claimed that their connectionist dual process model (CDP+) can simulate the data reported by S. O'Malley and D. Besner. Most centrally, they have claimed that the model simulates additive effects of stimulus quality and word frequency on the time to read aloud when words and nonwords are randomly…

  14. A new combined computational and NMR-spectroscopical strategy for the identification of additional conformational constraints of the bound ligand in an aprotic solvent

    NARCIS (Netherlands)

    Vliegenthart, J.F.G.; Siebert, H.-C; André, S.; Asensio, J.l.; Cañada, F.J.; Dong, X.; Espinosa, M.; Frank, M.

    2000-01-01

    This study documents the feasibility of switching to an aprotic medium in sugar receptor research. The solvent change offers additional insights into mechanistic details of receptor-carbohydrate ligand interactions. If a receptor retained binding capacity in an aprotic medium, solvent-exchangeable

  15. Calcul des efforts de deuxième ordre à très haute fréquence sur des plates-formes à lignes tendues Computing High-Frequency Second Order Loads on Tension Leg Platforms

    Directory of Open Access Journals (Sweden)

    Chen X.

    2006-11-01

    Full Text Available The problem considered here is the evaluation of the second-order exciting loads (in the sum-frequency mode, i.e. occurring at the pairwise sums of the wave frequencies) on tension leg platforms. These loads are held responsible for resonant behaviour (in roll, pitch and heave) observed during basin tests, and they could appreciably reduce the fatigue life of the tethers. Results are first presented for a simplified structure consisting of four vertical cylinders resting on the sea floor. The interest of this geometry is that all the computations can be carried through in a quasi-analytical way. The results obtained illustrate the high degree of interaction between the columns and the slow decay of the second-order diffraction potential with depth. Results are then presented for an actual platform, the Snorre TLP. Tension Leg Platforms (TLPs) are now regarded as a promising technology for the development of deep offshore fields. As the water depth increases, however, their natural periods of heave, roll and pitch tend to increase as well (roughly to the one-half power), and it is not clear yet what the maximum permissible values for these natural periods can be. For the Snorre TLP, for instance, they are only about 2.5 seconds, which seems to be sufficiently low since there is very limited free wave energy at such periods. Model tests, however, have shown some resonant response in sea states with peak periods of about 5 seconds. Often referred to as 'springing', this resonant motion can severely affect the fatigue life of tethers and increase their design loads. In order to calculate this springing motion at the design stage, it is necessary to identify and evaluate both the exciting loads and the mechanisms of energy dissipation. With the help of the French Norwegian Foundation a joint effort was
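
    Schematically, once a sum-frequency quadratic transfer function (QTF) is available, the second-order exciting load is reconstructed as a double sum over pairs of wave components. The QTF values, amplitudes and phases below are placeholders; obtaining the actual QTF is precisely the hard computation addressed in the paper.

      # Schematic sum-frequency load from a quadratic transfer function (QTF);
      # QTF values, wave amplitudes and phases are placeholders.
      import numpy as np

      omega = np.array([0.9, 1.0, 1.1])    # wave frequencies, rad/s
      amp = np.array([1.2, 1.5, 0.8])      # wave amplitudes, m
      phase = np.array([0.0, 0.7, 1.9])    # wave phases, rad
      qtf = np.array([[1.0 + 0.2j, 0.8 + 0.1j, 0.6 + 0.3j],
                      [0.8 + 0.1j, 0.9 + 0.2j, 0.7 + 0.1j],
                      [0.6 + 0.3j, 0.7 + 0.1j, 0.5 + 0.4j]])   # hypothetical QTF

      def sum_frequency_load(t: float) -> float:
          """Second-order sum-frequency exciting load at time t."""
          load = 0.0
          for i in range(omega.size):
              for j in range(omega.size):
                  w = omega[i] + omega[j]
                  ph = phase[i] + phase[j]
                  load += amp[i] * amp[j] * np.real(qtf[i, j] * np.exp(1j * (w * t + ph)))
          return load

      print([round(sum_frequency_load(t), 2) for t in (0.0, 1.0, 2.0)])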

  16. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  17. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  18. Computer modeling of inhibition of α-radiolysis of water by H2 addition (9. International Workshop on Radiolysis, Electrochemistry and Materials Performance)

    International Nuclear Information System (INIS)

    Lertnaisat, Phantira; Katsumura, Yosuke; Mukai, Satoru; Umehara, Ryuji; Shimizu, Yuichi; Suzuki, Masaru

    2012-09-01

    It is known that α-radiolysis of water produces H2 gas continuously. The addition of H2 to water inhibits the water decomposition (H2 evolution). In order to suppress the water decomposition, 25 cc H2 (STP)/kg-H2O is added to the coolant water in PWRs. However, the exact inhibition mechanism has not yet been made clear. In this project, the chemical kinetics simulation program FACSIMILE was used to reproduce the suppression of α-radiolysis of water by H2 addition. Using three important factors (the decomposition yields (G-values), the reaction set with its rate constants, and the dose rate), it is found that without hydrogen addition the simulation shows an almost linear increase of the molecular products H2, H2O2 and O2. Nevertheless, as additional hydrogen is added to the system, this linear increase is shifted to longer times. Above a certain concentration, the linear increase is completely suppressed and the molecular products reach a steady state at early times and at much lower concentrations. The minimum concentration of H2 which can completely suppress the decomposition of water is called the Critical Hydrogen Concentration (CHC), and it is a dose-rate-dependent value. The CHC is found to depend on the reaction set and rate constants. The simulation results show that the CHC at room temperature and a dose rate of 1 kGy/s, using the reaction sets and rate constants obtained from Ershov et al. and from the AECL report of 2009, is 165 μM and 146 μM, respectively. From the change in the behaviour of the molecular products after reaching the CHC, a possible mechanism is proposed. First, OH radicals are formed via the reactions H + H2O2 → OH + H2O and e-(aq) + H2O2 → OH + OH-. Then OH, which would normally react with H2O2 to produce HO2, instead reacts with the additional H2, producing H and continuing the chain reaction. The relation of the chain reaction to the suppression of
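
    The way such a kinetic simulation is set up can be illustrated with a drastically reduced scheme: radiolytic source terms proportional to dose rate and G-values, plus a handful of reactions integrated as ordinary differential equations. The species set, G-values and rate constants below are placeholders chosen only to show the structure, not the FACSIMILE reaction set used in the study, and the behaviour is qualitative at best.

      # Drastically reduced radiolysis toy model; G-values and rate constants
      # are illustrative placeholders, not the full reaction set.
      from scipy.integrate import solve_ivp

      DOSE_RATE = 1.0e3                  # Gy/s
      G = {"H2": 1.3e-7, "H2O2": 1.0e-7, "OH": 0.5e-7, "H": 0.5e-7}  # mol/(L*Gy), placeholders
      K1 = 4.0e7                         # OH + H2   -> H  + H2O   (1/M/s, placeholder)
      K2 = 5.0e7                         # H  + H2O2 -> OH + H2O   (1/M/s, placeholder)
      K3 = 3.0e7                         # OH + H2O2 -> HO2 + H2O  (1/M/s, placeholder)

      def rates(t, y):
          h2, h2o2, oh, h = y
          r1, r2, r3 = K1 * oh * h2, K2 * h * h2o2, K3 * oh * h2o2
          return [G["H2"] * DOSE_RATE - r1,
                  G["H2O2"] * DOSE_RATE - r2 - r3,
                  G["OH"] * DOSE_RATE + r2 - r1 - r3,
                  G["H"] * DOSE_RATE + r1 - r2]

      for h2_added in (0.0, 150e-6):     # no added H2 vs ~150 uM added H2
          sol = solve_ivp(rates, (0.0, 1.0), [h2_added, 0.0, 0.0, 0.0],
                          method="LSODA", rtol=1e-8, atol=1e-12)
          print(f"added H2 = {h2_added * 1e6:5.0f} uM -> H2O2 after 1 s = {sol.y[1, -1] * 1e6:.1f} uM")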

  19. Dopamine, behavioral economics, and effort

    Directory of Open Access Journals (Sweden)

    John D Salamone

    2009-09-01

    Full Text Available Abstract. There are numerous problems with the hypothesis that brain dopamine (DA systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Research and theory related to the functions of mesolimbic DA are undergoing a substantial conceptual restructuring, with the traditional emphasis on hedonia and primary reward yielding to other concepts and lines of inquiry. The present review is focused upon the involvement of nucleus accumbens DA in behavioral activation and effort-related processes. Viewed from the framework of behavioral economics, the effects of accumbens DA depletions and antagonism on food-reinforced behavior are highly dependent upon the work requirements of the instrumental task, and DA depleted rats are more sensitive to increases in response costs (i.e., ratio requirements. Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related choice behavior. Rats with accumbens DA depletions or antagonism reallocate their instrumental behavior away from food-reinforced tasks that have high response requirements, and instead these rats select a less-effortful type of food-seeking behavior. Nucleus accumbens DA and adenosine interact in the regulation of effort-related functions, and other brain structures (anterior cingulate cortex, amygdala, ventral pallidum also are involved. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue or anergia in depression and other neurological disorders.

  20. Maximum effort in the minimum-effort game

    Czech Academy of Sciences Publication Activity Database

    Engelmann, Dirk; Normann, H.-T.

    2010-01-01

    Roč. 13, č. 3 (2010), s. 249-259 ISSN 1386-4157 Institutional research plan: CEZ:AV0Z70850503 Keywords : minimum-effort game * coordination game * experiments * social capital Subject RIV: AH - Economics Impact factor: 1.868, year: 2010

  1. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  2. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  3. Additional benefit of 18F-fluorodeoxyglucose integrated positron emission tomography/computed tomography in the staging of oesophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Gillies, R.S. [Oxford Cancer and Haematology Centre, Churchill Hospital, Department of Medical Oncology, Oxford (United Kingdom); Oxford Cancer and Haematology Centre, Churchill Hospital, Department of Oesophagogastric Surgery, Oxford (United Kingdom); Middleton, M.R. [Oxford Cancer and Haematology Centre, Churchill Hospital, Department of Medical Oncology, Oxford (United Kingdom); Maynard, N.D. [Oxford Cancer and Haematology Centre, Churchill Hospital, Department of Oesophagogastric Surgery, Oxford (United Kingdom); Bradley, K.M.; Gleeson, F.V. [Oxford Cancer and Haematology Centre, Churchill Hospital, Department of Radiology, Oxford (United Kingdom)

    2011-02-15

    18F-fluorodeoxyglucose positron emission tomography (FDG PET) has been shown to improve the accuracy of staging in oesophageal cancer. We assessed the benefit of PET/CT over conventional staging and determined if tumour histology had any significant impact on PET/CT findings. A retrospective cohort study, reviewing the results from 200 consecutive patients considered suitable for radical treatment, undergoing routine PET/CT staging comparing the results from CT and endoscopic ultrasound, as well as multi-disciplinary team records. Adenocarcinoma and squamous cell carcinoma were compared for maximum Standardised Uptake Value (SUVmax), involvement of local lymph nodes and distant metastases. PET/CT provided additional information in 37 patients (18.5%) and directly altered management in 34 (17%): 22 (11%) were upstaged; 15 (7.5%) were downstaged, 12 of whom (6%) received radical treatment. There were 11 false negatives (5.5%) and 1 false positive (0.5%). SUVmax was significantly lower for adenocarcinoma than squamous cell carcinoma (median 9.1 versus 13.5, p = 0.003). Staging with PET/CT offers additional benefit over conventional imaging and should form part of routine staging for oesophageal cancer. Adenocarcinoma and squamous cell carcinoma display significantly different FDG-avidity. (orig.)

  4. Effort problem of chemical pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okrajni, J.; Ciesla, M.; Mutwil, K. [Silesian Technical University, Katowice (Poland)

    1998-12-31

    The paper addresses the problem of assessing the technical state of chemical pipelines working under mechanical and thermal loading. The effort of the pipelines after a long operating period is analysed. The material, geometrical and loading conditions governing crack initiation and crack growth in the chosen object are discussed. Areas of maximal effort are determined. The changes in the material structure after the long operating period are described. The mechanisms of crack initiation and crack growth in the pipeline elements are analysed, and the mutual relations between the chemical and mechanical influences are shown. (orig.) 16 refs.

  5. Competition for marine space: modelling the Baltic Sea fisheries and effort displacement under spatial restrictions

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Eigaard, Ole Ritzau

    2015-01-01

    DISPLACE model) to combine stochastic variations in spatial fishing activities with harvested resource dynamics in scenario projections. The assessment computes economic and stock status indicators by modelling the activity of Danish, Swedish, and German vessels (≥12 m) in the international western Baltic Sea commercial fishery, together with the underlying size-based distribution dynamics of the main fishery resources of sprat, herring, and cod. The outcomes of alternative scenarios for spatial effort displacement are exemplified by evaluating the fishers' abilities to adapt to spatial plans under various constraints. Interlinked spatial, technical, and biological dynamics of vessels and stocks in the scenarios result in stable profits, which compensate for the additional costs from effort displacement and release pressure on the fish stocks. The effort is further redirected away from sensitive

  6. Building a Science Software Institute: Synthesizing the Lessons Learned from the ISEES and WSSI Software Institute Conceptualization Efforts

    Science.gov (United States)

    Idaszak, R.; Lenhardt, W. C.; Jones, M. B.; Ahalt, S.; Schildhauer, M.; Hampton, S. E.

    2014-12-01

    The NSF, in an effort to support the creation of sustainable science software, funded 16 science software institute conceptualization efforts. The goal of these conceptualization efforts is to explore approaches to creating the institutional, sociological, and physical infrastructures to support sustainable science software. This paper will present the lessons learned from two of these conceptualization efforts, the Institute for Sustainable Earth and Environmental Software (ISEES - http://isees.nceas.ucsb.edu) and the Water Science Software Institute (WSSI - http://waters2i2.org). ISEES is a multi-partner effort led by National Center for Ecological Analysis and Synthesis (NCEAS). WSSI, also a multi-partner effort, is led by the Renaissance Computing Institute (RENCI). The two conceptualization efforts have been collaborating due to the complementarity of their approaches and given the potential synergies of their science focus. ISEES and WSSI have engaged in a number of activities to address the challenges of science software such as workshops, hackathons, and coding efforts. More recently, the two institutes have also collaborated on joint activities including training, proposals, and papers. In addition to presenting lessons learned, this paper will synthesize across the two efforts to project a unified vision for a science software institute.

  7. Reproductive effort in viscous populations

    NARCIS (Netherlands)

    Pen, Ido

    Here I study a kin selection model of reproductive effort, the allocation of resources to fecundity versus survival, in a patch-structured population. Breeding females remain in the same patch for life. Offspring have costly, partial long-distance dispersal and compete for breeding sites, which

  8. Toward a Rational and Mechanistic Account of Mental Effort.

    Science.gov (United States)

    Shenhav, Amitai; Musslick, Sebastian; Lieder, Falk; Kool, Wouter; Griffiths, Thomas L; Cohen, Jonathan D; Botvinick, Matthew M

    2017-07-25

    In spite of its familiar phenomenology, the mechanistic basis for mental effort remains poorly understood. Although most researchers agree that mental effort is aversive and stems from limitations in our capacity to exercise cognitive control, it is unclear what gives rise to those limitations and why they result in an experience of control as costly. The presence of these control costs also raises further questions regarding how best to allocate mental effort to minimize those costs and maximize the attendant benefits. This review explores recent advances in computational modeling and empirical research aimed at addressing these questions at the level of psychological process and neural mechanism, examining both the limitations to mental effort exertion and how we manage those limited cognitive resources. We conclude by identifying remaining challenges for theoretical accounts of mental effort as well as possible applications of the available findings to understanding the causes of and potential solutions for apparent failures to exert the mental effort required of us.

  9. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  10. Private Speech Moderates the Effects of Effortful Control on Emotionality

    Science.gov (United States)

    Day, Kimberly L.; Smith, Cynthia L.; Neal, Amy; Dunsmore, Julie C.

    2018-01-01

    Research Findings: In addition to being a regulatory strategy, children's private speech may enhance or interfere with their effortful control used to regulate emotion. The goal of the current study was to investigate whether children's private speech during a selective attention task moderated the relations of their effortful control to their…

  11. Voluntary versus Enforced Team Effort

    Directory of Open Access Journals (Sweden)

    Claudia Keser

    2011-08-01

    Full Text Available We present a model where each of two players chooses between remuneration based on either private or team effort. Although at least one of the players has the equilibrium strategy to choose private remuneration, we frequently observe both players to choose team remuneration in a series of laboratory experiments. This allows for high cooperation payoffs but also provides individual free-riding incentives. Due to significant cooperation, we observe that, in team remuneration, participants make higher profits than in private remuneration. We also observe that, when participants are not given the option of private remuneration, they cooperate significantly less.

  12. Additive manufacturing technology in reconstructive surgery.

    Science.gov (United States)

    Fuller, Scott C; Moore, Michael G

    2016-10-01

    Technological advances have been part and parcel of modern reconstructive surgery, in that practitioners of this discipline are continually looking for innovative ways to perfect their craft and improve patient outcomes. We are currently in a technological climate wherein advances in computers, imaging, and science have coalesced with resulting innovative breakthroughs that are not merely limited to improved outcomes and enhanced patient care, but may provide novel approaches to training the next generation of reconstructive surgeons. New developments in software and modeling platforms, imaging modalities, tissue engineering, additive manufacturing, and customization of implants are poised to revolutionize the field of reconstructive surgery. The interface between technological advances and reconstructive surgery continues to expand. Additive manufacturing techniques continue to evolve in an effort to improve patient outcomes, decrease operative time, and serve as instructional tools for the training of reconstructive surgeons.

  13. APS Education and Diversity Efforts

    Science.gov (United States)

    Prestridge, Katherine; Hodapp, Theodore

    2015-11-01

    American Physical Society (APS) has a wide range of education and diversity programs and activities, including programs that improve physics education, increase diversity, provide outreach to the public, and impact public policy. We present the latest programs spearheaded by the Committee on the Status of Women in Physics (CSWP), with highlights from other diversity and education efforts. The CSWP is working to increase the fraction of women in physics, understand and implement solutions for gender-specific issues, enhance professional development opportunities for women in physics, and remedy issues that impact gender inequality in physics. The Conferences for Undergraduate Women in Physics, Professional Skills Development Workshops, and our new Professional Skills program for students and postdocs are all working towards meeting these goals. The CSWP also has site visit and conversation visit programs, where department chairs request that the APS assess the climate for women in their departments or facilitate climate discussions. APS also has two significant programs to increase participation by underrepresented minorities (URM). The newest program, the APS National Mentoring Community, is working to provide mentoring to URM undergraduates, and the APS Bridge Program is an established effort that is dramatically increasing the number of URM PhDs in physics.

  14. An opportunity cost model of subjective effort and task performance

    Science.gov (United States)

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775
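
    One way to make this proposal concrete is a toy decision rule in which the felt cost of continuing the current task equals the value of the best alternative use of the same limited, serial executive resources. The numbers below are hypothetical and the rule is only a schematic reading of the verbal account, not the authors' formal model.

      # Toy opportunity-cost reading of subjective effort; hypothetical values,
      # schematic illustration only (not the authors' formal model).
      def subjective_effort(alternative_benefits):
          """Felt effort modelled as the value of the next-best use of the
          same limited executive resources."""
          return max(alternative_benefits, default=0.0)

      def keep_working(current_benefit, alternative_benefits):
          """Persist on the current task only while its benefit exceeds the
          opportunity cost of occupying the executive system."""
          return current_benefit > subjective_effort(alternative_benefits)

      alternatives = [2.0, 3.5]              # value of other possible uses of attention
      for benefit in (5.0, 3.0):
          print(benefit, "->", "continue" if keep_working(benefit, alternatives) else "disengage")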

  15. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  16. Computer aids for plant operators

    International Nuclear Information System (INIS)

    Joly, J.P.

    1992-01-01

    For some time, particularly since the TMI accident, nuclear power plant operators have been aware of the difficulties involved in diagnosing accidents and returning plants to their stable, safe operating mode. There are various possible solutions to these problems: improve control organization during accident situations, rewrite control procedures, integrate safety engineers in shifts, improve control rooms, and implement additional computer aids. The purpose of this presentation is to describe the efforts undertaken by EDF over the last few years in this field

  17. Stretch-sensitive paresis and effort perception in hemiparesis.

    Science.gov (United States)

    Vinti, Maria; Bayle, Nicolas; Hutin, Emilie; Burke, David; Gracies, Jean-Michel

    2015-08-01

    In spastic paresis, stretch applied to the antagonist increases its inappropriate recruitment during the agonist command (spastic co-contraction). It is unknown whether antagonist stretch: (1) also affects agonist recruitment; (2) alters effort perception. We quantified voluntary activation of ankle dorsiflexors, effort perception, and plantar flexor co-contraction during graded dorsiflexion efforts at two gastrocnemius lengths. Eighteen healthy (age 41 ± 13) and 18 hemiparetic (age 54 ± 12) subjects performed light, medium and maximal isometric dorsiflexion efforts with the knee flexed or extended. We determined dorsiflexor torque, Root Mean Square EMG and Agonist Recruitment/Co-contraction Indices (ARI/CCI) from the 500 ms peak voluntary agonist recruitment in a 5-s maximal isometric effort in tibialis anterior, soleus and medial gastrocnemius. Subjects retrospectively reported effort perception on a 10-point visual analog scale. During gastrocnemius stretch in hemiparetic subjects, we observed: (1) a 25 ± 7 % reduction of tibialis anterior voluntary activation (maximum reduction 98 %; knee extended vs knee flexed; p = 0.007, ANOVA); (2) an increase in dorsiflexion effort perception (p = 0.03, ANCOVA). Such changes did not occur in healthy subjects. Effort perception depended on tibialis anterior recruitment only (βARI(TA) = 0.61). In hemiparesis, voluntary ability to recruit agonist motoneurones is impaired--sometimes abolished--by antagonist stretch, a phenomenon defined here as stretch-sensitive paresis. In addition, spastic co-contraction increases effort perception, an additional incentive to evaluate and treat this phenomenon.
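
    The EMG-based indices referred to above can be sketched in simplified form: an RMS envelope is computed in a sliding 500 ms window, its peak during the maximal effort is taken, and a co-contraction index is formed as the ratio of antagonist to agonist activity. The signals below are synthetic and the exact windowing and normalisation used in the study may differ.

      # Simplified RMS-EMG and co-contraction index sketch; synthetic signals,
      # and the study's exact definitions may differ.
      import numpy as np

      FS = 1000                       # sampling rate, Hz
      WIN = int(0.5 * FS)             # 500 ms window

      def peak_rms(emg):
          """Peak of the 500 ms moving RMS of an EMG signal."""
          moving_mean_sq = np.convolve(emg.astype(float) ** 2, np.ones(WIN) / WIN, mode="valid")
          return float(np.sqrt(moving_mean_sq.max()))

      rng = np.random.default_rng(1)
      t = np.arange(0, 5.0, 1.0 / FS)                       # 5 s maximal isometric effort
      tibialis_anterior = rng.normal(0.0, 1.0, t.size)      # agonist EMG, arbitrary units
      soleus = rng.normal(0.0, 0.3, t.size)                 # antagonist EMG, arbitrary units

      agonist_index = peak_rms(tibialis_anterior)
      cocontraction_index = peak_rms(soleus) / agonist_index
      print(f"ARI-like value: {agonist_index:.2f}, CCI-like value: {cocontraction_index:.2f}")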

  18. Termination of prehospital resuscitative efforts

    DEFF Research Database (Denmark)

    Mikkelsen, Søren; Schaffalitzky de Muckadell, Caroline; Binderup, Lars Grassmé

    2017-01-01

BACKGROUND: Discussions on ethical aspects of life-and-death decisions within the hospital are often made in plenary. The prehospital physician, however, may be faced with ethical dilemmas in life-and-death decisions when time-critical decisions to initiate or refrain from resuscitative efforts need to be taken without the possibility to discuss matters with colleagues. Little is known whether these considerations regarding ethical issues in crucial life-and-death decisions are documented prehospitally. This is a review of the ethical considerations documented in the prehospital medical ... life-and-death decision-making in the patient's medical records is required. We suggest that a template be implemented in the prehospital medical records describing the basis for any ethical decisions. This template should contain information regarding the persons involved in the deliberations and notes on ethical ...

  19. Benefits of Subliminal Feedback Loops in Human-Computer Interaction

    OpenAIRE

    Walter Ritter

    2011-01-01

    A lot of efforts have been directed to enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing, before we outline a new approach to add analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evo...

  20. Model Additional Protocol

    International Nuclear Information System (INIS)

    Rockwood, Laura

    2001-01-01

Since the end of the cold war a series of events has changed the circumstances and requirements of the safeguards system. The discovery of a clandestine nuclear weapons program in Iraq, the continuing difficulty in verifying the initial report of the Democratic People's Republic of Korea upon entry into force of their safeguards agreement, and the decision of the South African Government to give up its nuclear weapons program and join the Treaty on the Non-Proliferation of Nuclear Weapons have all played a role in an ambitious effort by IAEA Member States and the Secretariat to strengthen the safeguards system. A major milestone in this effort was reached in May 1997 when the IAEA Board of Governors approved a Model Protocol Additional to Safeguards Agreements. The Model Additional Protocol was negotiated over a period of less than a year by an open-ended committee of the Board involving some 70 Member States and two regional inspectorates. The IAEA is now in the process of negotiating additional protocols, State by State, and implementing them. These additional protocols will provide the IAEA with rights of access to information about all activities related to the use of nuclear material in States with comprehensive safeguards agreements and greatly expanded physical access for IAEA inspectors to confirm or verify this information. In conjunction with this, the IAEA is working on the integration of these measures with those provided for in comprehensive safeguards agreements, with a view to maximizing, within available resources, the effectiveness and efficiency of the implementation of safeguards. Details concerning the Model Additional Protocol are given. (author)

  1. [Limitation of the therapeutic effort].

    Science.gov (United States)

    Herreros, B; Palacios, G; Pacho, E

    2012-03-01

The limitation of the therapeutic effort (LTE) consists in not applying extraordinary or disproportionate therapeutic measures to a patient with a poor life prognosis and/or poor quality of life. There are two types: not initiating certain measures, or withdrawing them once they have been established. A decision on LTE should be based on rigorous criteria, so we make the following proposal. First, it is necessary to know the most relevant details of the case in order to make a decision: the preferences of the patient, the preferences of the family when pertinent, the prognosis (severity), the quality of life, and the distribution of limited resources. Then the decision should be made. In this phase, participatory deliberation should be established to clarify the goal of the intervention. Finally, if it is decided to perform an LTE, it should be decided how to do it. Disproportionate measures that are useless or futile for the intended therapeutic objective should not be initiated (and should be withdrawn if they have already been established). When it has been decided to treat a condition (interim measures), the treatment should be maintained. This complex phase may require stratification of the measures. Finally, the necessary palliative measures should be established. Copyright © 2011 Elsevier España, S.L. All rights reserved.

  2. Mental and physical effort affect vigilance differently

    NARCIS (Netherlands)

    Smit, A.S.; Eling, P.A.T.M.; Hopman, M.T.E.; Coenen, A.M.L.

    2005-01-01

    Both physical and mental effort are thought to affect vigilance. Mental effort is known for its vigilance declining effects, but the effects of physical effort are less clear. This study investigated whether these two forms of effort affect the EEG and subjective alertness differently. Participants

  4. Trust Trust Me (The Additivity)

    OpenAIRE

Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    2017-01-01

Part 4: Trust Metrics; We present a mathematical formulation of a trust metric using a quality and quantity pair. Under a certain assumption, we regard trust as an additive value and define the soundness of a trust computation as not to exceed the total sum. Moreover, we point out the importance of not only soundness of each computed trust but also the stability of the trust computation procedure against changes in trust value assignment. In this setting, we define tru...

  5. Kuwait poised for massive well kill effort

    Energy Technology Data Exchange (ETDEWEB)

    1991-04-08

    This paper reports that full scale efforts to extinguish Kuwait's oil well fires are to begin. The campaign to combat history's worst oil fires, originally expected to begin in mid-March, has been hamstrung by logistical problems, including delays in equipment deliveries caused by damage to Kuwait's infrastructure. Meantime, production from a key field off Kuwait--largely unaffected by the war--is expected to resume in May, but Kuwaiti oil exports will still be hindered by damaged onshore facilities. In addition, Kuwait is lining up equipment and personnel to restore production from its heavily damaged oil fields. Elsewhere in the Persian Gulf, Saudi Arabia reports progress in combating history's worst oil spills but acknowledges a continuing threat.

  6. Perception of effort in Exercise Science: Definition, measurement and perspectives.

    Science.gov (United States)

    Pageaux, Benjamin

    2016-11-01

Perception of effort, also known as perceived exertion or sense of effort, can be described as a cognitive feeling of work associated with voluntary actions. The aim of the present review is to provide an overview of what perception of effort is in Exercise Science. Due to the addition of sensations other than effort in its definition, the neurophysiology of perceived exertion remains poorly understood. As humans have the ability to dissociate effort from other sensations related to physical exercise, the need to use a narrower definition is emphasised. Consequently, a definition and some brief guidelines for its measurement are provided. Finally, an overview of the models present in the literature aiming to explain its neurophysiology, and some perspectives for future research are offered.

  7. DC Control Effort Minimized for Magnetic-Bearing-Supported Shaft

    Science.gov (United States)

    Brown, Gerald V.

    2001-01-01

    A magnetic-bearing-supported shaft may have a number of concentricity and alignment problems. One of these involves the relationship of the position sensors, the centerline of the backup bearings, and the magnetic center of the magnetic bearings. For magnetic bearings with permanent magnet biasing, the average control current for a given control axis that is not bearing the shaft weight will be minimized if the shaft is centered, on average over a revolution, at the magnetic center of the bearings. That position may not yield zero sensor output or center the shaft in the backup bearing clearance. The desired shaft position that gives zero average current can be achieved if a simple additional term is added to the control law. Suppose that the instantaneous control currents from each bearing are available from measurements and can be input into the control computer. If each control current is integrated with a very small rate of accumulation and the result is added to the control output, the shaft will gradually move to a position where the control current averages to zero over many revolutions. This will occur regardless of any offsets of the position sensor inputs. At that position, the average control effort is minimized in comparison to other possible locations of the shaft. Nonlinearities of the magnetic bearing are minimized at that location as well.
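    The control-law change described above amounts to adding a slowly accumulating integral of the measured control current to the controller output, so the shaft drifts toward the point where the average current is zero. The Python sketch below illustrates that idea only; the PD structure, gains, and variable names are assumptions for illustration, not the actual controller described in the record.

```python
# Minimal sketch (hypothetical gains and structure): augmenting a magnetic-bearing
# control law with a slow integral of the measured control current so that the
# shaft settles where the average control current is zero.

class BearingAxisController:
    def __init__(self, kp, kd, k_i_current, dt):
        self.kp = kp                      # proportional gain on position error
        self.kd = kd                      # derivative gain on shaft velocity
        self.k_i_current = k_i_current    # very small accumulation rate
        self.dt = dt
        self.current_integral = 0.0       # accumulated control-current history

    def update(self, position_error, velocity, measured_current):
        # Conventional PD term referenced to the (possibly offset) sensor zero.
        pd_term = self.kp * position_error + self.kd * velocity
        # Slowly accumulate the measured control current; adding this term to
        # the output nudges the operating point until the current averages to
        # zero over many revolutions, i.e. the shaft sits at the magnetic center.
        self.current_integral += self.k_i_current * measured_current * self.dt
        return pd_term + self.current_integral
```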

  8. Trust and Reciprocity: Are Effort and Money Equivalent?

    Science.gov (United States)

    Vilares, Iris; Dam, Gregory; Kording, Konrad

    2011-01-01

    Trust and reciprocity facilitate cooperation and are relevant to virtually all human interactions. They are typically studied using trust games: one subject gives (entrusts) money to another subject, which may return some of the proceeds (reciprocate). Currently, however, it is unclear whether trust and reciprocity in monetary transactions are similar in other settings, such as physical effort. Trust and reciprocity of physical effort are important as many everyday decisions imply an exchange of physical effort, and such exchange is central to labor relations. Here we studied a trust game based on physical effort and compared the results with those of a computationally equivalent monetary trust game. We found no significant difference between effort and money conditions in both the amount trusted and the quantity reciprocated. Moreover, there is a high positive correlation in subjects' behavior across conditions. This suggests that trust and reciprocity may be character traits: subjects that are trustful/trustworthy in monetary settings behave similarly during exchanges of physical effort. Our results validate the use of trust games to study exchanges in physical effort and to characterize inter-subject differences in trust and reciprocity, and also suggest a new behavioral paradigm to study these differences. PMID:21364931

  9. Minimal-effort planning of active alignment processes for beam-shaping optics

    Science.gov (United States)

    Haag, Sebastian; Schranner, Matthias; Müller, Tobias; Zontar, Daniel; Schlette, Christian; Losch, Daniel; Brecher, Christian; Roßmann, Jürgen

    2015-03-01

    In science and industry, the alignment of beam-shaping optics is usually a manual procedure. Many industrial applications utilizing beam-shaping optical systems require more scalable production solutions and therefore effort has been invested in research regarding the automation of optics assembly. In previous works, the authors and other researchers have proven the feasibility of automated alignment of beam-shaping optics such as collimation lenses or homogenization optics. Nevertheless, the planning efforts as well as additional knowledge from the fields of automation and control required for such alignment processes are immense. This paper presents a novel approach of planning active alignment processes of beam-shaping optics with the focus of minimizing the planning efforts for active alignment. The approach utilizes optical simulation and the genetic programming paradigm from computer science for automatically extracting features from a simulated data basis with a high correlation coefficient regarding the individual degrees of freedom of alignment. The strategy is capable of finding active alignment strategies that can be executed by an automated assembly system. The paper presents a tool making the algorithm available to end-users and it discusses the results of planning the active alignment of the well-known assembly of a fast-axis collimator. The paper concludes with an outlook on the transferability to other use cases such as application specific intensity distributions which will benefit from reduced planning efforts.
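    As a rough illustration of the feature-extraction idea in this record, the sketch below ranks candidate features computed from simulated measurement data by their correlation with a single alignment degree of freedom. It uses plain correlation ranking as a simplified stand-in for the genetic-programming search described in the paper, and all feature names, units, and magnitudes are hypothetical.

```python
# Hedged sketch (synthetic data): rank simulated candidate features by their
# correlation with one alignment degree of freedom (here, lens decentration).
import numpy as np

rng = np.random.default_rng(2)
n = 300
dof_offset = rng.uniform(-50e-6, 50e-6, n)   # hypothetical decentration in metres

features = {
    "beam_centroid_x": 0.9 * dof_offset + rng.normal(scale=5e-6, size=n),
    "beam_width": np.abs(dof_offset) + rng.normal(scale=5e-6, size=n),
    "peak_intensity": rng.normal(size=n),     # deliberately uninformative feature
}

# Sort features by the absolute value of their correlation with the DOF.
ranked = sorted(features.items(),
                key=lambda kv: abs(np.corrcoef(kv[1], dof_offset)[0, 1]),
                reverse=True)
for name, values in ranked:
    print(f"{name}: r = {np.corrcoef(values, dof_offset)[0, 1]:+.2f}")
```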

  10. Progress in computational toxicology.

    Science.gov (United States)

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition various publications have been highlighted that use machine learning methods. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that have been increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross validation data. Considerable progress has been made in computational toxicology in a decade in both model development and availability of larger scale or 'big data' models. The future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
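    The Bayesian-versus-SVM comparison mentioned in this record can be reproduced in outline with standard machine-learning tooling. The snippet below is a generic sketch using synthetic stand-in data rather than any of the toxicology datasets from the review; the descriptors, labels, and model settings are placeholders.

```python
# Illustrative sketch only (synthetic data, hypothetical descriptors): comparing a
# Bayesian-style classifier and an SVM by cross-validation, as described above.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))                    # stand-in for molecular descriptors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # stand-in toxicity endpoint labels

for name, model in [("Bayesian (Gaussian NB)", GaussianNB()),
                    ("SVM (RBF kernel)", SVC(kernel="rbf", C=1.0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {scores.mean():.3f} +/- {scores.std():.3f}")
```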

  11. Using a cloud to replenish parched groundwater modeling efforts.

    Science.gov (United States)

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  13. Measuring collections effort improves cash performance.

    Science.gov (United States)

    Shutts, Joe

    2009-09-01

    Having a satisfied work force can lead to an improved collections effort. Hiring the right people and training them ensures employee engagement. Measuring collections effort and offering incentives is key to revenue cycle success.

  14. Supporting Students as Scientists: One Mission's Efforts

    Science.gov (United States)

    Taylor, J.; Chambers, L. H.; Trepte, C. R.

    2012-12-01

    NASA's CALIPSO satellite mission provides an array of opportunities for teachers, students, and the general public. In developing our latest plan for education and public outreach, CALIPSO focused on efforts that would support students as scientists. CALIPSO EPO activities are aimed at inspiring young scientists through multiple avenues of potential contact, including: educator professional development, student-scientist mentoring, curriculum resource development, and public outreach through collaborative mission efforts. In this session, we will explore how these avenues complement one another and take a closer look at the development of the educator professional development activities. As part of CALIPSO's EPO efforts, we have developed the GLOBE Atmosphere Investigations Programs (AIP). The program encourages students to engage in authentic science through research on the atmosphere. The National Research Council (NRC) has emphasized the importance of teaching scientific inquiry in the National Science Education Standards (1996, 2000) and scientific practice in the recent Framework for K-12 Science Education (2011). In order to encourage student-centered science inquiry, teacher training utilizing GLOBE Atmosphere Investigations and GLOBE's Student Research Process are provided to middle and high school teachers to assist them in incorporating real scientific investigations into their classroom. Through participation in the program, teachers become a part of GLOBE (Global Learning and Observations to Benefit the Environment) - an international community of teachers, students, and scientists studying environmental science in over 24,000 schools around the world. The program uses NASA's satellites and the collection of atmosphere data by students to provide an engaging science learning experience for the students, and teachers. The GLOBE Atmosphere Investigations program offers year-long support to both teachers and students through direct involvement with NASA

  15. Indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced computed tomography: Assessment of the additional diagnostic value of contrast-enhanced ultrasound in the non-cirrhotic liver

    International Nuclear Information System (INIS)

    Quaia, Emilio; De Paoli, Luca; Angileri, Roberta; Cabibbo, Biagio; Cova, Maria Assunta

    2014-01-01

Objective: To assess the additional diagnostic value of contrast-enhanced ultrasound (CEUS) in the characterization of indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced computed tomography (CT). Methods: Fifty-five solid hepatic lesions (1–4 cm in diameter) in 46 non-cirrhotic patients (26 female, 20 male; age ± SD, 55 ± 10 years) underwent CEUS after being detected on contrast-enhanced CT which was considered as non-diagnostic after on-site analysis. Two blinded independent readers assessed CT and CEUS scans and were asked to classify each lesion retrospectively as malignant or benign based on reference diagnostic criteria for the different hepatic lesion histotypes. Diagnostic accuracy and confidence (area under the ROC curve, Az) were assessed by using gadobenate dimeglumine-enhanced magnetic resonance (MR) imaging (n = 30 lesions), histology (n = 7 lesions), or US follow-up (n = 18 lesions) as the reference standards. Results: Final diagnoses included 29 hemangiomas, 3 focal nodular hyperplasias, 1 hepatocellular adenoma, and 22 metastases. The additional review of CEUS after CT images improved significantly (P < .05) the diagnostic accuracy (before vs after CEUS review = 49% [20/55] vs 89% [49/55] – reader 1 and 43% [24/55] vs 92% [51/55] – reader 2) and confidence (Az, 95% confidence intervals before vs after CEUS review = .773 [.652–.895] vs .997 [.987–1] – reader 1 and .831 [.724–.938] vs .998 [.992–1] – reader 2). Conclusions: CEUS improved the characterization of indeterminate solid hepatic lesions identified on non-diagnostic contrast-enhanced CT by identifying some specific contrast enhancement patterns.

  16. Application of a General Computer Algorithm Based on the Group-Additivity Method for the Calculation of Two Molecular Descriptors at Both Ends of Dilution: Liquid Viscosity and Activity Coefficient in Water at Infinite Dilution

    Directory of Open Access Journals (Sweden)

    Rudolf Naef

    2017-12-01

Full Text Available The application of a commonly used computer algorithm based on the group-additivity method for the calculation of the liquid viscosity coefficient at 293.15 K and the activity coefficient at infinite dilution in water at 298.15 K of organic molecules is presented. The method is based on the complete breakdown of the molecules into their constituting atoms, further subdividing them by their immediate neighborhood. A fast Gauss–Seidel fitting method using experimental data from literature is applied for the calculation of the atom groups' contributions. Plausibility tests have been carried out on each of the calculations using a ten-fold cross-validation procedure which confirms the excellent predictive quality of the method. The goodness of fit (Q2) and the standard deviation (σ) of the cross-validation calculations for the viscosity coefficient, expressed as log(η), were 0.9728 and 0.11, respectively, for 413 test molecules, and for the activity coefficient log(γ∞) the corresponding values were 0.9736 and 0.31, respectively, for 621 test compounds. The present approach has proven its versatility in that it enabled the simultaneous evaluation of the liquid viscosity of normal organic compounds as well as of ionic liquids.
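    The group-additivity calculation itself reduces to a linear model in the atom-group counts. The sketch below shows the idea with a handful of hypothetical groups and made-up property values; the paper's actual atom-group definitions and its Gauss–Seidel fitting procedure are not reproduced here, and an ordinary least-squares solve stands in for the fit.

```python
# Minimal sketch of the group-additivity idea (hypothetical groups and data): a
# property is modeled as the sum of fitted contributions of atom groups, with the
# contributions obtained from a fit to molecules of known property value.
import numpy as np

groups = ["CH3", "CH2", "OH"]                  # hypothetical atom groups
# Rows: molecules; columns: how often each group occurs in the molecule.
counts = np.array([[2, 1, 0],                  # propane-like
                   [2, 2, 0],                  # butane-like
                   [1, 1, 1]])                 # ethanol-like
log_eta = np.array([-0.55, -0.46, 0.78])       # hypothetical log(viscosity) values

# Fit group contributions (a least-squares solve stands in for the paper's
# iterative Gauss-Seidel fit).
contrib, *_ = np.linalg.lstsq(counts, log_eta, rcond=None)

# Predict for a new molecule containing 1x CH3, 3x CH2 and 1x OH.
new_counts = np.array([1, 3, 1])
print("predicted log(eta):", new_counts @ contrib)
```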

  17. Computational design of molecules for an all-quinone redox flow battery

    Science.gov (United States)

    Er, Süleyman; Suh, Changwon; Marshak, Michael P.

    2015-01-01

    Inspired by the electron transfer properties of quinones in biological systems, we recently showed that quinones are also very promising electroactive materials for stationary energy storage applications. Due to the practically infinite chemical space of organic molecules, the discovery of additional quinones or other redox-active organic molecules for energy storage applications is an open field of inquiry. Here, we introduce a high-throughput computational screening approach that we applied to an accelerated study of a total of 1710 quinone (Q) and hydroquinone (QH2) (i.e., two-electron two-proton) redox couples. We identified the promising candidates for both the negative and positive sides of organic-based aqueous flow batteries, thus enabling an all-quinone battery. To further aid the development of additional interesting electroactive small molecules we also provide emerging quantitative structure-property relationships. PMID:29560173

  18. Computational atomic and nuclear physics

    International Nuclear Information System (INIS)

    Bottcher, C.; Strayer, M.R.; McGrory, J.B.

    1990-01-01

    The evolution of parallel processor supercomputers in recent years provides opportunities to investigate in detail many complex problems, in many branches of physics, which were considered to be intractable only a few years ago. But to take advantage of these new machines, one must have a better understanding of how the computers organize their work than was necessary with previous single processor machines. Equally important, the scientist must have this understanding as well as a good understanding of the structure of the physics problem under study. In brief, a new field of computational physics is evolving, which will be led by investigators who are highly literate both computationally and physically. A Center for Computationally Intensive Problems has been established with the collaboration of the University of Tennessee Science Alliance, Vanderbilt University, and the Oak Ridge National Laboratory. The objective of this Center is to carry out forefront research in computationally intensive areas of atomic, nuclear, particle, and condensed matter physics. An important part of this effort is the appropriate training of students. An early effort of this Center was to conduct a Summer School of Computational Atomic and Nuclear Physics. A distinguished faculty of scientists in atomic, nuclear, and particle physics gave lectures on the status of present understanding of a number of topics at the leading edge in these fields, and emphasized those areas where computational physics was in a position to make a major contribution. In addition, there were lectures on numerical techniques which are particularly appropriate for implementation on parallel processor computers and which are of wide applicability in many branches of science

  19. 1996 Design effort for IFMIF HEBT

    International Nuclear Information System (INIS)

    Blind, B.

    1997-01-01

The paper details the 1996 design effort for the IFMIF HEBT. Following a brief overview, it lists the primary requirements for the beam at the target, describes the design approach and design tools used, introduces the beamline modules, gives the results achieved with the design at this stage, points out possible improvements and gives the names and computer locations of the TRACE3-D and PARMILA files that sum up the design work. The design does not fully meet specifications with regard to the flatness of the distribution at the target. With further work, including if necessary some backup options, the flatness specifications may be realized. It is not proposed that the specifications, namely flatness to ±5% and higher-intensity ridges that are no more than 15% above average, be changed at this time. The design also does not meet the requirement that the modules of all beamlines should operate at the same settings. However, the goal of using identical components and operational procedures has been met and only minor retuning is needed to produce very similar beam distributions from all beamlines. Significant further work is required in the following areas: TRACE3-D designs and PARMILA runs must be made for the beams coming from accelerators No. 3 and No. 4. Transport of 30-MeV and 35-MeV beams to the targets and beam dump must be studied. Comprehensive error studies must be made. These must result in tolerance specifications and may require design iterations. Detailed interfacing with target-spot instrumentation is required. This instrumentation must be able to check all aspects of the specifications

  20. Separate valuation subsystems for delay and effort decision costs.

    Science.gov (United States)

    Prévost, Charlotte; Pessiglione, Mathias; Météreau, Elise; Cléry-Melin, Marie-Laure; Dreher, Jean-Claude

    2010-10-20

    Decision making consists of choosing among available options on the basis of a valuation of their potential costs and benefits. Most theoretical models of decision making in behavioral economics, psychology, and computer science propose that the desirability of outcomes expected from alternative options can be quantified by utility functions. These utility functions allow a decision maker to assign subjective values to each option under consideration by weighting the likely benefits and costs resulting from an action and to select the one with the highest subjective value. Here, we used model-based neuroimaging to test whether the human brain uses separate valuation systems for rewards (erotic stimuli) associated with different types of costs, namely, delay and effort. We show that humans devalue rewards associated with physical effort in a strikingly similar fashion to those they devalue that are associated with delays, and that a single computational model derived from economics theory can account for the behavior observed in both delay discounting and effort discounting. However, our neuroimaging data reveal that the human brain uses distinct valuation subsystems for different types of costs, reflecting in opposite fashion delayed reward and future energetic expenses. The ventral striatum and the ventromedial prefrontal cortex represent the increasing subjective value of delayed rewards, whereas a distinct network, composed of the anterior cingulate cortex and the anterior insula, represent the decreasing value of the effortful option, coding the expected expense of energy. Together, these data demonstrate that the valuation processes underlying different types of costs can be fractionated at the cerebral level.
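    Since this record reports that a single discounting model accounts for both delay and effort devaluation, a minimal sketch of that idea is shown below. The hyperbolic form and the parameter values are assumptions chosen for illustration; the paper's exact model specification is not reproduced here.

```python
# Hedged illustration (hypothetical parameters): one discounting function applied
# to both cost types, with a separate sensitivity parameter k per cost type.
def discounted_value(reward, cost, k):
    """Subjective value of a reward devalued by a cost (delay or effort)."""
    return reward / (1.0 + k * cost)

print(discounted_value(10.0, cost=30.0, k=0.05))   # reward delayed by 30 s
print(discounted_value(10.0, cost=8.0, k=0.25))    # reward requiring 8 units of effort
```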

  1. Manager's effort and endogenous economic discrimination

    Directory of Open Access Journals (Sweden)

    Jaime Orrillo

    2004-09-01

Full Text Available Assume a labor supply consisting of two types of workers, 1 and 2. Both workers are equally productive and exhibit supply functions with the same elasticity. We consider a firm (entrepreneur or shareholders) that is competitive in the output market and monopsonistic in input markets. The firm uses the services of a manager who has high human capital and whose wage is given by the market. It is supposed that the manager does not like to work with one type of worker, say type 1. If we allow the manager's effort to be an additional input without any extra cost for the firm (in addition to his salary), then the firm's pricing decision will be different for the two types of workers. That is, there will be a wage differential and therefore endogenous economic discrimination in the labor markets.

  2. Pocket money and child effort at school

    OpenAIRE

    François-Charles Wolff; Christine Barnet-Verzat

    2008-01-01

In this paper, we study the relationship between the provision of parental pocket money and the level of effort undertaken by the child at school. Under altruism, an increased amount of parental transfer should reduce the child's effort. Our empirical analysis is based on a French data set including about 1,400 parent-child pairs. We find that children do not undertake less effort when their parents are more generous.

  3. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  4. Deconstructing Hub Drag. Part 2. Computational Development and Anaysis

    Science.gov (United States)

    2013-09-30

leveraged a Vertical Lift Consortium (VLC)-funded hub drag scaling research effort. To confirm this objective, correlations are performed with the ... Technology™ Demonstrator aircraft using an unstructured computational solver. These simpler faired elliptical geometries can prove to be challenging ... possible. However, additional funding was obtained from the Vertical Lift Consortium (VLC) to perform this study. This analysis is documented in

  5. Incentive Design and Mis-Allocated Effort

    OpenAIRE

    Schnedler, Wendelin

    2013-01-01

    Incentives often distort behavior: they induce agents to exert effort but this effort is not employed optimally. This paper proposes a theory of incentive design allowing for such distorted behavior. At the heart of the theory is a trade-off between getting the agent to exert effort and ensuring that this effort is used well. The theory covers various moral-hazard models, ranging from traditional single-task to multi-task models. It also provides -for the first time- a formalization and proof...

  6. Teardrop bladder: additional considerations

    International Nuclear Information System (INIS)

    Wechsler, R.J.; Brennan, R.E.

    1982-01-01

Nine cases of teardrop bladder (TDB) seen at excretory urography are presented. In some of these patients, the iliopsoas muscles were at the upper limit of normal in size, and additional evaluation of the perivesical structures with computed tomography (CT) was necessary. CT demonstrated only hypertrophied muscles with or without perivesical fat. The psoas muscles and pelvic width were measured in 8 patients and compared with the measurements of a control group of males without TDB. Patients with TDB had large iliopsoas muscles and narrow pelves compared with the control group. The psoas muscle width/pelvic width ratio was significantly greater (p < 0.0005) in patients with TDB than in the control group, with values of 1.04 ± 0.05 and 0.82 ± 0.09, respectively. It is concluded that TDB is not an uncommon normal variant in black males. Both iliopsoas muscle hypertrophy and a narrow pelvis are factors that predispose a patient to TDB

  7. VA Disability Benefits: Additional Planning Would Enhance Efforts to Improve the Timeliness of Appeals Decisions

    Science.gov (United States)

    2017-03-01

must manually review and correct most incoming cases due to issues with labeling, mismatched dates, and missing files. Via an internal study, VA ... individuals acclimate to their jobs, and factored this into the modeling assumptions used to project the number of Board staff needed. ...

  8. Control and Effort Costs Influence the Motivational Consequences of Choice

    Directory of Open Access Journals (Sweden)

    Holly Sullivan-Toole

    2017-05-01

    Full Text Available The act of making a choice, apart from any outcomes the choice may yield, has, paradoxically, been linked to both the enhancement and the detriment of intrinsic motivation. Research has implicated two factors in potentially mediating these contradictory effects: the personal control conferred by a choice and the costs associated with a choice. Across four experiments, utilizing a physical effort task disguised as a simple video game, we systematically varied costs across two levels of physical effort requirements (Low-Requirement, High-Requirement and control over effort costs across three levels of choice (Free-Choice, Restricted-Choice, and No-Choice to disambiguate how these factors affect the motivational consequences of choosing within an effortful task. Together, our results indicated that, in the face of effort requirements, illusory control alone may not sufficiently enhance perceptions of personal control to boost intrinsic motivation; rather, the experience of actual control may be necessary to overcome effort costs and elevate performance. Additionally, we demonstrated that conditions of illusory control, while otherwise unmotivating, can through association with the experience of free-choice, be transformed to have a positive effect on motivation.

  9. The influence of music on mental effort and driving performance.

    Science.gov (United States)

    Ünal, Ayça Berfu; Steg, Linda; Epstude, Kai

    2012-09-01

The current research examined the influence of loud music on driving performance, and whether mental effort mediated this effect. Participants (N=69) drove in a driving simulator either with or without listening to music. In order to test whether music would have similar effects on driving performance in different situations, we manipulated the simulated traffic environment such that the driving context consisted of both complex and monotonous driving situations. In addition, we systematically kept track of drivers' mental load by making the participants verbally report their mental effort at certain moments while driving. We found that listening to music increased mental effort while driving, irrespective of whether the driving situation was complex or monotonous, providing support to the general assumption that music can be a distracting auditory stimulus while driving. However, drivers who listened to music performed as well as the drivers who did not listen to music, indicating that music did not impair their driving performance. Importantly, the increases in mental effort while listening to music pointed out that drivers try to regulate their mental effort as a cognitive compensatory strategy to deal with task demands. Interestingly, we observed significant improvements in driving performance in two of the driving situations. It seems that mental effort might mediate the effect of music on driving performance in situations requiring sustained attention. Other process variables, such as arousal and boredom, should also be incorporated into study designs in order to reveal more about how music affects driving. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Time preferences, study effort, and academic performance

    NARCIS (Netherlands)

    Non, J.A.; Tempelaar, D.T.

    2014-01-01

    We analyze the relation between time preferences, study effort, and academic performance among first-year Business and Economics students. Time preferences are measured by stated preferences for an immediate payment over larger delayed payments. Data on study efforts are derived from an electronic

  11. Interests, Effort, Achievement and Vocational Preference.

    Science.gov (United States)

    Sjoberg, L.

    1984-01-01

    Relationships between interest in natural sciences and technology and perceived ability, success, and invested effort were studied in Swedish secondary school students. Interests were accounted for by logical orientation and practical value. Interests and grades were strongly correlated, but correlations between interests and effort and vocational…

  12. Dopamine and Effort-Based Decision Making

    Directory of Open Access Journals (Sweden)

    Irma Triasih Kurniawan

    2011-06-01

    Full Text Available Motivational theories of choice focus on the influence of goal values and strength of reinforcement to explain behavior. By contrast relatively little is known concerning how the cost of an action, such as effort expended, contributes to a decision to act. Effort-based decision making addresses how we make an action choice based on an integration of action and goal values. Here we review behavioral and neurobiological data regarding the representation of effort as action cost, and how this impacts on decision making. Although organisms expend effort to obtain a desired reward there is a striking sensitivity to the amount of effort required, such that the net preference for an action decreases as effort cost increases. We discuss the contribution of the neurotransmitter dopamine (DA towards overcoming response costs and in enhancing an animal’s motivation towards effortful actions. We also consider the contribution of brain structures, including the basal ganglia (BG and anterior cingulate cortex (ACC, in the internal generation of action involving a translation of reward expectation into effortful action.

  13. Listening Effort With Cochlear Implant Simulations

    NARCIS (Netherlands)

    Pals, Carina; Sarampalis, Anastasios; Başkent, Deniz

    2013-01-01

    Purpose: Fitting a cochlear implant (CI) for optimal speech perception does not necessarily optimize listening effort. This study aimed to show that listening effort may change between CI processing conditions for which speech intelligibility remains constant. Method: Nineteen normal-hearing

  14. Effort and Selection Effects of Incentive Contracts

    NARCIS (Netherlands)

    Bouwens, J.F.M.G.; van Lent, L.A.G.M.

    2003-01-01

We show that the improved effort of employees associated with incentive contracts depends on the properties of the performance measures used in the contract. We also find that the power of incentives in the contract is only indirectly related to any improved employee effort. High powered incentive

  15. The Effect of Age on Listening Effort

    Science.gov (United States)

    Degeest, Sofie; Keppler, Hannah; Corthals, Paul

    2015-01-01

    Purpose: The objective of this study was to investigate the effect of age on listening effort. Method: A dual-task paradigm was used to evaluate listening effort in different conditions of background noise. Sixty adults ranging in age from 20 to 77 years were included. A primary speech-recognition task and a secondary memory task were performed…

  16. Career Opportunities in Computer Graphics.

    Science.gov (United States)

    Langer, Victor

    1983-01-01

    Reviews the impact of computer graphics on industrial productivity. Details the computer graphics technician curriculum at Milwaukee Area Technical College and the cooperative efforts of business and industry to fund and equip the program. (SK)

  17. Low-effort thought promotes political conservatism.

    Science.gov (United States)

    Eidelman, Scott; Crandall, Christian S; Goodman, Jeffrey A; Blanchar, John C

    2012-06-01

    The authors test the hypothesis that low-effort thought promotes political conservatism. In Study 1, alcohol intoxication was measured among bar patrons; as blood alcohol level increased, so did political conservatism (controlling for sex, education, and political identification). In Study 2, participants under cognitive load reported more conservative attitudes than their no-load counterparts. In Study 3, time pressure increased participants' endorsement of conservative terms. In Study 4, participants considering political terms in a cursory manner endorsed conservative terms more than those asked to cogitate; an indicator of effortful thought (recognition memory) partially mediated the relationship between processing effort and conservatism. Together these data suggest that political conservatism may be a process consequence of low-effort thought; when effortful, deliberate thought is disengaged, endorsement of conservative ideology increases.

  18. Additively Manufactured Ceramic Rocket Engine Components

    Data.gov (United States)

    National Aeronautics and Space Administration — HRL Laboratories, LLC, with Vector Space Systems (VSS) as subcontractor, has a 24-month effort to develop additive manufacturing technology for reinforced ceramic...

  19. A future for computational fluid dynamics at CERN

    CERN Document Server

    Battistin, M

    2005-01-01

Computational Fluid Dynamics (CFD) is the analysis of fluid flow, heat transfer and associated phenomena in physical systems using computers. CFD has been used at CERN since 1993 by the TS-CV group to solve thermo-fluid related problems, particularly during the development, design and construction phases of the LHC experiments. Computer models based on CFD techniques can be employed to reduce the effort required for prototype testing, saving not only time and money but offering possibilities for additional investigations and design optimisation. The development of a more efficient support team at CERN depends on two important factors: available computing power and experienced engineers. Available computer power is the limiting resource of CFD. Only the recent increase in computer power has allowed important high-tech and industrial applications. The computing Grid is already (OpenLab at CERN), and will increasingly be, the natural environment for CFD science. At CERN, CFD activities have been developed by...

  20. Programming effort analysis of the ELLPACK language

    Science.gov (United States)

    Rice, J. R.

    1978-01-01

    ELLPACK is a problem statement language and system for elliptic partial differential equations which is implemented by a FORTRAN preprocessor. ELLPACK's principal purpose is as a tool for the performance evaluation of software. However, it is used here as an example with which to study the programming effort required for problem solving. It is obvious that problem statement languages can reduce programming effort tremendously; the goal is to quantify this somewhat. This is done by analyzing the lengths and effort (as measured by Halstead's software science technique) of various approaches to solving these problems.
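    Halstead's software-science effort measure referenced in this record is computable directly from operator and operand counts. The sketch below shows the standard formulas with illustrative counts; it is not tied to any specific ELLPACK program, and the numbers are placeholders.

```python
# Sketch of Halstead's "software science" effort measure, computed from
# operator/operand counts (illustrative counts only).
import math

def halstead_effort(n1, n2, N1, N2):
    """n1/n2: distinct operators/operands; N1/N2: total occurrences of each."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)   # program volume V
    difficulty = (n1 / 2.0) * (N2 / n2)       # difficulty D
    return difficulty * volume                 # effort E = D * V

print(halstead_effort(n1=15, n2=25, N1=80, N2=60))
```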

  1. Pitfalls in Designing Zero-Effort Deauthentication: Opportunistic Human Observation Attacks

    OpenAIRE

    Huhta, O.; Shrestha, P.; Udar, S.; Juuti, M.; Saxena, N.; Asokan, N.

    2015-01-01

Deauthentication is an important component of any authentication system. The widespread use of computing devices in daily life has underscored the need for zero-effort deauthentication schemes. However, the quest for eliminating user effort may lead to hidden security flaws in the authentication schemes. As a case in point, we investigate a prominent zero-effort deauthentication scheme, called ZEBRA, which provides an interesting and a useful solution to a difficult problem ...

  2. Illinois highway materials sustainability efforts of 2015.

    Science.gov (United States)

    2016-08-01

    This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2015. This report meets the requirements of Illinois Publ...

  3. Illinois highway materials sustainability efforts of 2014.

    Science.gov (United States)

    2015-08-01

    This report presents the 2014 sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling reclaimed materials in highway construction. This report meets the requirements of Illinois : Public Act 097-0314 by documenting I...

  4. Illinois highway materials sustainability efforts of 2016.

    Science.gov (United States)

    2017-07-04

    This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2016. This report meets the requirements of Illinois Publ...

  5. Illinois highway materials sustainability efforts of 2013.

    Science.gov (United States)

    2014-08-01

    This report presents the sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling and reclaiming materials for use in highway construction. This report meets the requirements of : Illinois Public Act 097-0314 by docum...

  6. IAEA Patient Protection Effort Reaches Key Milestone

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: An International Atomic Energy Agency (IAEA) effort to help people track their radiation exposure from medical procedures achieved a significant milestone this week. The Agency received the final approval from a group of medical oversight organizations for the 'Joint Position Statement on the IAEA Patient Radiation Exposure Tracking', a set of principles to guide patient protection efforts at the sub-national, national, and international level. The joint statement endorses the IAEA's three-year-old Smart Card/SmartRadTrack project, which aims to help nations develop systems to track medical radiation procedures and radiation doses. The statement has been agreed by the World Health Organization (WHO), the U.S. Food and Drug Administration (FDA), the European Society of Radiology (ESR), the International Organization for Medical Physics (IOMP), the International Society of Radiographers and Radiological Technologists (ISRRT), and the Conference of Radiation Control Program Directors, USA (CRCPD). 'This system is critical if the medical community is going to keep patients safe when they are being referred for more and more diagnostic scans. These scans, over the years, are made using more and more powerful machines', said Madan Rehani, Radiation Safety Specialist in the IAEA's Radiation Protection of Patients Unit. 'The tracking system will draw doctors' attention to previous radiological examinations, both in terms of clinical information and radiation dose and thus help them assess whether the 11th or 20th CT scan is really appropriate, whether it will do more good than harm.' Advances in radiation-based diagnostic technologies, such as the CT scan, have led to patients receiving such procedures more frequently. The convenience of CT with the added advantage of increased information has resulted in increased usage to the point that there are instances of patients getting tens of CT scans in a few years, not all of which may be justified, or getting CT

  7. POEM: Identifying joint additive effects on regulatory circuits

    Directory of Open Access Journals (Sweden)

    Maya eBotzman

    2016-04-01

Full Text Available Motivation: Expression Quantitative Trait Locus (eQTL) mapping tackles the problem of identifying variation in DNA sequence that has an effect on the transcriptional regulatory network. Major computational efforts are aimed at characterizing the joint effects of several eQTLs acting in concert to govern the expression of the same genes. Yet, progress towards a comprehensive prediction of such joint effects is limited. For example, existing eQTL methods commonly discover interacting loci affecting the expression levels of a module of co-regulated genes. Such ‘modularization’ approaches, however, are focused on epistatic relations and thus have limited utility for the case of additive (non-epistatic) effects. Results: Here we present POEM (Pairwise effect On Expression Modules), a methodology for identifying pairwise eQTL effects on gene modules. POEM is specifically designed to achieve high performance in the case of additive joint effects. We applied POEM to transcription profiles measured in bone marrow-derived dendritic cells across a population of genotyped mice. Our study reveals widespread additive, trans-acting pairwise effects on gene modules, characterizes their organizational principles, and highlights high-order interconnections between modules within the immune signaling network. These analyses elucidate the central role of additive pairwise effects in regulatory circuits, and provide computational tools for future investigations into the interplay between eQTLs. Availability: The software described in this article is available at csgi.tau.ac.il/POEM/.
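    As a minimal illustration of an additive (non-epistatic) pairwise model of the kind this record targets, the sketch below fits module expression against two loci with no interaction term, on synthetic data. This is a generic regression sketch, not the POEM algorithm itself; the variable names, genotype coding, and effect sizes are hypothetical.

```python
# Illustrative sketch (synthetic data): a joint additive model of two loci on the
# mean expression of a gene module, with no epistatic (interaction) term.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 150
snp1 = rng.integers(0, 3, n)            # genotypes coded 0/1/2
snp2 = rng.integers(0, 3, n)
module_expr = 0.4 * snp1 + 0.3 * snp2 + rng.normal(scale=1.0, size=n)

X = sm.add_constant(np.column_stack([snp1, snp2]))   # additive model, no interaction
fit = sm.OLS(module_expr, X).fit()
print("coefficients:", fit.params)
print("p-values:    ", fit.pvalues)
```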

  8. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  9. The (cost-)effectiveness of a lifestyle physical activity intervention in addition to a work style intervention on the recovery from neck and upper limb symptoms in computer workers

    NARCIS (Netherlands)

    Bernaards, C.M.; Ariëns, G.A.M.; Hildebrandt, V.H.

    2006-01-01

    Background: Neck and upper limb symptoms are frequently reported by computer workers. Work style interventions are most commonly used to reduce work-related neck and upper limb symptoms but lifestyle physical activity interventions are becoming more popular to enhance workers health and reduce

  10. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  11. The role of cognitive effort in subjective reward devaluation and risky decision-making.

    Science.gov (United States)

    Apps, Matthew A J; Grima, Laura L; Manohar, Sanjay; Husain, Masud

    2015-11-20

    Motivation is underpinned by cost-benefit valuations where costs-such as physical effort or outcome risk-are subjectively weighed against available rewards. However, in many environments risks pertain not to the variance of outcomes, but to variance in the possible levels of effort required to obtain rewards (effort risks). Moreover, motivation is often guided by the extent to which cognitive-not physical-effort devalues rewards (effort discounting). Yet, very little is known about the mechanisms that underpin the influence of cognitive effort risks or discounting on motivation. We used two cost-benefit decision-making tasks to probe subjective sensitivity to cognitive effort (number of shifts of spatial attention) and to effort risks. Our results show that shifts of spatial attention when monitoring rapidly presented visual stimuli are perceived as effortful and devalue rewards. Additionally, most people are risk-averse, preferring safe, known amounts of effort over risky offers. However, there was no correlation between their effort and risk sensitivity. We show for the first time that people are averse to variance in the possible amount of cognitive effort to be exerted. These results suggest that cognitive effort sensitivity and risk sensitivity are underpinned by distinct psychological and neurobiological mechanisms.

  12. ESA NEOCC effort to eliminate high Palermo Scale virtual impactors

    Science.gov (United States)

    Micheli, M.; Koschny, D.; Hainaut, O.; Bernardi, F.

    2014-07-01

At the moment of this writing about 4 % of the known near-Earth objects are known to have at least one future close approach scenario with a non-negligible collision probability within the next century, as routinely computed by the NEODyS and Sentry systems. The most straightforward way to improve the knowledge of the future dynamics of an NEO in order to exclude (or possibly confirm) some of these possible future impacts is to obtain additional astrometric observations of the object as soon as it becomes observable again. In particular, since a large fraction (>98 %) of the known objects currently recognized as possible future impactors have been observed during a single opposition, this usually corresponds to obtaining a new set of observations during a second opposition, a so-called 'recovery'. However, in some cases the future observability windows for the target after the discovery apparition may be very limited, either because the object is intrinsically small (and therefore requires a very close and consequently rare approach to become observable) or because its orbital dynamics prevent observability from the ground for a long timespan (as in the case of quasi-resonant objects with a long synodic period). When this happens, the only short-term way to clarify an impact scenario is to look toward the past, and investigate the possibility that unrecognized detections of the object are already present in the databases of old astronomical images, which are often archived by professional telescopes and made available to the community a few months to years after they are exposed. We will here present an effort led by the newly formed ESA NEO Coordination Centre (NEOCC) in Frascati to pursue both these avenues with the intent of improving the orbital knowledge of the highest-rated possible impactors, as defined by the Palermo Technical Impact Hazard Scale (PS in the following). As an example of our ongoing observational activities, we will first present our

  13. Food additives and preschool children.

    Science.gov (United States)

    Martyn, Danika M; McNulty, Breige A; Nugent, Anne P; Gibney, Michael J

    2013-02-01

    Food additives have been used throughout history to perform specific functions in foods. A comprehensive framework of legislation is in place within Europe to control the use of additives in the food supply and ensure they pose no risk to human health. Further to this, exposure assessments are regularly carried out to monitor population intakes and verify that intakes are not above acceptable levels (acceptable daily intakes). Young children may have a higher dietary exposure to chemicals than adults due to a combination of rapid growth rates and distinct food intake patterns. For this reason, exposure assessments are particularly important in this age group. The paper will review the use of additives and exposure assessment methods and examine factors that affect dietary exposure by young children. One of the most widely investigated unfavourable health effects associated with food additive intake in preschool-aged children is the suggestion of adverse behavioural effects. Research that has examined this relationship has reported a variety of responses, with many studies noting an increase in hyperactivity as reported by parents but not when assessed by objective examiners. This review has examined the experimental approaches used in such studies and suggests that efforts are needed to standardise objective methods of measuring behaviour in preschool children. Further to this, a more holistic approach to examining food additive intakes by preschool children is advisable, where overall exposure is considered rather than focusing solely on behavioural effects, and where intakes of food additives other than food colours are also examined.

  14. Instruction Emphasizing Effort Improves Physics Problem Solving

    Science.gov (United States)

    Li, Daoquan

    2012-01-01

    Effectively using strategies to solve complex problems is an important educational goal and is implicated in successful academic performance. However, people often do not spontaneously use the effective strategies unless they are motivated to do so. The present study was designed to test whether educating students about the importance of effort in…

  15. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  16. Net benefits of wildfire prevention education efforts

    Science.gov (United States)

    Jeffrey P. Prestemon; David T. Butry; Karen L. Abt; Ronda Sutphen

    2010-01-01

    Wildfire prevention education efforts involve a variety of methods, including airing public service announcements, distributing brochures, and making presentations, which are intended to reduce the occurrence of certain kinds of wildfires. A Poisson model of preventable Florida wildfires from 2002 to 2007 by fire management region was developed. Controlling for...
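
    As an illustration of the kind of count model described above, the following is a minimal, hypothetical sketch of a Poisson regression of preventable wildfire counts on prevention-effort covariates using statsmodels; the input file and column names (preventable_fires, prevention_spending, drought_index, region) are placeholders, not the authors' data or specification.

      # Hypothetical sketch of a Poisson count regression, not the authors' model.
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.read_csv("wildfires_by_region_year.csv")  # placeholder input file

      # Counts of preventable wildfires modeled as Poisson, controlling for
      # prevention-education spending and weather; region dummies absorb
      # time-invariant differences between fire management regions.
      model = smf.glm(
          "preventable_fires ~ prevention_spending + drought_index + C(region)",
          data=df,
          family=sm.families.Poisson(),
      ).fit()
      print(model.summary())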

  17. Has Malaysia's antidrug effort been effective?

    Science.gov (United States)

    Scorzelli, J F

    1992-01-01

    It is a common belief that a massive effort in law enforcement, preventive education and rehabilitation will result in the elimination of a country's drug problem. Based on this premise, Malaysia in 1983 implemented such a multifaceted anti-drug strategy, and the results of a 1987 study by the author suggested that Malaysia's effort had begun to contribute to a steady decrease in the number of identified drug abusers. Although the number of drug-addicted individuals declined, the country's recidivism rates were still high. Because of this high relapse rate, Malaysia expanded its rehabilitation effort and developed a community transition program. In order to determine the impact of these changes on the country's battle against drug abuse, a follow-up study was conducted in 1990. The results of this study did not clearly demonstrate that the Malaysian effort had been successful in eliminating the problem of drug abuse, and raised some questions concerning the effectiveness of the country's drug treatment programs.

  18. Phase transitions in least-effort communications

    International Nuclear Information System (INIS)

    Prokopenko, Mikhail; Ay, Nihat; Obst, Oliver; Polani, Daniel

    2010-01-01

    We critically examine a model that attempts to explain the emergence of power laws (e.g., Zipf's law) in human language. The model is based on the principle of least effort in communications—specifically, the overall effort is balanced between the speaker effort and listener effort, with some trade-off. It has been shown that an information-theoretic interpretation of this principle is sufficiently rich to explain the emergence of Zipf's law in the vicinity of the transition between referentially useless systems (one signal for all referable objects) and indexical reference systems (one signal per object). The phase transition is defined in the space of communication accuracy (information content) expressed in terms of the trade-off parameter. Our study explicitly solves the continuous optimization problem, subsuming a recent, more specific result obtained within a discrete space. The obtained results contrast Zipf's law found by heuristic search (that attained only local minima) in the vicinity of the transition between referentially useless systems and indexical reference systems, with an inverse-factorial (sub-logarithmic) law found at the transition that corresponds to global minima. The inverse-factorial law is observed to be the most representative frequency distribution among optimal solutions
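
    A minimal numerical sketch of the trade-off described above follows, assuming speaker effort is measured by the signal entropy H(S), listener effort by the ambiguity H(R|S), and the combined cost by Omega(lambda) = lambda*H(R|S) + (1-lambda)*H(S); this particular parameterization is an illustrative assumption rather than the paper's exact formulation.

      import numpy as np

      def entropy(p):
          """Shannon entropy in bits of a probability vector."""
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      def communication_cost(joint, lam):
          """Illustrative least-effort cost: lam*H(R|S) + (1-lam)*H(S).

          joint: matrix P(s, r) over signals s and referable objects r.
          lam:   trade-off parameter between listener effort H(R|S) and
                 speaker effort H(S) (an assumed decomposition).
          """
          p_s = joint.sum(axis=1)                       # marginal over signals
          h_s = entropy(p_s)                            # speaker effort
          h_r_given_s = sum(p * entropy(row / row.sum())
                            for p, row in zip(p_s, joint) if row.sum() > 0)
          return lam * h_r_given_s + (1 - lam) * h_s

      # Two extremes: one signal for all objects vs. one signal per object.
      n = 4
      single_signal = np.vstack([np.full(n, 1.0 / n), np.zeros((n - 1, n))])
      one_per_object = np.eye(n) / n
      for lam in (0.3, 0.5, 0.7):
          print(lam, communication_cost(single_signal, lam),
                communication_cost(one_per_object, lam))

    Under this assumed cost, for lambda below 0.5 the referentially useless system (one signal for everything) is cheaper, while above 0.5 the indexical system (one signal per object) wins, which locates the transition region the abstract refers to.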

  19. The Galileo Teacher Training Program Global Efforts

    Science.gov (United States)

    Doran, R.; Pennypacker, C.; Ferlet, R.

    2012-08-01

    The Galileo Teacher Training Program (GTTP) successfully named representatives in nearly 100 nations in 2009, the International Year of Astronomy (IYA2009). The challenge had just begun. The steps ahead are how to reach educators who might benefit from our program and how to help build a fairer and more science-literate society, a society in which good tools and resources for science education are not the privilege of a few. From 2010 on, our efforts have been to strengthen the newly formed network and learn how to help educators and students around the globe equally. New partnerships with other strong programs and institutions are being formed, sponsorship schemes are being outlined, new tools and resources are being publicized, and on-site and video-conference training is being conducted all over the world. Efforts to officially accredit a GTTP curriculum are under way, and a stronger certification process is being outlined. New science topics are being integrated into our effort, and we now seek to discuss the path ahead with experts in this field and the community of users, opening the network to all corners of our beautiful blue dot. The main aim of this article is to open the discussion regarding the urgent issues of how to reawaken student interest in science, how to solve the gender inequality in science careers, and how to reach underprivileged students and open to them the same possibilities. Efforts are focused on strengthening the newly formed network and learning how to help educators and students around the globe equally.

  20. Effort - Final technical report on task 3

    DEFF Research Database (Denmark)

    Bay, Niels; Henningsen, Poul; Eriksen, Morten

    The present report is documentation for the work carried out at DTU on the Brite/Euram project No. BE96-3340, contract No. BRPR-CT97-0398, with the title Enhanced Framework for forging design using reliable three-dimensional simulation (EFFORTS). The objective of task 3 is to determine data...

  1. Workplace High Tech Spurs Retraining Efforts.

    Science.gov (United States)

    Davis, Dwight B.

    1984-01-01

    Discusses who should provide training for displaced workers who need new skills. Areas examined include: (1) the need for retraining; (2) current corporate efforts; (3) agreements in the automotive industry; (4) job quality; (5) the federal government's role; and (6) federal legislation related to the problem. (JN)

  2. Testosterone and reproductive effort in male primates.

    Science.gov (United States)

    Muller, Martin N

    2017-05-01

    Considerable evidence suggests that the steroid hormone testosterone mediates major life-history trade-offs in vertebrates, promoting mating effort at the expense of parenting effort or survival. Observations from a range of wild primates support the "Challenge Hypothesis," which posits that variation in male testosterone is more closely associated with aggressive mating competition than with reproductive physiology. In both seasonally and non-seasonally breeding species, males increase testosterone production primarily when competing for fecund females. In species where males compete to maintain long-term access to females, testosterone increases when males are threatened with losing access to females, rather than during mating periods. And when male status is linked to mating success, and dependent on aggression, high-ranking males normally maintain higher testosterone levels than subordinates, particularly when dominance hierarchies are unstable. Trade-offs between parenting effort and mating effort appear to be weak in most primates, because direct investment in the form of infant transport and provisioning is rare. Instead, infant protection is the primary form of paternal investment in the order. Testosterone does not inhibit this form of investment, which relies on male aggression. Testosterone has a wide range of effects in primates that plausibly function to support male competitive behavior. These include psychological effects related to dominance striving, analgesic effects, and effects on the development and maintenance of the armaments and adornments that males employ in mating competition. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Reasonable limits to radiation protection efforts

    International Nuclear Information System (INIS)

    Gonen, Y.G.

    1982-01-01

    It is shown that change in life expectancy (ΔLE) is an improved estimate for risks and safety efforts, reflecting the relevant social goal. A cost-effectiveness index, safety investment/ΔLE, is defined. The harm from low level radiation is seen as a reduction of life expectancy instead of an increased probability of contracting cancer. (author)
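
    The cost-effectiveness index defined above (safety investment divided by the change in life expectancy) lends itself to a trivial calculation; the sketch below uses entirely hypothetical numbers only to show how two protection measures could be ranked on that index.

      def cost_effectiveness(safety_investment, delta_le_person_years):
          """Index from the abstract: safety investment / change in life
          expectancy (here expressed per person-year gained)."""
          return safety_investment / delta_le_person_years

      # Hypothetical comparison of two protection measures.
      measures = {
          "shielding upgrade": cost_effectiveness(2_000_000, 150.0),
          "procedural change": cost_effectiveness(50_000, 10.0),
      }
      for name, index in sorted(measures.items(), key=lambda kv: kv[1]):
          print(f"{name}: {index:,.0f} cost units per person-year gained")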

  4. The Effects of Hearing Aid Directional Microphone and Noise Reduction Processing on Listening Effort in Older Adults with Hearing Loss.

    Science.gov (United States)

    Desjardins, Jamie L

    2016-01-01

    Older listeners with hearing loss may exert more cognitive resources to maintain a level of listening performance similar to that of younger listeners with normal hearing. Unfortunately, this increase in cognitive load, which is often conceptualized as increased listening effort, may come at the cost of cognitive processing resources that might otherwise be available for other tasks. The purpose of this study was to evaluate the independent and combined effects of a hearing aid directional microphone and a noise reduction (NR) algorithm on reducing the listening effort older listeners with hearing loss expend on a speech-in-noise task. Participants were fitted with study-worn, commercially available behind-the-ear hearing aids. Listening effort on a sentence recognition in noise task was measured using an objective auditory-visual dual-task paradigm. The primary task required participants to repeat sentences presented in quiet and in four-talker babble. The secondary task was a digital visual pursuit rotor-tracking test, for which participants were instructed to use a computer mouse to track a moving target around an ellipse that was displayed on a computer screen. Each of the two tasks was presented separately and concurrently at a fixed overall speech recognition performance level of 50% correct, with and without the directional microphone and/or the NR algorithm activated in the hearing aids. In addition, participants reported how effortful it was to listen to the sentences in quiet and in background noise in the different hearing aid listening conditions. Fifteen older listeners with mild sloping to severe sensorineural hearing loss participated in this study. Listening effort in background noise was significantly reduced with the directional microphones activated in the hearing aids. However, there was no significant change in listening effort with the hearing aid NR algorithm compared to no noise processing. Correlation analysis between objective and self

  5. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  6. O ensino de reações orgânicas usando química computacional: I. reações de adição eletrofílica a alquenos Teaching organic reactions using computational chemistry: I. eletrophilic addition reactions to alkenes

    Directory of Open Access Journals (Sweden)

    Arquimedes Mariano

    2008-01-01

    Full Text Available Basic concepts that play an important role in some organic reactions are revisited in this paper, which reports a pedagogical experience involving undergraduate and graduate students. A systematic procedure has been applied in order to use widely available computational tools. This paper aims to discuss the use of computers in teaching electrophilic addition reactions to alkenes. Two classical examples have been investigated: addition to non-conjugated alkenes and addition to conjugated dienes. The results were compared with those normally discussed in organic textbooks. Several important concepts, such as conformational analysis and energy control (kinetic and thermodynamic) involved in reaction mechanisms, can be taught more efficiently if one connects theoretical and practical tools.

  7. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
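
    The regression described above can be expressed compactly; the following is an illustrative OLS sketch with hypothetical variable names (grade, time_sd for time variation, total_minutes for effort, gpa for motivation, learning_gain for the post-test minus pre-test difference), not the authors' dataset.

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical layout: one row per student in the online course.
      df = pd.read_csv("online_micro_students.csv")

      # grade         : final course grade
      # time_sd       : variation of time spent online (lower = more consistent)
      # total_minutes : total minutes spent online (effort)
      # gpa           : proxy for motivation
      # learning_gain : post-test minus pre-test score (marginal learning)
      ols = smf.ols("grade ~ time_sd + total_minutes + gpa + learning_gain",
                    data=df).fit()
      print(ols.summary())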

  8. Summary of process research analysis efforts

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    A summary of solar-cell process research analysis efforts is presented. Process design and cell design are interactive efforts where technology from integrated circuit processes and other processes is blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cell's performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used to improve product uniformity and reduce costs. Two factors that can lead to longer-life modules are the use of solar cell diffusion barriers and improved encapsulation.

  9. Non-proliferation efforts in South Asia

    International Nuclear Information System (INIS)

    Chellaney, B.

    1994-01-01

    Southern Asia is one of the most volatile regions in the world because of inter-state and intra-state conflicts. Security in the region depends heavily on the rival capabilities of the states involved: Pakistan, India, and China. Increased confidence building and nuclear transparency are becoming more significant issues in attaining stability in the region, although non-proliferation efforts in this region have made little headway

  10. Some recent efforts toward high density implosions

    International Nuclear Information System (INIS)

    McClellan, G.E.

    1980-01-01

    Some recent Livermore efforts towards achieving high-density implosions are presented. The implosion dynamics necessary to compress DT fuel to 10 to 100 times liquid density are discussed. Methods of diagnosing the maximum DT density for a specific design are presented along with results to date. The dynamics of the double-shelled target with an exploding outer shell are described, and some preliminary experimental results are presented

  11. Evaluative language, cognitive effort and attitude change.

    OpenAIRE

    van der Pligt, J.; van Schie, E.C.M.; Martijn, C.

    1994-01-01

    Tested the hypotheses that evaluatively biased language influences attitudes and that the magnitude and persistence of attitude change depend on the amount of cognitive effort. 132 undergraduates participated in the experiment, which used material focusing on the issue of restricting adolescent driving over the weekends to reduce the number of fatal traffic accidents. Results indicate that evaluatively biased language can affect attitudes. Using words that evaluate the proposition positivel...

  12. Efforts to identify spore forming bacillus

    Energy Technology Data Exchange (ETDEWEB)

    Zuleiha, M.S.; Hilmy, N. (National Atomic Energy Agency, Jakarta (Indonesia). Pasar Djumat Research Centre)

    1982-04-01

    Efforts to identify 47 species of radioresistant spore-forming Bacillus sp. isolated from locally produced medical devices have been carried out. The identification was conducted using 19 kinds of biochemical tests and by comparison to the species Bacillus subtilis W. T., Bacillus pumilus E 601 and Bacillus sphaericus Csub(I)A. The results showed that the Bacillus sp. examined could be divided into 6 groups, i.e. Bacillus cereus, Bacillus subtilis, Bacillus stearothermophilus, Bacillus coagulans, Bacillus sphaericus and Bacillus circulans.

  13. Efforts to identify spore forming bacillus

    International Nuclear Information System (INIS)

    Zuleiha, M.S.; Hilmy, Nazly

    1982-01-01

    Efforts to identify 47 species of radioresistant spore-forming Bacillus sp. isolated from locally produced medical devices have been carried out. The identification was conducted using 19 kinds of biochemical tests and by comparison to the species Bacillus subtilis W. T., Bacillus pumilus E 601 and Bacillus sphaericus Csub(I)A. The results showed that the Bacillus sp. examined could be divided into 6 groups, i.e. Bacillus cereus, Bacillus subtilis, Bacillus stearothermophilus, Bacillus coagulans, Bacillus sphaericus and Bacillus circulans. (author)

  14. Duke Power's liquid radwaste processing improvement efforts

    International Nuclear Information System (INIS)

    Baker, R.E. Jr.; Bramblett, J.W.

    1995-01-01

    The rising cost of processing liquid radwaste and industry efforts to reduce offsite isotopic contributions have drawn greater attention to the liquid radwaste area. Because of economic pressures to reduce cost and simultaneously improve performance, Duke Power has undertaken a wide-ranging effort to cost-effectively achieve improvements in the liquid radwaste processing area. Duke Power has achieved significant reductions over recent years in the release of curies to the environment from the Liquid Radwaste Treatment systems at its Catawba, McGuire, and Oconee stations. System-wide site curie reductions of 78% have been achieved in a 3-year period. These curie reductions have been achieved while simultaneously reducing the amount of media used to accomplish treatment. The curie and media usage reductions have been achieved at low capital cost. A large number of approaches and projects have been used to achieve these curie and media usage reductions. This paper will describe the various projects and the associated results for Duke Power's processing improvement efforts. The subjects/projects which will be described include: (1) Cooperative philosophy between stations, (2) Source Control, (3) Processing Improvements, and (4) Technology Testing

  15. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. A further major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers

  16. Nonparametric additive regression for repeatedly measured data

    KAUST Repository

    Carroll, R. J.; Maity, A.; Mammen, E.; Yu, K.

    2009-01-01

    We develop an easily computed smooth backfitting algorithm for additive model fitting in repeated measures problems. Our methodology easily copes with various settings, such as when some covariates are the same over repeated response measurements
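
    For orientation, the sketch below shows classical backfitting for an additive model y = alpha + sum_j f_j(x_j) + noise with a simple kernel smoother; it is a generic illustration of the additive-model idea, not the smooth backfitting estimator for repeated measures developed in the paper.

      import numpy as np

      def smooth(x, r, bandwidth=0.3):
          """Nadaraya-Watson smoother of residuals r against covariate x."""
          w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
          return (w @ r) / w.sum(axis=1)

      def backfit(X, y, n_iter=20):
          """Classical backfitting for y = alpha + sum_j f_j(X[:, j]) + noise."""
          n, p = X.shape
          alpha = y.mean()
          f = np.zeros((n, p))
          for _ in range(n_iter):
              for j in range(p):
                  partial = y - alpha - f.sum(axis=1) + f[:, j]  # remove other components
                  f[:, j] = smooth(X[:, j], partial)
                  f[:, j] -= f[:, j].mean()                      # identifiability: mean zero
          return alpha, f

      # Toy example with two additive components.
      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, size=(200, 2))
      y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.1, 200)
      alpha, f = backfit(X, y)
      print(alpha, f[:5])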

  17. Reconstructed and analyzed X-ray computed tomography data of investment-cast and additive-manufactured aluminum foam for visualizing ligament failure mechanisms and regions of contact during a compression test

    Directory of Open Access Journals (Sweden)

    Kristoffer E. Matheson

    2018-02-01

    Full Text Available Three stochastic open-cell aluminum foam samples were incrementally compressed and imaged using X-ray Computed Tomography (CT). One of the samples was created using conventional investment casting methods and the other two were replicas of the same foam that were made using laser powder bed fusion. The reconstructed CT data were then examined in Paraview to identify and highlight the types of failure of individual ligaments. The accompanying sets of Paraview state files and STL files highlight the different ligament failure modes incrementally during compression for each foam. Ligament failure was classified as either "Fracture" (red) or "Collapse" (blue). Also, regions of neighboring ligaments that came into contact that were not originally touching were colored yellow. For further interpretation and discussion of the data, please refer to Matheson et al. (2017) [1].

  18. Brain and effort: brain activation and effort-related working memory in healthy participants and patients with working memory deficits

    Directory of Open Access Journals (Sweden)

    Maria Engstrom

    2013-04-01

    Full Text Available Despite the interest in the neuroimaging of working memory, little is still known about the neurobiology of complex working memory in tasks that require simultaneous manipulation and storage of information. In addition to the central executive network, we assumed that the recently described salience network (involving the anterior insular cortex and the anterior cingulate cortex) might be of particular importance to working memory tasks that require complex, effortful processing. Method: Healthy participants (n=26) and participants suffering from working memory problems related to the Kleine-Levin syndrome (a specific form of periodic idiopathic hypersomnia; n=18) participated in the study. Participants were further divided into a high and a low capacity group, according to performance on a working memory task (listening span). In a functional Magnetic Resonance Imaging (fMRI) study, participants were administered the reading span complex working memory task tapping cognitive effort. Principal findings: The fMRI-derived blood oxygen level dependent (BOLD) signal was modulated by (1) effort in both the central executive and the salience network and (2) capacity in the salience network, in that high performers evidenced a weaker BOLD signal than low performers. In the salience network there was a dichotomy between the left and the right hemisphere; the right hemisphere elicited a steeper increase of the BOLD signal as a function of increasing effort. There was also a stronger functional connectivity within the central executive network because of increased task difficulty. Conclusion: The ability to allocate cognitive effort in complex working memory is contingent upon focused resources in the executive and in particular the salience network. Individual capacity during the complex working memory task is related to activity in the salience (but not the executive) network, so that high-capacity participants evidence a lower signal and possibly hence a larger

  19. Detection of myocardial viability by means of Single Photon Emission Computed Tomography (perfusion SPECT) with dual 201Tl (15-minute rest, 24-hour delayed and 24-hour reinjection images) and Gated-SPECT 99mTc-SESTAMIBI during effort or stimulation of the coronary reserve

    International Nuclear Information System (INIS)

    Mendoza V, R.

    2004-01-01

    The objective of this work was to determine whether 201Tl SPECT images at rest (15-minute and 24-hour delayed) and Gated-SPECT 99mTc-SESTAMIBI during effort or stimulation of coronary reserve correlate with the 24-hour post-reinjection 201Tl study in detecting viable myocardial tissue. Material and methods: 29 patients with coronary artery disease (CAD) underwent rest 201Tl SPECT with 15-minute, 24-hour delayed and 24-hour reinjection images, using an administered dose of 130 MBq of 201Tl and a reinjection of 37 MBq, as well as Gated-SPECT 99mTc-SESTAMIBI during effort or stimulation of coronary reserve after administration of 1110 MBq. Results: 29 patients met the inclusion and exclusion criteria, of whom 22 (75.86%) were male and 7 (24.13%) female, with an average age of 62.1 years. A total of 2,320 myocardial segments were analysed in both the post-effort and rest phases; 264 segments were diagnosed with infarction, of which myocardial viability was observed in 174 segments. The statistical tests used were frequency analysis and the non-parametric Wilcoxon and Mann-Whitney tests. Conclusions: myocardial viability at 24 hours delayed and at 24 hours post-reinjection was similar; a significant difference exists between the 15-minute study and the 24-hour reinjection study, and ischemic disease was also demonstrated in territories other than the infarcted area in the 15-minute, 24-hour delayed and 24-hour reinjection studies. (Author)

  20. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Petric, Marko

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments on the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources within the DIRAC software for experiments and projects without dedicated clusters or an established computing infrastructure. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were develo...
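
    As a purely conceptual illustration of what such a wrapper does (this is not the actual (iLC)Dirac class or its API), the sketch below hides an HTCondor-CE behind a uniform "submit a pilot" interface by shelling out to the standard condor_submit tool; the host name, file names and the grid-universe submit description are assumptions for the example.

      import subprocess
      import tempfile

      class HTCondorCEExample:
          """Illustrative wrapper around a remote HTCondor compute element."""

          def __init__(self, ce_host):
              self.ce_host = ce_host  # e.g. "ce.example.org" (hypothetical)

          def submit(self, executable):
              # Grid-universe submit description targeting the remote CE
              # (9619 is the conventional HTCondor-CE port).
              jdl = "\n".join([
                  "universe      = grid",
                  f"grid_resource = condor {self.ce_host} {self.ce_host}:9619",
                  f"executable    = {executable}",
                  "output        = pilot.out",
                  "error         = pilot.err",
                  "log           = pilot.log",
                  "queue",
              ])
              with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
                  f.write(jdl)
                  subfile = f.name
              return subprocess.run(["condor_submit", subfile],
                                    capture_output=True, text=True)

      # Usage (assumes an HTCondor client and valid grid credentials):
      # result = HTCondorCEExample("ce.example.org").submit("pilot_wrapper.sh")
      # print(result.stdout)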

  1. Additive Manufacturing Infrared Inspection

    Science.gov (United States)

    Gaddy, Darrell; Nettles, Mindy

    2015-01-01

    The Additive Manufacturing Infrared Inspection Task started the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques. This project will benefit additive manufacturing by providing real-time inspection of internal geometry that is not currently possible, and will reduce the time and cost of additively manufactured parts through automated real-time dimensional inspections, which eliminate the need for post-production inspections.

  2. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, the emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003 to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization efforts.

  3. Pioneering efforts to control AIDS. Review: IHO.

    Science.gov (United States)

    Chatterji, A; Sehgal, K

    1995-01-01

    The Indian Health Organisation (IHO) is a nongovernmental organization based in Bombay with more than 12 years experience in HIV/AIDS prevention and control efforts. It has attacked ignorance and prejudice via communication efforts. IHO has created a bond with some hospital systems of Bombay. IHO disseminated information about HIV/AIDS in Bombay's red light districts and has bridged the gap between the city's medical establishment and the community most in need. IHO's aggressive street-level fighting in a sector replete with sensitive issues has somewhat isolated it from mainstream national NGOs involved in HIV/AIDS education and control as well as from the medical establishment and potential partners. IHO funds have been reduced, forcing IHO to reduce intervention programs and responses to field demands. It suffers from a high rate of turnover among middle management staff. IHO's chief advantage is its confidence gained over the past 12 years. IHO has clearly delineated the direction it wants to go: care and support programs for persons affected by HIV/AIDS and for commercial sex workers to allow them to quit prostitution, orphan care, and development of training institutions for the education and motivation of medical personnel on HIV/AIDS care and prevention. It plans to build a hospice for AIDS patients and orphans and a training center. Training activities will vary from one-week orientation programs to three-month certificate courses for medical workers, NGOs, and managers from the commercial sector. IHO is prepared to share its experiences in combating HIV/AIDS in Bombay in a team effort. As official and bilateral funding has been decreasing, IHO has targeted industry for funding. Industry has responded, which enables IHO to sustain its core programs and approaches. IHO observations show a decrease in the number of men visiting red-light districts. IHO enjoys a positive relationship with Bombay's media reporting on AIDS.

  4. Economic growth, biodiversity loss and conservation effort.

    Science.gov (United States)

    Dietz, Simon; Adger, W Neil

    2003-05-01

    This paper investigates the relationship between economic growth, biodiversity loss and efforts to conserve biodiversity using a combination of panel and cross-section data. If economic growth is a cause of biodiversity loss through habitat transformation and other means, then we would expect an inverse relationship. But if higher levels of income are associated with increasing real demand for biodiversity conservation, then investment to protect remaining diversity should grow and the rate of biodiversity loss should slow with growth. Initially, economic growth and biodiversity loss are examined within the framework of the environmental Kuznets hypothesis. Biodiversity is represented by predicted species richness, generated for tropical terrestrial biodiversity using a species-area relationship. The environmental Kuznets hypothesis is investigated with reference to comparison of fixed and random effects models to allow the relationship to vary for each country. It is concluded that an environmental Kuznets curve between income and rates of loss of habitat and species does not exist in this case. The role of conservation effort in addressing environmental problems is examined through state protection of land and the regulation of trade in endangered species, two important means of biodiversity conservation. This analysis shows that the extent of government environmental policy increases with economic development. We argue that, although the data are problematic, the implication of these models is that conservation effort can only ever result in a partial deceleration of biodiversity decline, partly because protected areas serve multiple functions and are not necessarily designated to protect biodiversity. Nevertheless, institutional and policy response components of the income-biodiversity relationship are important but are not well captured through cross-country regression analysis.
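
    The species-area relationship mentioned above is commonly written S = c*A^z; the sketch below shows how predicted richness, and hence the proportional species loss implied by habitat shrinkage, can be computed from it (the values of c and z are placeholders, not those used in the paper).

      def species_richness(area, c=20.0, z=0.25):
          """Species-area relationship S = c * A**z (placeholder c and z)."""
          return c * area ** z

      def fraction_species_lost(original_area, remaining_area, z=0.25):
          """Proportion of species predicted to be lost when habitat shrinks."""
          return 1.0 - (remaining_area / original_area) ** z

      # Example: halving a habitat predicts roughly 16% of species lost (z = 0.25).
      print(fraction_species_lost(100.0, 50.0))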

  5. Dopamine, Effort-Based Choice, and Behavioral Economics: Basic and Translational Research.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Yang, Jen-Hau; Rotolo, Renee; Presby, Rose

    2018-01-01

    Operant behavior is not only regulated by factors related to the quality or quantity of reinforcement, but also by the work requirements inherent in performing instrumental actions. Moreover, organisms often make effort-related decisions involving economic choices such as cost/benefit analyses. Effort-based decision making is studied using behavioral procedures that offer choices between high-effort options leading to relatively preferred reinforcers vs. low effort/low reward choices. Several neural systems, including the mesolimbic dopamine (DA) system and other brain circuits, are involved in regulating effort-related aspects of motivation. Considerable evidence indicates that mesolimbic DA transmission exerts a bi-directional control over exertion of effort on instrumental behavior tasks. Interference with DA transmission produces a low-effort bias in animals tested on effort-based choice tasks, while increasing DA transmission with drugs such as DA transport blockers tends to enhance selection of high-effort options. The results from these pharmacology studies are corroborated by the findings from recent articles using optogenetic, chemogenetic and physiological techniques. In addition to providing important information about the neural regulation of motivated behavior, effort-based choice tasks are useful for developing animal models of some of the motivational symptoms that are seen in people with various psychiatric and neurological disorders (e.g., depression, schizophrenia, Parkinson's disease). Studies of effort-based decision making may ultimately contribute to the development of novel drug treatments for motivational dysfunction.

  6. Dopamine, Effort-Based Choice, and Behavioral Economics: Basic and Translational Research

    Directory of Open Access Journals (Sweden)

    John D. Salamone

    2018-03-01

    Full Text Available Operant behavior is not only regulated by factors related to the quality or quantity of reinforcement, but also by the work requirements inherent in performing instrumental actions. Moreover, organisms often make effort-related decisions involving economic choices such as cost/benefit analyses. Effort-based decision making is studied using behavioral procedures that offer choices between high-effort options leading to relatively preferred reinforcers vs. low effort/low reward choices. Several neural systems, including the mesolimbic dopamine (DA) system and other brain circuits, are involved in regulating effort-related aspects of motivation. Considerable evidence indicates that mesolimbic DA transmission exerts a bi-directional control over exertion of effort on instrumental behavior tasks. Interference with DA transmission produces a low-effort bias in animals tested on effort-based choice tasks, while increasing DA transmission with drugs such as DA transport blockers tends to enhance selection of high-effort options. The results from these pharmacology studies are corroborated by the findings from recent articles using optogenetic, chemogenetic and physiological techniques. In addition to providing important information about the neural regulation of motivated behavior, effort-based choice tasks are useful for developing animal models of some of the motivational symptoms that are seen in people with various psychiatric and neurological disorders (e.g., depression, schizophrenia, Parkinson’s disease). Studies of effort-based decision making may ultimately contribute to the development of novel drug treatments for motivational dysfunction.

  7. Cognitive capacity limitations and Need for Cognition differentially predict reward-induced cognitive effort expenditure.

    Science.gov (United States)

    Sandra, Dasha A; Otto, A Ross

    2018-03-01

    While psychological, economic, and neuroscientific accounts of behavior broadly maintain that people minimize expenditure of cognitive effort, empirical work reveals how reward incentives can mobilize increased cognitive effort expenditure. Recent theories posit that the decision to expend effort is governed, in part, by a cost-benefit tradeoff whereby the potential benefits of mental effort can offset the perceived costs of effort exertion. Taking an individual differences approach, the present study examined whether one's executive function capacity, as measured by Stroop interference, predicts the extent to which reward incentives reduce switch costs in a task-switching paradigm, which indexes additional expenditure of cognitive effort. In accordance with the predictions of a cost-benefit account of effort, we found that a low executive function capacity (and, relatedly, a low intrinsic motivation to expend effort, measured by Need for Cognition) predicted a larger increase in cognitive effort expenditure in response to monetary reward incentives, while individuals with greater executive function capacity (and greater intrinsic motivation to expend effort) were less responsive to reward incentives. These findings suggest that an individual's cost-benefit tradeoff is constrained by the perceived costs of exerting cognitive effort. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Educational Outreach Efforts at the NNDC

    International Nuclear Information System (INIS)

    Holden, N.E.

    2014-01-01

    Isotopes and nuclides are important in our everyday life. The general public and most students are never exposed to the concepts of stable and radioactive isotopes/nuclides. The National Nuclear Data Center (NNDC) is involved in an international project to develop a Periodic Table of the Isotopes for the educational community to illustrate the importance of isotopes and nuclides in understanding the world around us. This effort should aid teachers in introducing these concepts to students from the high school to the graduate school level

  9. Effort variation regularization in sound field reproduction

    DEFF Research Database (Denmark)

    Stefanakis, Nick; Jacobsen, Finn; Sarris, Ioannis

    2010-01-01

    In this paper, active control is used in order to reproduce a given sound field in an extended spatial region. A method is proposed which minimizes the reproduction error at a number of control positions with the reproduction sources holding a certain relation within their complex strengths......), and adaptive wave field synthesis (AWFS), both under free-field conditions and in reverberant rooms. It is shown that effort variation regularization overcomes the problems associated with small spaces and with a low ratio of direct to reverberant energy, improving thus the reproduction accuracy...

  10. Multipartite Entanglement Detection with Minimal Effort

    Science.gov (United States)

    Knips, Lukas; Schwemmer, Christian; Klein, Nico; Wieśniak, Marcin; Weinfurter, Harald

    2016-11-01

    Certifying entanglement of a multipartite state is generally considered a demanding task. Since an N qubit state is parametrized by 4^N - 1 real numbers, one might naively expect that the measurement effort of generic entanglement detection also scales exponentially with N. Here, we introduce a general scheme to construct efficient witnesses requiring a constant number of measurements independent of the number of qubits for states like, e.g., Greenberger-Horne-Zeilinger states, cluster states, and Dicke states. For four qubits, we apply this novel method to experimental realizations of the aforementioned states and prove genuine four-partite entanglement with two measurement settings only.

  11. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
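
    To make the modeling style concrete, the following is a generic, much-simplified sketch of discrete agent-based modeling coupled to a high-resolution extracellular grid (cells consume and are limited by a diffusing nutrient field); it illustrates the approach in Python and is not Biocellion's actual C++ interface.

      import numpy as np

      GRID = 64
      nutrient = np.full((GRID, GRID), 1.0)      # extracellular nutrient field
      cells = [{"pos": (32, 32)}]                # one founding agent
      rng = np.random.default_rng(1)

      def diffuse(field, rate=0.1):
          """One explicit diffusion step on the grid (periodic boundaries)."""
          lap = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                 np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field)
          return field + rate * lap

      for step in range(100):
          nutrient = diffuse(nutrient)
          new_cells = []
          for cell in cells:
              i, j = cell["pos"]
              # Divide only if enough local nutrient is available; division
              # consumes nutrient, which bounds the total population size.
              if nutrient[i, j] > 0.5 and rng.random() < 0.3:
                  nutrient[i, j] -= 0.5
                  di, dj = rng.integers(-1, 2, size=2)
                  new_cells.append({"pos": ((i + di) % GRID, (j + dj) % GRID)})
          cells.extend(new_cells)

      print(len(cells), "cells after 100 steps")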

  12. [Food additives and healthiness].

    Science.gov (United States)

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects.

  13. Prospective Algorithms for Quantum Evolutionary Computation

    OpenAIRE

    Sofge, Donald A.

    2008-01-01

    This effort examines the intersection of the emerging field of quantum computing and the more established field of evolutionary computation. The goal is to understand what benefits quantum computing might offer to computational intelligence and how computational intelligence paradigms might be implemented as quantum programs to be run on a future quantum computer. We critically examine proposed algorithms and methods for implementing computational intelligence paradigms, primarily focused on ...

  14. Adverse health effects of high-effort/low-reward conditions.

    Science.gov (United States)

    Siegrist, J

    1996-01-01

    In addition to the person-environment fit model (J. R. French, R. D. Caplan, & R. V. Harrison, 1982) and the demand-control model (R. A. Karasek & T. Theorell, 1990), a third theoretical concept is proposed to assess adverse health effects of stressful experience at work: the effort-reward imbalance model. The focus of this model is on reciprocity of exchange in occupational life where high-cost/low-gain conditions are considered particularly stressful. Variables measuring low reward in terms of low status control (e.g., lack of promotion prospects, job insecurity) in association with high extrinsic (e.g., work pressure) or intrinsic (personal coping pattern, e.g., high need for control) effort independently predict new cardiovascular events in a prospective study on blue-collar men. Furthermore, these variables partly explain prevalence of cardiovascular risk factors (hypertension, atherogenic lipids) in 2 independent studies. Studying adverse health effects of high-effort/low-reward conditions seems well justified, especially in view of recent developments of the labor market.

  15. Hiding effort to gain a competitive advantage: Evidence from China.

    Science.gov (United States)

    Zhao, Li; Heyman, Gail D

    2018-06-01

    Previous studies with Western populations have shown that adolescents' tendency to downplay their academic effort is affected by two kinds of motives: ability-related motives (e.g., to appear competent) and social approval motives (e.g., to be popular). In this research, we test for the presence of additional competition-related motives in China, a culture placing strong emphasis on academic competition. Study 1 (N = 150) showed that, in response to a scenario in which a hard-working high-school junior hid effort from classmates, the most highly endorsed explanation was "to influence others to work less hard to maintain a competitive advantage." Study 2 (N = 174) revealed that competition-related explanations were endorsed relatively more often when the speaker and audience had similar academic rankings. This tendency was most evident when both speaker and audience were top performers, and when this was the case, participants' desire to demonstrate superiority over others was a positive predictor of endorsement of competition-related motives. Study 3 (N = 137) verified that competition-related motives were more strongly endorsed among Chinese participants than among U.S. participants. These results suggest that, at least in cultures that emphasize academic competition and in contexts where competition is salient, hiding effort is often about attempting to gain strategic advantage. © 2016 International Union of Psychological Science.

  16. Automated Defect Recognition as a Critical Element of a Three Dimensional X-ray Computed Tomography Imaging-Based Smart Non-Destructive Testing Technique in Additive Manufacturing of Near Net-Shape Parts

    Directory of Open Access Journals (Sweden)

    Istvan Szabo

    2017-11-01

    Full Text Available In this paper, a state-of-the-art automated defect recognition (ADR) system is presented that was developed specifically for Non-Destructive Testing (NDT) of powder metallurgy (PM) parts using three-dimensional X-ray Computed Tomography (CT) imaging, towards enabling online quality assurance and enhanced integrity confidence. PM parts exhibit typical defects such as microscopic cracks, porosity, and voids internal to components that, without an effective detection system, limit the growth of industrial applications. Compared to typical testing methods (e.g., destructive ones such as metallography, which is based on sampling, cutting, and polishing of parts), CT provides full coverage of defect detection. This paper establishes the importance and advantages of an automated NDT system for PM industry applications, with particular emphasis on image processing procedures for defect recognition. Moreover, the article describes how to establish a reference library based on real 3D X-ray CT images of net-shape parts. The paper follows the development of the ADR system from processing 2D image slices of a measured 3D X-ray image to processing the complete 3D X-ray image as a whole. The introduced technique is successfully integrated into an automated in-line quality control system highly sought by major industry sectors in Oil and Gas, Automotive, and Aerospace.
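
    On the 2D-slice side of such a pipeline, a minimal defect-candidate search can be as simple as thresholding grey values inside the part and labelling connected low-density regions; the sketch below does this with scipy.ndimage, with threshold values chosen arbitrarily for illustration rather than taken from the paper.

      import numpy as np
      from scipy import ndimage

      def find_void_candidates(ct_slice, material_threshold=0.6, min_voxels=5):
          """Flag connected low-density regions inside the part as candidate voids.

          ct_slice          : 2D array of normalised CT grey values (0..1).
          material_threshold: grey value below which a voxel inside the part is
                              treated as a potential void (illustrative value).
          min_voxels        : ignore smaller regions (simple noise suppression).
          """
          part_mask = ndimage.binary_fill_holes(ct_slice > 0.1)   # crude part/air split
          voids = part_mask & (ct_slice < material_threshold)
          labels, n = ndimage.label(voids)
          sizes = ndimage.sum(voids, labels, index=range(1, n + 1))
          return [(lab, int(size)) for lab, size in enumerate(sizes, start=1)
                  if size >= min_voxels]

      # Synthetic example: a dense slice containing one small internal void.
      slice_ = np.full((64, 64), 0.9)
      slice_[30:33, 30:34] = 0.2
      print(find_void_candidates(slice_))   # -> [(1, 12)]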

  17. International Efforts for the Nuclear Security

    International Nuclear Information System (INIS)

    Yoo, Ho Sik; Kwak, Sung Woo; Lee, Ho Jin; Shim, Hye Won; Lee, Jong Uk

    2005-01-01

    Many concerns have been focused on nuclear security since the 9/11 attacks. With the increasing threat related to nuclear material and nuclear facilities, the demand for strengthening the international physical protection system has been raised. Along with this, the international community is making efforts to increase nuclear security. The agreement to revise the 'Convention on Physical Protection of Nuclear Materials' (hereafter CPPNM), reached in Vienna in July 2005, was one of these efforts. The U.N. is also preparing the 'International Convention for the Suppression of Acts of Nuclear Terrorism' to show its firm resolution against nuclear terror. It is important to understand what measures should be taken to meet the international standard for establishing a national physical protection system. To do this, international trends on the physical protection system, such as the CPPNM and the U.N. convention, should be followed. This paper explains the content of the CPPNM and the U.N. convention. They will be helpful for consolidating the physical protection system in Korea

  18. Sidoarjo mudflow phenomenon and its mitigation efforts

    Science.gov (United States)

    Wibowo, H. T.; Williams, V.

    2009-12-01

    Hot mud first erupted in Siring village, Porong, Sidoarjo, on May 29, 2006. The mud first appeared approximately 200 meters from the Banjarpanji-1 gas-drilling well. The mud volume increased day by day, from 5,000 cubic meters per day in June 2006 to 50,000 cubic meters per day by the end of 2006, and then to 100,000-120,000 cubic meters per day during 2007. Flow still continues at a high rate. Moreover, as the water content has gone down, the clast content has gone up. Consequently, there is now the threat of large amounts of solid material being erupted throughout the area. Also, there is the issue of subsurface collapse and ground surface subsidence. The Indonesian government has set up a permanent team to support communities affected by the mudflow that has swamped a number of villages near LUSI. Toll roads, railway tracks and factories have also been submerged, and over 35,000 people have been displaced to date. The Sidoarjo Mudflow Mitigation Agency [SMMA, BPLS (Indonesia)] replaces a temporary team called National Team PSLS, which was installed for seven months and ended its work on 7 April 2007. BPLS was set up by Presidential Regulation No. 14/2007, and it will have to cover the costs related to the social impact of the disaster, especially outside the swamped area. BPLS is the central government institution designated to handle the disaster in coordination with both the drilling company and local (provincial and district) governments. It takes a comprehensive, integrated and holistic approach to its mission and challenges. Those are: 1) how to stop the mudflow, 2) how to mitigate the impacts of the mudflow, and 3) how to minimize the social, economic, environmental and infrastructure impacts. The mudflow mitigation efforts were constrained by dynamic geological conditions, as well as resistance to certain measures by residents of impacted areas. Giant dykes were built to retain the spreading mud, and the mudflow from the main vent was

  19. Distributed computing for global health

    CERN Multimedia

    CERN. Geneva; Schwede, Torsten; Moore, Celia; Smith, Thomas E; Williams, Brian; Grey, François

    2005-01-01

    Distributed computing harnesses the power of thousands of computers within organisations or over the Internet. In order to tackle global health problems, several groups of researchers have begun to use this approach to exceed by far the computing power of a single lab. This event illustrates how companies, research institutes and the general public are contributing their computing power to these efforts, and what impact this may have on a range of world health issues. Grids for neglected diseases Vincent Breton, CNRS/EGEE This talk introduces the topic of distributed computing, explaining the similarities and differences between Grid computing, volunteer computing and supercomputing, and outlines the potential of Grid computing for tackling neglected diseases where there is little economic incentive for private R&D efforts. Recent results on malaria drug design using the Grid infrastructure of the EU-funded EGEE project, which is coordinated by CERN and involves 70 partners in Europe, the US and Russi...

  20. Missed deadline notification in best-effort schedulers

    Science.gov (United States)

    Banachowski, Scott A.; Wu, Joel; Brandt, Scott A.

    2003-12-01

    It is common to run multimedia and other periodic, soft real-time applications on general-purpose computer systems. These systems use best-effort scheduling algorithms that cannot guarantee applications will receive responsive scheduling to meet deadline or timing requirements. We present a simple mechanism called Missed Deadline Notification (MDN) that allows applications to notify the system when they do not receive their desired level of responsiveness. Consisting of a single system call with no arguments, this simple interface allows the operating system to provide better support for soft real-time applications without any a priori information about their timing or resource needs. We implemented MDN in three different schedulers: Linux, BEST, and BeRate. We describe these implementations and their performance when running real-time applications and discuss policies to prevent applications from abusing MDN to gain extra resources.
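
    A minimal illustrative sketch of how an application might use such a notification interface. The abstract only specifies a single, argument-less system call; the mdn() helper, the loop structure and all timing values below are assumptions made for illustration, not the authors' actual Linux/BEST/BeRate implementation.

import time

def mdn():
    # Hypothetical stand-in for the argument-less Missed Deadline
    # Notification call described in the abstract; on an MDN-enabled
    # kernel this would be a system call asking the scheduler for more
    # responsive treatment rather than a simple print.
    print("MDN: deadline missed, requesting more responsive scheduling")

def do_work():
    time.sleep(0.01)  # placeholder for application-specific processing

def periodic_task(period_s=0.040, frames=100):
    # Soft real-time loop (e.g. a 25 fps media decoder) that notifies
    # the system whenever a frame finishes after its deadline.
    next_deadline = time.monotonic() + period_s
    for _ in range(frames):
        do_work()
        now = time.monotonic()
        if now > next_deadline:      # deadline missed: tell the OS
            mdn()
        else:
            time.sleep(next_deadline - now)
        next_deadline += period_s

if __name__ == "__main__":
    periodic_task()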

  1. Current status of the MPEG-4 standardization effort

    Science.gov (United States)

    Anastassiou, Dimitris

    1994-09-01

    The Moving Pictures Experts Group (MPEG) of the International Standardization Organization has initiated a standardization effort, known as MPEG-4, addressing generic audiovisual coding at very low bit-rates (up to 64 kbits/s) with applications in videotelephony, mobile audiovisual communications, video database retrieval, computer games, video over Internet, remote sensing, etc. This paper gives a survey of the status of MPEG-4, including its planned schedule, and initial ideas about requirements and applications. A significant part of this paper is summarizing an incomplete draft version of a `requirements document' which presents specifications of desirable features on the video, audio, and system level of the forthcoming standard. Very low bit-rate coding algorithms are not described, because no endorsement of any particular algorithm, or class of algorithms, has yet been made by MPEG-4, and several seminars held concurrently with MPEG-4 meetings have not so far provided evidence that such high performance coding schemes are achievable.

  2. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

    The development of the MAPPS model has been completed and the model is currently undergoing evaluation. These efforts are addressing a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data that has been collected indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data makes it extremely flexible for application to a number of diverse applications. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that has previously not existed

  3. Additional value of FDG-PET to contrast enhanced-computed tomography (CT) for the diagnosis of mediastinal lymph node metastasis in non-small cell lung cancer. A Japanese multicenter clinical study

    International Nuclear Information System (INIS)

    Kubota, Kazuo; Murakami, Koji; Inoue, Tomio; Itoh, Harumi; Saga, Tsuneo; Shiomi, Susumu; Hatazawa, Jun

    2011-01-01

    This study was a controlled multicenter clinical study to verify the diagnostic effects of additional fluorodeoxyglucose-positron emission tomography (FDG-PET) to contrast-enhanced CT for mediastinal lymph node metastasis in patients with operable non-small cell lung cancer (NSCLC). NSCLC patients with enlarged mediastinal lymph nodes (short diameter, 7-20 mm), confirmed using contrast-enhanced CT, were examined using FDG-PET to detect metastases prior to surgery. The primary endpoint was the accuracy for concomitantly used CT and FDG-PET showing the additional effects of FDG, compared with CT alone. The secondary endpoints were the clinical impact of FDG-PET on therapeutic decisions and adverse reaction from FDG administration. The images were interpreted by investigators at each institution. Moreover, blinded readings were performed by an image interpretation committee independent of the institutions. The gold standard was the pathological diagnosis determined by surgery or biopsy after PET, and patients in whom a pathological diagnosis was not obtained were excluded from the analysis. Among 99 subjects, the results for 81 subjects eligible for analysis showed that the accuracy improved from 69.1% (56/81) for CT alone to 75.3% (61/81) for CT + PET (p=0.404). These findings contributed to treatment decisions in 63.0% (51/81) of the cases, mainly with regard to the selection of the operative procedure. The results of the image interpretation committee showed that the accuracy improved from 64.2% (52/81) (95% confidence interval (CI) 52.8-74.6) for CT to 75.3% (61/81) (95% CI 64.5-84.2) for CT + PET. The accuracy for 106 mediastinal lymph nodes improved significantly from 62.3% (66/106) (95% CI 52.3-71.5) for CT to 79.2% (84/106) (95% CI 70.3-86.5) for CT + PET (p<0.05). We found that no serious adverse drug reactions appeared in any of the 99 patients who received FDG, except for transient mild outliers in the laboratory data for two patients. The addition of FDG

  4. Medial Orbitofrontal Cortex Mediates Effort-related Responding in Rats.

    Science.gov (United States)

    Münster, Alexandra; Hauber, Wolfgang

    2017-11-17

    The medial orbitofrontal cortex (mOFC) is known to support flexible control of goal-directed behavior. However, limited evidence suggests that the mOFC also mediates the ability of organisms to work with vigor towards a selected goal, a hypothesis that received little consideration to date. Here we show that excitotoxic mOFC lesion increased responding under a progressive ratio (PR) schedule of reinforcement, that is, the highest ratio achieved, and increased the preference for the high effort-high reward option in an effort-related decision-making task, but left intact outcome-selective Pavlovian-instrumental transfer and outcome-specific devaluation. Moreover, pharmacological inhibition of the mOFC increased, while pharmacological stimulation reduced PR responding. In addition, pharmacological mOFC stimulation attenuated methylphenidate-induced increase of PR responding. Intact rats tested for PR responding displayed higher numbers of c-Fos positive mOFC neurons than appropriate controls; however, mOFC neurons projecting to the nucleus accumbens did not show a selective increase in neuronal activation implying that they may not play a major role in regulating PR responding. Collectively, these results suggest that the mOFC plays a major role in mediating effort-related motivational functions. Moreover, our data demonstrate for the first time that the mOFC modulates effort-related effects of psychostimulant drugs. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Additives in yoghurt production

    Directory of Open Access Journals (Sweden)

    Milna Tudor

    2008-02-01

    Full Text Available In yoghurt production, different types of additives are used, mainly to improve sensory characteristics. Each group, and each substance within the same group, has different characteristics and properties. For that reason, to improve the sensory characteristics of yoghurt, the quantity of the additive matters as much as its selection. The same substance added in an optimal amount improves the sensory attributes of yoghurt, but too small or too large an addition can reduce them. This paper describes the characteristics and properties of the additives most commonly used in yoghurt production: skimmed milk powder, whey powder, concentrated whey powder, sugars and artificial sweeteners, fruits, stabilizers, casein powder, inulin and vitamins. The impact of each additive on the sensory and physical properties of yoghurt, syneresis and viscosity, is also described, depending on the amount added in yoghurt production.

  6. Measuring cognitive load: performance, mental effort and simulation task complexity.

    Science.gov (United States)

    Haji, Faizal A; Rojas, David; Childs, Ruth; de Ribaupierre, Sandrine; Dubrowski, Adam

    2015-08-01

    Interest in applying cognitive load theory in health care simulation is growing. This line of inquiry requires measures that are sensitive to changes in cognitive load arising from different instructional designs. Recently, mental effort ratings and secondary task performance have shown promise as measures of cognitive load in health care simulation. We investigate the sensitivity of these measures to predicted differences in intrinsic load arising from variations in task complexity and learner expertise during simulation-based surgical skills training. We randomly assigned 28 novice medical students to simulation training on a simple or complex surgical knot-tying task. Participants completed 13 practice trials, interspersed with computer-based video instruction. On trials 1, 5, 9 and 13, knot-tying performance was assessed using time and movement efficiency measures, and cognitive load was assessed using subjective rating of mental effort (SRME) and simple reaction time (SRT) on a vibrotactile stimulus-monitoring secondary task. Significant improvements in knot-tying performance (F(1.04,24.95)  = 41.1, p cognitive load (F(2.3,58.5)  = 57.7, p load among novices engaged in simulation-based learning. These measures can be used to track cognitive load during skills training. Mental effort ratings are also sensitive to small differences in intrinsic load arising from variations in the physical complexity of a simulation task. The complementary nature of these subjective and objective measures suggests their combined use is advantageous in simulation instructional design research. © 2015 John Wiley & Sons Ltd.

  7. Multiproduct Multiperiod Newsvendor Problem with Dynamic Market Efforts

    Directory of Open Access Journals (Sweden)

    Jianmai Shi

    2016-01-01

    Full Text Available We study a multiperiod multiproduct production planning problem where the production capacity and the marketing effort on demand are both considered. The accumulative impact of marketing effort on demand is captured by the Nerlove and Arrow (N-A advertising model. The problem is formulated as a discrete-time, finite-horizon dynamic optimization problem, which can be viewed as an extension to the classic newsvendor problem by integrating with the N-A model. A Lagrangian relaxation based solution approach is developed to solve the problem, in which the subgradient algorithm is used to find an upper bound of the solution and a feasibility heuristic algorithm is proposed to search for a feasible lower bound. Twelve kinds of instances with different problem size involving up to 50 products and 15 planning periods are randomly generated and used to test the Lagrangian heuristic algorithm. Computational results show that the proposed approach can obtain near optimal solutions for all the instances in very short CPU time, which is less than 90 seconds even for the largest instance.
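
    A minimal sketch of the Lagrangian-relaxation/subgradient scheme outlined above, applied to a deliberately simplified single-period, multi-product problem with deterministic demand and one shared capacity constraint; the paper's multi-period structure and Nerlove-Arrow advertising dynamics are omitted, and all numerical values are illustrative assumptions rather than the authors' test instances.

import random

random.seed(0)
n = 8
price  = [random.uniform(6, 10) for _ in range(n)]   # unit selling price
cost   = [random.uniform(2, 5) for _ in range(n)]    # unit production cost
demand = [random.uniform(20, 60) for _ in range(n)]  # deterministic demand
C = 150.0                                            # shared production capacity

def profit(x):
    # Simplified newsvendor-style profit with deterministic demand.
    return sum(price[i] * min(x[i], demand[i]) - cost[i] * x[i] for i in range(n))

def relaxed_solution(lam):
    # With the capacity constraint dualized, the problem decomposes per
    # product: produce up to demand if the adjusted margin is positive.
    return [demand[i] if price[i] - cost[i] - lam > 0 else 0.0 for i in range(n)]

lam, best_ub, best_lb = 0.0, float("inf"), 0.0
for k in range(1, 201):
    x = relaxed_solution(lam)
    best_ub = min(best_ub, profit(x) + lam * (C - sum(x)))   # dual (upper) bound

    # Feasibility heuristic: scale production down to respect capacity,
    # yielding a feasible solution and hence a lower bound.
    scale = min(1.0, C / max(sum(x), 1e-9))
    best_lb = max(best_lb, profit([scale * xi for xi in x]))

    g = C - sum(x)                       # subgradient of the dual function
    lam = max(0.0, lam - (1.0 / k) * g)  # projected subgradient step

print(f"best upper bound: {best_ub:.1f}, best feasible lower bound: {best_lb:.1f}")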

  8. Effect of social influence on effort-allocation for monetary rewards.

    Science.gov (United States)

    Gilman, Jodi M; Treadway, Michael T; Curran, Max T; Calderon, Vanessa; Evins, A Eden

    2015-01-01

    Though decades of research have shown that people are highly influenced by peers, few studies have directly assessed how the value of social conformity is weighed against other types of costs and benefits. Using an effort-based decision-making paradigm with a novel social influence manipulation, we measured how social influence affected individuals' decisions to allocate effort for monetary rewards during trials with either high or low probability of receiving a reward. We found that information about the effort-allocation of peers modulated participant choices, specifically during conditions of low probability of obtaining a reward. This suggests that peer influence affects effort-based choices to obtain rewards especially under conditions of risk. This study provides evidence that people value social conformity in addition to other costs and benefits when allocating effort, and suggests that neuroeconomic studies that assess trade-offs between effort and reward should consider social environment as a factor that can influence decision-making.

  9. Directed-energy process technology efforts

    Science.gov (United States)

    Alexander, P.

    1985-01-01

    A summary of directed-energy process technology for solar cells was presented. This technology is defined as directing energy or mass to specific areas on solar cells to produce a desired effect in contrast to exposing a cell to a thermal or mass flow environment. Some of these second generation processing techniques are: ion implantation; microwave-enhanced chemical vapor deposition; rapid thermal processing; and the use of lasers for cutting, assisting in metallization, assisting in deposition, and drive-in of liquid dopants. Advantages of directed energy techniques are: surface heating resulting in the bulk of the cell material being cooler and unchanged; better process control yields; better junction profiles, junction depths, and metal sintering; lower energy consumption during processing and smaller factory space requirements. These advantages should result in higher-efficiency cells at lower costs. The results of the numerous contracted efforts were presented as well as the application potentials of these new technologies.

  10. The European fusion nuclear technology effort

    International Nuclear Information System (INIS)

    Darvas, J.

    1989-01-01

    The role of fusion technology in the European fusion development strategy is outlined. The main thrust of the present fusion technology programme is responding to development needs of the Next European Torus. A smaller, but important and growing R and D effort is dealing with problems specific to the Demonstration, or Fusion Power, Reactor. The part of the programme falling under the somewhat arbitrarily defined category of 'fusion nuclear technology' is reviewed and an outlook to future activities is given. The review includes tritium technology, blanket technology and breeder materials development, technology and materials for the protection of the first wall and of other plasma facing components, remote handling technology, and safety and environmental impact studies. A few reflections are offered on the future long-term developments in fusion technology. (orig.)

  11. Peru continues to press privitization efforts

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that Peru has again extended the deadline for bids on a 30 year operating contract for state owned Petromar SA's offshore Block Z-2b. The tender is key to efforts to privatize Petromar, a subsidiary of state oil company Petroleos del Peru. The committee charged with implementing Petromar privatization extended the deadline for bids another 70 days Oct. 30, following a 60 day extension made in September. The latest deadline for bids is Feb. 10, with the contract expected to be awarded Feb. 26. A bid package on Block Z-2b is available from Petroperu's Lima headquarters for $20,000. Petromar operates the former Belco Petroleum Corp. offshore assets Peru's government expropriated in 1985. It currently produces 17,600 b/d, compared with 27,000 b/d at the time of expropriation

  12. The present gravitational wave detection effort

    International Nuclear Information System (INIS)

    Riles, Keith

    2010-01-01

    Gravitational radiation offers a new non-electromagnetic window through which to observe the universe. The LIGO and Virgo Collaborations have completed a first joint data run with unprecedented sensitivities to gravitational waves. Results from searches in the data for a variety of astrophysical sources are presented. A second joint data run with improved detector sensitivities is underway, and soon major upgrades will be carried out to build Advanced LIGO and Advanced Virgo with expected improvements in event rates of more than 1000. In parallel there is a vigorous effort in the radio pulsar community to detect nHz gravitational waves via the timing residuals in an array of pulsars at different locations in the sky.

  13. Superconducting cavities developments efforts at RRCAT

    International Nuclear Information System (INIS)

    Puntambekar, A.; Bagre, M.; Dwivedi, J.; Shrivastava, P.; Mundra, G.; Joshi, S.C.; Potukuchi, P.N.

    2011-01-01

    Superconducting RF cavities are the work-horse for many existing and proposed linear accelerators. The Raja Ramanna Centre for Advanced Technology (RRCAT) has initiated a comprehensive R and D program for the development of superconducting RF cavities suitable for high-energy accelerator applications such as SNS and ADS. For the initial phase of technology demonstration, several prototype 1.3 GHz single-cell cavities have been developed. The work began with the development of prototype single-cell cavities in aluminum and copper. This helped in developing the cavity manufacturing process, proving various tooling, and learning the mechanical and RF qualification processes. The parts manufacturing was done at RRCAT and electron beam welding was carried out at Indian industry. These cavities further served during commissioning trials of the various cavity processing infrastructure being developed at RRCAT and are also potential candidates for niobium thin-film deposition R and D. Based on the above experience, a few single-cell cavities were developed in fine-grain niobium. The critical technology of forming and machining niobium and the intermediate RF qualification were developed at RRCAT. The EB welding of bulk niobium cavities was carried out in collaboration with IUAC, New Delhi, at their facility. As a next logical step, efforts are now on for the development of multicell cavities. Prototype dumbbells and an end group made of aluminium, comprising RF and HOM coupler ports, have also been developed, with their EB welding done at Indian industry. In this paper we present the development efforts towards manufacturing of 1.3 GHz single-cell cavities and their initial processing and qualification. (author)

  14. Self-regulating the effortful "social dos".

    Science.gov (United States)

    Cortes, Kassandra; Kammrath, Lara K; Scholer, Abigail A; Peetz, Johanna

    2014-03-01

    In the current research, we explored differences in the self-regulation of the personal dos (i.e., engaging in active and effortful behaviors that benefit the self) and in the self-regulation of the social dos (engaging in those same effortful behaviors to benefit someone else). In 6 studies, we examined whether the same trait self-control abilities that predict task persistence on personal dos would also predict task persistence on social dos. That is, would the same behavior, such as persisting through a tedious and attentionally demanding task, show different associations with trait self-control when it is framed as benefitting the self versus someone else? In Studies 1-3, we directly compared the personal and social dos and found that trait self-control predicted self-reported and behavioral personal dos but not social dos, even when the behaviors were identical and when the incentives were matched. Instead, trait agreeableness--a trait linked to successful self-regulation within the social domain--predicted the social dos. Trait self-control did not predict the social dos even when task difficulty increased (Study 4), but it did predict the social don'ts, consistent with past research (Studies 5-6). The current studies provide support for the importance of distinguishing different domains of self-regulated behaviors and suggest that social dos can be successfully performed through routes other than traditional self-control abilities. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  15. Computer Simulation Western

    International Nuclear Information System (INIS)

    Rasmussen, H.

    1992-01-01

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  16. Additional Security Considerations for Grid Management

    Science.gov (United States)

    Eidson, Thomas M.

    2003-01-01

    The use of Grid computing environments is growing in popularity. A Grid computing environment is primarily a wide area network that encompasses multiple local area networks, where some of the local area networks are managed by different organizations. A Grid computing environment also includes common interfaces for distributed computing software so that the heterogeneous set of machines that make up the Grid can be used more easily. The other key feature of a Grid is that the distributed computing software includes appropriate security technology. The focus of most Grid software is on the security involved with application execution, file transfers, and other remote computing procedures. However, there are other important security issues related to the management of a Grid and the users who use that Grid. This note discusses these additional security issues and makes several suggestions as how they can be managed.

  17. How hearing aids, background noise, and visual cues influence objective listening effort.

    Science.gov (United States)

    Picou, Erin M; Ricketts, Todd A; Hornsby, Benjamin W Y

    2013-09-01

    The purpose of this article was to evaluate factors that influence the listening effort experienced when processing speech for people with hearing loss. Specifically, the change in listening effort resulting from introducing hearing aids, visual cues, and background noise was evaluated. An additional exploratory aim was to investigate the possible relationships between the magnitude of listening effort change and individual listeners' working memory capacity, verbal processing speed, or lipreading skill. Twenty-seven participants with bilateral sensorineural hearing loss were fitted with linear behind-the-ear hearing aids and tested using a dual-task paradigm designed to evaluate listening effort. The primary task was monosyllable word recognition and the secondary task was a visual reaction time task. The test conditions varied by hearing aids (unaided, aided), visual cues (auditory-only, auditory-visual), and background noise (present, absent). For all participants, the signal to noise ratio was set individually so that speech recognition performance in noise was approximately 60% in both the auditory-only and auditory-visual conditions. In addition to measures of listening effort, working memory capacity, verbal processing speed, and lipreading ability were measured using the Automated Operational Span Task, a Lexical Decision Task, and the Revised Shortened Utley Lipreading Test, respectively. In general, the effects measured using the objective measure of listening effort were small (~10 msec). Results indicated that background noise increased listening effort, and hearing aids reduced listening effort, while visual cues did not influence listening effort. With regard to the individual variables, verbal processing speed was negatively correlated with hearing aid benefit for listening effort; faster processors were less likely to derive benefit. Working memory capacity, verbal processing speed, and lipreading ability were related to benefit from visual cues. No

  18. Effort, anhedonia, and function in schizophrenia: reduced effort allocation predicts amotivation and functional impairment.

    Science.gov (United States)

    Barch, Deanna M; Treadway, Michael T; Schoen, Nathan

    2014-05-01

    One of the most debilitating aspects of schizophrenia is an apparent lack of interest in or ability to exert effort for rewards. Such "negative symptoms" may prevent individuals from obtaining potentially beneficial outcomes in educational, occupational, or social domains. In animal models, dopamine abnormalities decrease willingness to work for rewards, implicating dopamine (DA) function as a candidate substrate for negative symptoms given that schizophrenia involves dysregulation of the dopamine system. We used the effort-expenditure for rewards task (EEfRT) to assess the degree to which individuals with schizophrenia were willing to exert increased effort for either larger magnitude rewards or for rewards that were more probable. Fifty-nine individuals with schizophrenia and 39 demographically similar controls performed the EEfRT task, which involves making choices between "easy" and "hard" tasks to earn potential rewards. Individuals with schizophrenia showed less of an increase in effort allocation as either reward magnitude or probability increased. In controls, the frequency of choosing the hard task in high reward magnitude and probability conditions was negatively correlated with depression severity and anhedonia. In schizophrenia, fewer hard task choices were associated with more severe negative symptoms and worse community and work function as assessed by a caretaker. Consistent with patterns of disrupted dopamine functioning observed in animal models of schizophrenia, these results suggest that 1 mechanism contributing to impaired function and motivational drive in schizophrenia may be a reduced allocation of greater effort for higher magnitude or higher probability rewards.

  19. Characterization of Metal Powders Used for Additive Manufacturing.

    Science.gov (United States)

    Slotwinski, J A; Garboczi, E J; Stutzman, P E; Ferraris, C F; Watson, S S; Peltz, M A

    2014-01-01

    Additive manufacturing (AM) techniques can produce complex, high-value metal parts, with potential applications as critical parts, such as those found in aerospace components. The production of AM parts with consistent and predictable properties requires input materials (e.g., metal powders) with known and repeatable characteristics, which in turn requires standardized measurement methods for powder properties. First, based on our previous work, we assess the applicability of current standardized methods for powder characterization for metal AM powders. Then we present the results of systematic studies carried out on two different powder materials used for additive manufacturing: stainless steel and cobalt-chrome. The characterization of these powders is important in NIST efforts to develop appropriate measurements and standards for additive materials and to document the property of powders used in a NIST-led additive manufacturing material round robin. An extensive array of characterization techniques was applied to these two powders, in both virgin and recycled states. The physical techniques included laser diffraction particle size analysis, X-ray computed tomography for size and shape analysis, and optical and scanning electron microscopy. Techniques sensitive to structure and chemistry, including X-ray diffraction, energy dispersive analytical X-ray analysis using the X-rays generated during scanning electron microscopy, and X-Ray photoelectron spectroscopy were also employed. The results of these analyses show how virgin powder changes after being exposed to and recycled from one or more Direct Metal Laser Sintering (DMLS) additive manufacturing build cycles. In addition, these findings can give insight into the actual additive manufacturing process.

  1. Analysis of dismantling possibility and unloading efforts of fuel assemblies from core of WWER

    International Nuclear Information System (INIS)

    Danilov, V.; Dobrov, V.; Semishkin, V.; Vasilchenko, I.

    2006-01-01

    The computation methods for the optimal dismantling sequence of fuel assemblies (FA) from the core of a WWER after different operating periods and accident conditions are considered. The algorithms for the fuel dismantling sequence are constructed both from an analysis of mutual spacer-grid overlaps of adjacent fuel assemblies and from numerical structural analysis of the efforts required for FA removal when heaving the FA from the core. Computation results for the core dismantling sequence after a 3-year operating period and a LB LOCA are presented in the paper

  2. 18F-Fluorodeoxyglucose Positron Emission Tomography/Magnetic Resonance in Lymphoma: Comparison With 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography and With the Addition of Magnetic Resonance Diffusion-Weighted Imaging.

    Science.gov (United States)

    Giraudo, Chiara; Raderer, Markus; Karanikas, Georgios; Weber, Michael; Kiesewetter, Barbara; Dolak, Werner; Simonitsch-Klupp, Ingrid; Mayerhoefer, Marius E

    2016-03-01

    The aim of this study was to compare 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET)/magnetic resonance (MR) (with and without diffusion-weighted imaging [DWI]) to 18F-FDG PET/computed tomography (CT), with regard to the assessment of nodal and extranodal involvement, in patients with Hodgkin lymphoma and non-Hodgkin lymphoma, without restriction to FDG-avid subtypes. Patients with histologically proven lymphoma were enrolled in this prospective, institutional review board-approved study. After a single 18F-FDG injection, patients consecutively underwent 18F-FDG PET/CT and 18F-FDG PET/MR on the same day for staging or restaging. Three sets of images were analyzed separately: 18F-FDG PET/CT, 18F-FDG PET/MR without DWI, and 18F-FDG PET/MR with DWI. Region-based agreement and examination-based sensitivity and specificity were calculated for 18F-FDG PET/CT, 18F-FDG PET/MR without DWI, and 18F-FDG PET/MR DWI. Maximum and mean standardized uptake values (SUVmax, SUVmean) on 18F-FDG PET/CT and 18F-FDG PET/MR were compared and correlated with minimum and mean apparent diffusion coefficients (ADCmin, ADCmean). Thirty-four patients with a total of 40 examinations were included. Examination-based sensitivities for 18F-FDG PET/CT, 18F-FDG PET/MR, and 18F-FDG PET/MR DWI were 82.1%, 85.7%, and 100%, respectively; specificities were 100% for all 3 techniques; and accuracies were 87.5%, 90%, and 100%, respectively. 18F-FDG PET/CT was false negative in 5 of 40 examinations (all with mucosa-associated lymphoid tissue lymphoma), and 18F-FDG PET/MR (without DWI) was false negative in 4 of 40 examinations. Region-based percentages of agreement were 99% (κ, 0.95) between 18F-FDG PET/MR DWI and 18F-FDG PET/CT, 99.2% (κ, 0.96) between 18F-FDG PET/MR and 18F-FDG PET/CT, and 99.4% (κ, 0.97) between 18F-FDG PET/MR DWI and 18F-FDG PET/MR. There was a strong correlation between 18F-FDG PET/CT and 18F-FDG PET/MR for SUVmax (r = 0.83) and SUVmean (r = 0.81) but no significant correlation between ADCmin and SUVmax

  3. Additive and polynomial representations

    CERN Document Server

    Krantz, David H; Suppes, Patrick

    1971-01-01

    Additive and Polynomial Representations deals with major representation theorems in which the qualitative structure is reflected as some polynomial function of one or more numerical functions defined on the basic entities. Examples are additive expressions of a single measure (such as the probability of disjoint events being the sum of their probabilities), and additive expressions of two measures (such as the logarithm of momentum being the sum of log mass and log velocity terms). The book describes the three basic procedures of fundamental measurement as the mathematical pivot, as the utiliz

  4. Counternarcotic Efforts in the Southern Cone: Chile

    Science.gov (United States)

    1990-06-30

    within its scope is drug trafficking. The Ministry of Health. The Instituto de Salud Publica (National Institute of Public Health, part of the ... Instituto de Salud Publica) issued a written reply to a questionnaire. Additional information on health matters was provided by Dr. Roberto Laihacar, psychiatrist of the Military Hospital in Santiago.

  5. Food Additives and Hyperkinesis

    Science.gov (United States)

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  6. Groups – Additive Notation

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2015-06-01

    Full Text Available We translate the articles covering group theory already available in the Mizar Mathematical Library from multiplicative into additive notation. We adapt the works of Wojciech A. Trybulec [41, 42, 43] and Artur Korniłowicz [25].

  7. Groups – Additive Notation

    OpenAIRE

    Coghetto Roland

    2015-01-01

    We translate the articles covering group theory already available in the Mizar Mathematical Library from multiplicative into additive notation. We adapt the works of Wojciech A. Trybulec [41, 42, 43] and Artur Korniłowicz [25].

  8. STAR Infrastructure Database: An effort to know each other

    Energy Technology Data Exchange (ETDEWEB)

    Mora, J.C.; Real, Almudena [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Vesterbacka, Pia; Outola, Iisa [STUK - Radiation and Nuclear Safety Authority (Finland); Barnett, Catherine; Beresford, Nick [Natural Environment Research Council - NERC-CEH (United Kingdom); Bradshaw, Clare [Stockholm University (Sweden); Skipperud, Lindis [Norwegian University of Life Sciences - UMB (Norway); Wilrodt, Christine; Steiner, Martin [Federal Office for Radiation Protection - BfS (Germany); Vanhoudt, Nathalie [Belgian Nuclear Research Centre SCK-CEN (Belgium); Komperoed, Mari [Norwegian Radiation Protection Authority - NRPA (Norway); Gurriaran, Rodolfo; Gilbin, Rodolphe; Hinton, Thomas [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)

    2014-07-01

    Effort over the last decade to make radioecology stronger and sustainable within Europe crystallized in the creation of the European Radioecology Alliance. The first step for this integrative effort was the establishment of a network of excellence (NoE) under the EU FP7 Strategy for Allied Radioecology (STAR www.star-radioecology.org) project which commenced in 2011. One of the project objectives was to share knowledge of European radioecological capabilities. To help achieve this, a register of these capabilities at each of the STAR laboratories has been created. An Infrastructure Database was designed and programmed using web 2.0 technologies on a 'wiki' platform. Its intended use was to identify what assets were held and where improvements could be made. Information collated includes an inventory of the radioanalytical or conventional equipment and methods, bio-informatics equipment and methods, sample and data archives held, and models and codes used. It also provides a summary of the radioecological expertise of the 170 radio-ecologists at STAR institutes whose knowledge is wide-ranging and encompasses: atmospheric dispersion, dosimetry, ecology, ecotoxicology, environmental radiation protection, environmental surveillance, foodstuffs, terrestrial, freshwater and marine radioecology, modelling, radiobiology and radionuclide analyses, emergency preparedness, education and training, amongst others. In 2013, the EU FP7 Coordination and implementation of a pan-European instrument for radioecology (COMET, www.comet-radioecology.org) project, involving the STAR partners and additionally one Japanese and two Ukrainian research institutes, was initiated. The capabilities of these additional partners will be added to the database in 2014. The aim of the database was to gather information to: - avoid duplication of effort and thereby increase efficiency, - improve synergy and collaboration between the STAR project partners and others involved in

  9. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs

  10. Slow growth efforts renewed in Iran.

    Science.gov (United States)

    Aghajanian, A

    1992-10-01

    Iran's first population policy was developed under the Shah in 1967. Policymakers brought in with the Islamic Revolution of 1979, however, rejected much of the earlier regime's views on women and childbearing. During the Iran-Iraq war of 1980-88, large population size and rapid growth were seen as advantageous to the war effort. After the war, the government of Iran again began to voice concern about rapid population growth. The pragmatic and proactive approach taken by the government since 1988 may, indeed, accelerate a decline in fertility that began in the late 1960s but stalled in the 1980s. The following are examples of the new governmental attitude: the Iranian government announced in March 1992 that it would begin importing Norplant and make it available along with other contraceptives at public clinics; last year, the government announced that the fourth child of a family would not be eligible for food rationing or nutritional supplements and other public child benefits; the Minister of Health in 1991 for the first time publicly encouraged male sterilization; and last fall, Iran conducted a special census of the population five years before the regular decennial census date of 1996. These actions represent dramatic policy changes on population growth and family planning in this country of 60 million, the largest and one of the fastest growing in the Middle East.

  11. NREL Quickens its Tech Transfer Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Lammers, H.

    2012-02-01

    Innovations and 'aha' moments in renewable energy and energy efficiency, while exciting in the lab, only truly live up to their promise once they find a place in homes or businesses. Late last year President Obama issued a directive to all federal agencies to increase their efforts to transfer technologies to the private sector in order to achieve greater societal and economic impacts of federal research investments. The president's call to action includes efforts to establish technology transfer goals and to measure progress, to engage in efforts to increase the speed of technology transfer and to enhance local and regional innovation partnerships. But, even before the White House began its initiative to restructure the commercialization process, the National Renewable Energy Laboratory had a major effort underway designed to increase the speed and impact of technology transfer activities and had already made sure its innovations had a streamlined path to the private sector. For the last three years, NREL has been actively setting commercialization goals and tracking progress against those goals. For example, NREL sought to triple the number of innovations over a five-year period that began in 2009. Through best practices associated with inventor engagement, education and collaboration, NREL quadrupled the number of innovations in just three years. Similar progress has been made in patenting, licensing transactions, income generation and rewards to inventors. 'NREL is known nationally for our cutting-edge research and companies know to call us when they are ready to collaborate,' William Farris, vice president for commercialization and technology transfer, said. 'Once a team is ready to dive in, they don't want to be mired in paperwork. We've worked to make our process for licensing NREL technology faster; it now takes less than 60 days for us to come to an agreement and start work with a company interested in our research

  12. STEM Education Efforts in the Ares Projects

    Science.gov (United States)

    Doreswamy, Rajiv; Armstrong, Robert C.

    2010-01-01

    According to the National Science Foundation, of the more than 4 million first university degrees awarded in science and engineering in 2006, students in China earned about 21%, those in the European Union earned about 19%, and those in the United States earned about 11%. Statistics like these are of great interest to NASA's Ares Projects, which are responsible for building the rockets for the U.S. Constellation Program to send humans beyond low-Earth orbit. Science, technology, engineering, and mathematics students are essential for the long-term sustainability of any space program. Since the Projects' creation, the Ares Outreach Team has used a variety of STEM-related media, methods, and materials to engage students, educators, and the general public in Constellation's mission. Like Project Apollo, the nation's exploration destinations and the vehicles used to get there can inspire students to learn more about STEM. Ares has been particularly active in public outreach to schools in Northern Alabama; on the Internet via outreach and grade-specific educational materials; and in more informal social media settings such as YouTube and Facebook. These combined efforts remain integral to America's space program, regardless of its future direction.

  13. Regionally Applied Research Efforts (RARE) Report titled " ...

    Science.gov (United States)

    The traditional methodology for health risk assessment used by the U. S. Environmental Protection Agency (EPA) is based on the use of exposure assumptions (e.g. exposure duration, food ingestion rate, body weight, etc.) that represent the entire American population, either as a central tendency exposure (e.g. average, median) or as a reasonable maximum exposure (e.g. 95% upper confidence limit). Unfortunately, EPA lacked exposure information for assessing health risks for New England regional tribes sustaining a tribal subsistence way of life. As a riverine tribe, the Penobscot culture and traditions are inextricably tied to the Penobscot River watershed. It is through hunting, fishing, trapping, gathering and making baskets, pottery, moccasins, birch-bark canoes and other traditional practices that the Penobscot culture and people are sustained. The Penobscot River receives a variety of pollutant discharges leaving the Penobscot Indian Nation (PIN) questioning the ecological health and water quality of the river and how this may affect the practices that sustain their way of life. The objectives of this Regional Applied Research Effort (RARE) study were to: (1) Develop culturally sensitive methodologies for assessing the potential level of exposure to contaminants that Penobscot Indian Nation tribal members may have from maintaining tribal sustenance practices; (2) Conduct field surveys and laboratory analysis on targeted flora and fauna for chemical expo

  14. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  15. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There have been fewer answers to the survey than hoped for, which could otherwise have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  16. Involving vendors in continuous quality improvement efforts.

    Science.gov (United States)

    McDevitt, M C

    1995-03-01

    In the hospital environment, vendors supply a wide range of items, from surgical sutures to the latest in high-cost technological equipment. Also, many clinical and support services, such as respiratory therapy, transcription, and computer databanks are now outsourced to commercial vendors. Interaction with such vendors is often less than satisfactory, with prolonged timelines and disruption of an important process that is being computerized. Although hospitals deal with very few vendors in long-term relationships, such as those seen in manufacturing, this should not preclude the formation of a supplier-customer relationship that goes beyond management's interaction with the sales representative in response to a request for proposal. This is especially true when a process improvement team has studied an internal process and defined a key quality characteristic.

  17. Quality-oriented efforts in IPD, - a framework

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    1998-01-01

    It is generally expected that modern quality efforts like TQM and ISO 9000 should deliver a sufficient framework for quality efforts in industrial companies. Our findings in Danish industry show a fragmented picture of islands of effort and a weak understanding of basic quality concepts between ... designers. The paper proposes a framework for quality efforts, illustrated by simple metaphors....

  18. Teachable Agents and the Protege Effect: Increasing the Effort towards Learning

    Science.gov (United States)

    Chase, Catherine C.; Chin, Doris B.; Oppezzo, Marily A.; Schwartz, Daniel L.

    2009-01-01

    Betty's Brain is a computer-based learning environment that capitalizes on the social aspects of learning. In Betty's Brain, students instruct a character called a Teachable Agent (TA) which can reason based on how it is taught. Two studies demonstrate the "protege effect": students make greater effort to learn for their TAs than they do…

  19. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  20. A neuronal model of a global workspace in effortful cognitive tasks.

    Science.gov (United States)

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  1. Effort and Displeasure in People Who Are Hard of Hearing.

    Science.gov (United States)

    Matthen, Mohan

    2016-01-01

    Listening effort helps explain why people who are hard of hearing are prone to fatigue and social withdrawal. However, a one-factor model that cites only effort due to hardness of hearing is insufficient as there are many who lead happy lives despite their disability. This article explores other contributory factors, in particular motivational arousal and pleasure. The theory of rational motivational arousal predicts that some people forego listening comprehension because they believe it to be impossible and hence worth no effort at all. This is problematic. Why should the listening task be rated this way, given the availability of aids that reduce its difficulty? Two additional factors narrow the explanatory gap. First, we separate the listening task from the benefit derived as a consequence. The latter is temporally more distant, and is discounted as a result. The second factor is displeasure attributed to the listening task, which increases listening cost. Many who are hard of hearing enjoy social interaction. In such cases, the actual activity of listening is a benefit, not a cost. These people also reap the benefits of listening, but do not have to balance these against the displeasure of the task. It is suggested that if motivational harmony can be induced by training in somebody who is hard of hearing, then the obstacle to motivational arousal would be removed. This suggests a modified goal for health care professionals. Do not just teach those who are hard of hearing how to use hearing assistance devices. Teach them how to do so with pleasure and enjoyment.

  2. The RETRAN-03 computer code

    International Nuclear Information System (INIS)

    Paulsen, M.P.; McFadden, J.H.; Peterson, C.E.; McClure, J.A.; Gose, G.C.; Jensen, P.J.

    1991-01-01

    The RETRAN-03 code development effort is designed to overcome the major theoretical and practical limitations associated with the RETRAN-02 computer code. The major objectives of the development program are to extend the range of analyses that can be performed with RETRAN, to make the code more dependable and faster running, and to have a more transportable code. The first two objectives are accomplished by developing new models and adding other models to the RETRAN-02 base code. The major model additions for RETRAN-03 are as follows: implicit solution methods for the steady-state and transient forms of the field equations; additional options for the velocity difference equation; a new steady-state initialization option for computing low-power steam generator initial conditions; models for nonequilibrium thermodynamic conditions; and several special-purpose models. The source code and the environmental library for RETRAN-03 are written in standard FORTRAN 77, which allows the last objective to be fulfilled. Some models in RETRAN-02 have been deleted in RETRAN-03. In this paper the changes between RETRAN-02 and RETRAN-03 are reviewed

  3. Comparison of cardiovascular response to combined static-dynamic effort, postprandial dynamic effort and dynamic effort alone in patients with chronic ischemic heart disease

    International Nuclear Information System (INIS)

    Hung, J.; McKillip, J.; Savin, W.; Magder, S.; Kraus, R.; Houston, N.; Goris, M.; Haskell, W.; DeBusk, R.

    1982-01-01

    The cardiovascular responses to combined static-dynamic effort, postprandial dynamic effort and dynamic effort alone were evaluated by upright bicycle ergometry during equilibrium-gated blood pool scintigraphy in 24 men, mean age 59 +/- 8 years, with chronic ischemic heart disease. Combined static-dynamic effort and the postprandial state elicited a peak cardiovascular response similar to that of dynamic effort alone. Heart rate, intraarterial systolic and diastolic pressures, rate-pressure product and ejection fraction were similar for the three test conditions at the onset of ischemia and at peak effort. The prevalence and extent of exercise-induced ischemic left ventricular dysfunction, ST-segment depression, angina pectoris and ventricular ectopic activity were also similar during the three test conditions. Direct and indirect measurements of systolic and diastolic blood pressure were highly correlated. The onset of ischemic ST-segment depression and angina pectoris correlated as strongly with heart rate alone as with the rate-pressure product during all three test conditions. The cardiovascular response to combined static-dynamic effort and to postprandial dynamic effort becomes more similar to that of dynamic effort alone as dynamic effort reaches a symptom limit. If significant ischemic and arrhythmic abnormalities are absent during symptom-limited dynamic exercise testing, they are unlikely to appear during combined static-dynamic or postprandial dynamic effort

  4. Alternative additives; Alternative additiver

    Energy Technology Data Exchange (ETDEWEB)

    2007-08-15

    In this project a number of industrial and agricultural waste products have been characterised and evaluated in terms of alkali-getter performance. The intended use is for biomass-fired power stations aiming at reducing corrosion or slagging related problems. The following products have been obtained, characterised and evaluated: 1) Brewery draff 2) Danish de-gassed manure 3) Paper sludge 4) Moulding sand 5) Spent bleaching earth 6) Anorthosite 7) Sand 8) Clay-sludge. Most of the above alternative additive candidates are deemed unsuitable due to insufficient chemical effect and/or expensive requirements for pre-treatment (such as drying and transportation). Three products were selected for full-scale testing: de-gassed manure, spent bleaching earth and clay sludge. The full-scale tests were undertaken at the biomass-fired power stations in Koege, Slagelse and Ensted. Spent bleaching earth (SBE) and clay sludge were the only tested additive candidates with a proven ability to react with KCl and thereby reduce Cl concentrations in deposits and reduce the deposit flux to superheater tubes. Their performance was shown to be nearly as good as that of commercial additives. De-gassed manure, however, was not evaluated positively, owing to the inhibiting effect of Ca in the manure. Furthermore, de-gassed manure has a high concentration of heavy metals, which imposes a financial burden with regard to proper disposal of the ash by-products. Clay sludge is a wet clay slurry, and drying and transportation of this product entail substantial costs. Spent bleaching earth does not require much pre-treatment and is therefore the most promising alternative additive. On the other hand, bleaching earth contains residual plant oil, which means that a range of legislation relating to waste combustion comes into play, not least a waste combustion fee of 330 DKK/tonne. For all alternative (and commercial) additives, disposal of the increased amount of ash by-products represents a significant cost. This is

  5. Motor effort alters changes of mind in sensorimotor decision making.

    Directory of Open Access Journals (Sweden)

    Diana Burk

    Full Text Available After committing to an action, a decision-maker can change their mind to revise the action. Such changes of mind can even occur when the stream of information that led to the action is curtailed at movement onset. This is explained by the time delays in sensory processing and motor planning, which lead to a component at the end of the sensory stream that can only be processed after initiation. Such post-initiation processing can explain the pattern of changes of mind by asserting an accumulation of additional evidence to a criterion level, termed the change-of-mind bound. Here we test the hypothesis that the physical effort associated with the movement required to change one's mind affects the level of the change-of-mind bound and the time for post-initiation deliberation. We varied the effort required to change from one choice target to another in a reaching movement by varying the geometry of the choice targets or by applying a force field between the targets. We show that there is a reduction in the frequency of changes of mind when the separation of the choice targets would require a larger excursion of the hand from the initial to the opposite choice. The reduction is best explained by an increase in the evidence required for changes of mind and a reduced time period of integration after the initial decision. Thus the criterion for revising an initial choice is sensitive to energetic costs.
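
    A minimal sketch of the kind of accumulation-to-bound model invoked here, with a post-initiation stage and a change-of-mind bound (all parameter values are illustrative assumptions, not the fitted values from the study):

```python
import numpy as np

rng = np.random.default_rng(1)

def trial(drift, com_bound, post_window, dt=0.001, init_bound=1.0):
    """One trial of a bounded-accumulation model with a change-of-mind stage.

    drift       : signed strength of the sensory evidence
    com_bound   : extra opposing evidence needed to reverse the initial choice
    post_window : seconds of post-initiation processing allowed
    """
    x = 0.0
    while abs(x) < init_bound:                      # deliberation to movement onset
        x += drift * dt + rng.normal(0.0, np.sqrt(dt))
    first = np.sign(x)

    y = 0.0                                         # post-initiation accumulator
    for _ in range(int(post_window / dt)):          # evidence still in the pipeline
        y += drift * dt + rng.normal(0.0, np.sqrt(dt))
        if -first * y > com_bound:                  # enough opposing evidence
            return first, -first                    # -> change of mind
    return first, first

# A higher change-of-mind bound (e.g. a costlier corrective movement) and a
# shorter post-initiation integration window both lower the change-of-mind rate.
n = 500
for com_bound, window in [(0.5, 0.30), (1.0, 0.30), (1.0, 0.15)]:
    changed = sum(a != b for a, b in (trial(0.5, com_bound, window) for _ in range(n)))
    print(f"bound={com_bound:.1f}  window={window:.2f}s  change-of-mind rate={changed / n:.3f}")
```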

  6. Collaborative innovation effort and size in alliances

    DEFF Research Database (Denmark)

    Asikainen, Anna-Leena; Radziwon, Agnieszka

    of organisational and marketing innovations. Additionally, small firms were more likely than large ones to engage in alliances as part of their strategy. On a more general level, our data also confirm that factors such as the number of highly educated employees, foreign ownership of a firm and presence of firm......This study presents a quantitative investigation of the factors that influence the process of forming strategic alliances, with a special focus on the role of innovation strategies and firm size in the alliance-building process. The empirical sample is based on large-scale data from the Community...

  7. Computational methods in metabolic engineering for strain design.

    Science.gov (United States)

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as to suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve runtime performance have also been developed, which allow more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms. Copyright © 2015 Elsevier Ltd. All rights reserved.
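
    Many of the strain-design methods surveyed here are built on constraint-based models solved as linear programs. A toy flux-balance sketch (hypothetical three-metabolite network, not any specific published tool) illustrates the core computation:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 3-metabolite, 5-reaction toy network (not a published model):
#   R1: substrate uptake -> A,  R2: A -> B,  R3: A -> C,
#   R4: B -> biomass,           R5: C -> excreted product
S = np.array([
    [ 1, -1, -1,  0,  0],   # metabolite A
    [ 0,  1,  0, -1,  0],   # metabolite B
    [ 0,  0,  1,  0, -1],   # metabolite C
], dtype=float)

lower = np.zeros(5)
upper = np.full(5, 10.0)    # uptake and all internal fluxes capped at 10

def fba(objective_rxn, knockouts=()):
    """Maximise one reaction's flux at steady state (S v = 0), optionally with
    reactions knocked out by forcing their bounds to zero (deletion strategy)."""
    lo, hi = lower.copy(), upper.copy()
    for rxn in knockouts:
        lo[rxn] = hi[rxn] = 0.0
    c = np.zeros(5)
    c[objective_rxn] = -1.0                       # linprog minimises, so negate
    res = linprog(c, A_eq=S, b_eq=np.zeros(3),
                  bounds=list(zip(lo, hi)), method="highs")
    return -res.fun, res.x

growth, _ = fba(objective_rxn=3)                  # maximise biomass flux (R4)
product, v = fba(objective_rxn=4)                 # maximise product flux (R5)
print(f"max growth flux : {growth:.1f}")
print(f"max product flux: {product:.1f}   flux distribution: {np.round(v, 1)}")
```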

  8. Tailoring hospital marketing efforts to physicians' needs.

    Science.gov (United States)

    Mackay, J M; Lamb, C W

    1988-12-01

    Marketing has become widely recognized as an important component of hospital management (Kotler and Clarke 1987; Ludke, Curry, and Saywell 1983). Physicians are becoming recognized as an important target market that warrants more marketing attention than it has received in the past (Super 1987; Wotruba, Haas, and Hartman 1982). Some experts predict that hospitals will begin focusing more marketing attention on physicians and less on consumers (Super 1986). Much of this attention is likely to take the form of practice management assistance, such as computer-based information system support or consulting services. The survey results reported here are illustrative only of how one hospital addressed the problem of physician need assessment. Other potential target markets include physicians who admit patients only to competitor hospitals and physicians who admit to multiple hospitals. The market might be segmented by individual versus group practice, area of specialization, or possibly even physician practice life cycle stage (Wotruba, Haas, and Hartman 1982). The questions included on the survey and the survey format are likely to be situation-specific. The key is the process, not the procedure. It is important for hospital marketers to recognize that practice management assistance needs will vary among markets (Jensen 1987). Therefore, hospitals must carefully identify their target physician market(s) and survey them about their specific needs before developing and implementing new physician marketing programs. Only then can they be reasonably confident that their marketing programs match their customers' needs.

  9. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and the fault correction process (FCP) with the incorporation of a testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect, such that new faults may be introduced. In this paper, we first show how to incorporate the testing effort function and fault introduction into the FDP and then develop the FCP as a delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
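
    One common way such models are written is as an exponential NHPP driven by a testing-effort function, with correction treated as delayed detection; the functional forms and parameter values below are illustrative only, not the paper's specific formulation:

```python
import numpy as np

# Illustrative parameters only (not fitted values).
a, b = 100.0, 0.02            # initial fault content; detectability per unit effort
alpha, W_max, t_mid = 0.6, 800.0, 10.0   # logistic testing-effort curve
tau = 2.0                     # mean detection-to-correction delay (time units)

def effort(t):
    """Cumulative testing effort W(t); a logistic curve is one common choice."""
    return W_max / (1.0 + np.exp(-alpha * (t - t_mid)))

def detected(t):
    """Expected cumulative detected faults: exponential NHPP driven by W(t)."""
    return a * (1.0 - np.exp(-b * effort(t)))

def corrected(t):
    """A crude stand-in for the delayed correction process: detection shifted
    by the mean correction lag (the paper's FCP is richer than this)."""
    return detected(np.maximum(t - tau, 0.0))

for t in (5, 10, 15, 20, 25):
    print(f"t={t:2d}: W={effort(t):6.1f}  detected={detected(t):5.1f}  corrected={corrected(t):5.1f}")
```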

  10. A new generation in computing

    International Nuclear Information System (INIS)

    Kahn, R.E.

    1983-01-01

    The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics, artificial intelligence, and computer systems and architecture. Applications in industry, offices, aerospace, education, health care and retailing are outlined. An analysis is given of research efforts in the US, Japan, U.K., and Europe. Fifth generation programming languages are detailed

  11. Children’s Sleep and Academic Achievement: The Moderating Role of Effortful Control

    Science.gov (United States)

    Diaz, Anjolii; Berger, Rebecca; Valiente, Carlos; Eisenberg, Nancy; VanSchyndel, Sarah; Tao, Chun; Spinrad, Tracy L.; Doane, Leah D.; Thompson, Marilyn S.; Silva, Kassondra M.; Southworth, Jody

    2016-01-01

    Poor sleep is thought to interfere with children’s learning and academic achievement (AA). However, existing research and theory indicate there are factors that may mitigate the academic risk associated with poor sleep. The purpose of this study was to examine the moderating role of children’s effortful control (EC) on the relation between sleep and AA in young children. One hundred and three 4.5- to 7-year-olds (M = 5.98 years, SD = 0.61) wore a wrist-based actigraph for five continuous weekday nights. Teachers and coders reported on children’s EC. EC was also assessed with a computer-based task at school. Additionally, we obtained a standardized measure of children’s AA. There was a positive main effect of sleep efficiency on AA. Several relations between sleep and AA were moderated by EC, and examination of the simple slopes indicated that the negative relation between sleep and AA was only significant at low levels of EC. PMID:28255190
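
    The moderation analysis described here amounts to a regression with a sleep x EC interaction term followed by simple-slopes probing. A hedged sketch on simulated stand-in data (the data-generating coefficients are arbitrary, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 103                                   # matches the sample size; data simulated

# Simulated, centred stand-ins for the constructs.
sleep = rng.normal(size=n)                # e.g. sleep efficiency (z-scored)
ec    = rng.normal(size=n)                # effortful control (z-scored)
aa    = 0.30 * sleep + 0.40 * ec + 0.25 * sleep * ec + rng.normal(scale=0.8, size=n)

# Moderated regression: AA ~ sleep + EC + sleep:EC
X = np.column_stack([np.ones(n), sleep, ec, sleep * ec])
(b0, b_sleep, b_ec, b_int), *_ = np.linalg.lstsq(X, aa, rcond=None)

# Simple slopes of sleep on AA at low (-1 SD), mean, and high (+1 SD) EC.
for label, level in [("-1 SD", -1.0), (" mean", 0.0), ("+1 SD", 1.0)]:
    print(f"EC at {label}: simple slope of sleep on AA = {b_sleep + b_int * level: .3f}")
```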

  12. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    Full Text Available We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
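
    For reference, the Maassen and Uffink bound that this work generalizes states that, for projective measurements in bases {|a_i>} and {|b_j>},

```latex
H(A) + H(B) \;\geq\; -2\log_2 c,
\qquad
c = \max_{i,j}\,\bigl|\langle a_i \vert b_j \rangle\bigr| ,
```

    and the additivity result means, schematically, that the optimal constant for a pair of local measurements on a multipartite system is the sum of the single-party constants, which is why global uncertainty bounds can be deduced from local ones.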

  13. Identification of efforts required for continued safe operation of KANUPP

    International Nuclear Information System (INIS)

    Ghafoor, M.A.; Hashmi, J.A.; Siddiqui, Z.H.

    1991-01-01

    Kanupp, the first commercial CANDU PHWR, rated at 137 MWe, was built on a turnkey basis by the Canadian General Electric Company for the Pakistan Atomic Energy Commission, and went operational in October, 1972 near Karachi. It has operated since then with a lifetime average availability factor of 51.5% and capacity factor of 25%. In 1976, Kanupp suffered loss of technical support from its original vendors due to the Canadian embargo on export of nuclear technology. Simultaneously, the world experienced the most explosive development and advancement in electronic and computer technology, accelerating the obsolescence of such equipment and systems installed in Kanupp. Replacement upgrading of obsolete computers, control and instrumentation was thus the first major set of efforts realized as essential for continued safe operation. On the other hand, Kanupp was able to cope with the normal maintenance of its process, mechanical and electrical equipment till the late 80's. But now many of these components are reaching the end of their useful life, and developing chronic problems due to ageing, which can only be solved by complete replacement. This is much more difficult for custom-made nuclear process equipment, e.g. the reactor internals and the fuelling machine. Public awareness and international concern about nuclear safety have increased significantly since the TMI and Chernobyl events. Corresponding realization of the critical role of human factors and the importance of operational experience feedback has helped Kanupp by opening international channels of communication, including renewed cooperation on CANDU technology. The safety standards and criteria for CANDU as well as other NPPs have matured and evolved gradually over the past two decades. First Kanupp has to ensure that its present ageing-induced equipment problems are resolved to satisfy the original safety requirements and public risk targets which are still internationally acceptable. But as a policy, we

  14. Identification of efforts required for continued safe operation of KANUPP

    Energy Technology Data Exchange (ETDEWEB)

    Ghafoor, M A; Hashmi, J A; Siddiqui, Z H [Karachi Nuclear Power Plant, Karachi (Pakistan)

    1991-04-01

    Kanupp, the first commercial CANDU PHWR, rated at 137 MWe, was built on a turnkey basis by the Canadian General Electric Company for the Pakistan Atomic Energy Commission, and went operational in October, 1972 near Karachi. It has operated since then with a lifetime average availability factor of 51.5% and capacity factor of 25%. In 1976, Kanupp suffered loss of technical support from its original vendors due to the Canadian embargo on export of nuclear technology. Simultaneously, the world experienced the most explosive development and advancement in electronic and computer technology, accelerating the obsolescence of such equipment and systems installed in Kanupp. Replacement upgrading of obsolete computers, control and instrumentation was thus the first major set of efforts realized as essential for continued safe operation. On the other hand, Kanupp was able to cope with the normal maintenance of its process, mechanical and electrical equipment till the late 80's. But now many of these components are reaching the end of their useful life, and developing chronic problems due to ageing, which can only be solved by complete replacement. This is much more difficult for custom-made nuclear process equipment, e.g. the reactor internals and the fuelling machine. Public awareness and international concern about nuclear safety have increased significantly since the TMI and Chernobyl events. Corresponding realization of the critical role of human factors and the importance of operational experience feedback has helped Kanupp by opening international channels of communication, including renewed cooperation on CANDU technology. The safety standards and criteria for CANDU as well as other NPPs have matured and evolved gradually over the past two decades. First Kanupp has to ensure that its present ageing-induced equipment problems are resolved to satisfy the original safety requirements and public risk targets which are still internationally acceptable. But as a policy, we

  15. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Kantser, V.

    2007-01-01

    Attempts to build a computer that can operate according to the quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review we highlight recent developments which point the way to quantum computing on the basis of solid state nanostructures, after some general considerations concerning quantum information science and introducing a set of basic requirements for any quantum computer proposal. One of the major directions of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address some semiconductor approaches based on spin-orbit coupling in semiconductor nanostructures. (authors)

  16. Campus Grids: Bringing Additional Computational Resources to HEP Researchers

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Bockelman, Brian; Swanson, David

    2012-01-01

    It is common at research institutions to maintain multiple clusters that represent different owners or generations of hardware, or that fulfill different needs and policies. Many of these clusters are consistently underutilized while researchers on campus could greatly benefit from these unused capabilities. By leveraging principles from the Open Science Grid it is now possible to utilize these resources by forming a lightweight campus grid. The campus grid framework enables jobs that are submitted to one cluster to overflow, when necessary, to other clusters within the campus using whatever authentication mechanisms are available on campus. This framework is currently being used on several campuses to run HEP and other science jobs. Further, the framework has in some cases been expanded beyond the campus boundary by bridging campus grids into a regional grid, and can even be used to integrate resources from a national cyberinfrastructure such as the Open Science Grid. This paper will highlight 18 months of operational experience creating campus grids in the US, and the different campus configurations that have successfully utilized the campus grid infrastructure.

  17. Additive manufacturing of metals

    International Nuclear Information System (INIS)

    Herzog, Dirk; Seyda, Vanessa; Wycisk, Eric; Emmelmann, Claus

    2016-01-01

    Additive Manufacturing (AM), the layer-by layer build-up of parts, has lately become an option for serial production. Today, several metallic materials including the important engineering materials steel, aluminium and titanium may be processed to full dense parts with outstanding properties. In this context, the present overview article describes the complex relationship between AM processes, microstructure and resulting properties for metals. It explains the fundamentals of Laser Beam Melting, Electron Beam Melting and Laser Metal Deposition, and introduces the commercially available materials for the different processes. Thereafter, typical microstructures for additively manufactured steel, aluminium and titanium are presented. Special attention is paid to AM specific grain structures, resulting from the complex thermal cycle and high cooling rates. The properties evolving as a consequence of the microstructure are elaborated under static and dynamic loading. According to these properties, typical applications are presented for the materials and methods for conclusion.

  18. Additive manufactured serialization

    Science.gov (United States)

    Bobbitt, III, John T.

    2017-04-18

    Methods for forming an identifying mark in a structure are described. The method is used in conjunction with an additive manufacturing method and includes the alteration of a process parameter during the manufacturing process. The method can form a unique identifying mark within or on the surface of a structure that is virtually impossible to replicate. Methods can provide a high level of confidence that the identifying mark will remain unaltered on the formed structure.

  19. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  20. A Composite Contract for Coordinating a Supply Chain with Price and Effort Dependent Stochastic Demand

    Directory of Open Access Journals (Sweden)

    Yu-Shuang Liu

    2016-01-01

    Full Text Available As the demand is more sensitive to price and sales effort, this paper investigates the issue of channel coordination for a supply chain with one manufacturer and one retailer facing price and effort dependent stochastic demand. A composite contract based on quantity-restricted returns and a target sales rebate can achieve coordination in this setting. Two main problems are addressed: (1) how to coordinate the decentralized supply chain; (2) how to determine the optimal sales effort level, pricing, and inventory decisions under the additive demand case. Numerical examples are presented to verify the effectiveness of the combined contract in supply chain coordination and to highlight model sensitivities to parametric changes.
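
    In the additive demand case referred to above, uncertainty enters the demand function as an additive noise term; a common specification (the notation here is illustrative, not necessarily the paper's) is

```latex
D(p, e) = y(p, e) + \varepsilon,
\qquad
y(p, e) = a - b\,p + k\,e, \quad a, b, k > 0,
```

    where p is the retail price, e the sales-effort level, and \varepsilon a zero-mean random term, so that price depresses and sales effort stimulates expected demand.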

  1. MACCS2 development and verification efforts

    International Nuclear Information System (INIS)

    Young, M.; Chanin, D.

    1997-01-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May, 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses
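
    Enhancement (1) in the list above is a table lookup of plume expansion parameters versus downwind distance. As a generic illustration only (the distances and sigma values below are placeholders, and this is not MACCS2's input format), such a lookup reduces to simple interpolation:

```python
import numpy as np

# Hypothetical table of plume expansion parameters (m) at incremental
# downwind distances (km); the numbers are placeholders, not MACCS2 data.
distance_km = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
sigma_y_m   = np.array([40.0, 75.0, 140.0, 300.0, 550.0])
sigma_z_m   = np.array([20.0, 32.0,  55.0, 110.0, 190.0])

def plume_sigmas(x_km):
    """Linear table lookup of (sigma_y, sigma_z) at downwind distance x_km."""
    return (np.interp(x_km, distance_km, sigma_y_m),
            np.interp(x_km, distance_km, sigma_z_m))

print(plume_sigmas(3.0))    # interpolated (sigma_y, sigma_z) at 3 km downwind
```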

  2. [Trends in the utilization of food additives].

    Science.gov (United States)

    Szűcs, Viktória; Bánáti, Diána

    2013-11-17

    The frequent media reports on food additives have weakened consumers' trust in food producers and in food control authorities as well. Furthermore, consumers' uncertainty is also raised by the fact that they obtain their information from inadequate, unreliable sources and, therefore, might avoid the consumption of certain foodstuffs. While food producers may react by replacing artificial components with natural ones, they also try to emphasize the favourable characteristics of their products. The authors describe the main trends and efforts related to food additives. On the basis of this overview it can be concluded that, besides taking into consideration consumers' needs, product development and research directions are promising. Food producers' efforts may help to restore consumer confidence and trust, and they may help consumers to make informed choices.

  3. RENEGOTIATION MINING CONTRACT: LEGAL PARADIGM RECONSTRUCTION EFFORTS

    Directory of Open Access Journals (Sweden)

    Marilang -

    2014-07-01

    Full Text Available Renegotiation of mining contracts is not an a priori notion; it is driven by the empirical fact that the Work Contracts (KK) and the coal mining concession Work Agreements (CCA) in force to date have yielded profits that are not shared proportionately between the state and investors (domestic and foreign). In addition, Law No. 4 of 2009 on Mineral and Coal Mining (Minerba), through Article 169, stipulates that although existing mining contracts are still respected until they expire, the Government must, where the implementation of these contracts gives rise to distortions of the national interest, encourage the investors to renegotiate the existing contracts so that they comply with the Minerba legislation within a period of one year from its enactment. Renegotiation of mining contracts is, in essence, an effort to reconstruct the ruling paradigm; with that paradigm shift, both parties can reach common ground for their mutual benefit, so that Indonesia does not suffer losses on the one hand, while the benefits to domestic and foreign investors remain within reasonable limits on the other.

  4. Colombia: crusading efforts bring signs of progress.

    Science.gov (United States)

    Kendall, S

    1989-01-01

    Colombia, like many developing countries, has not committed resources to fight the AIDS problem. The media have been used for condom promotion and for education about other sexually transmitted diseases. There had been 151 deaths caused by AIDS by the end of 1988; 344 cases were known, and a further 130 people had tested positive for the virus. Health officials were reluctant to recognize the problem, thinking it was outside their country and that they would not be affected by it. Since then, they have tried to target high-risk groups, educate them, and assist with testing and counseling. There is a move to make the new drug zidovudine available, but few could afford its high price. The authorities have put transvestite prostitutes in jail and kept them for AIDS testing, but few women prostitutes have been tested. Up until 1986, only 30% of the Red Cross blood bank supplies were being tested; now 80% are, although this comprises only about 40% of the total supply. Drugs are used heavily in Colombia, but mostly smoked, yet there is some concern about increased use of needles. The majority of cases in Colombia have been homosexual and bisexual men, but prostitution among men and women is prevalent in large cities such as Bogota. Health officials state that education is the best deterrent, but it must be sustained so that people are constantly reminded.

  5. Sustainability Characterization for Additive Manufacturing.

    Science.gov (United States)

    Mani, Mahesh; Lyons, Kevin W; Gupta, S K

    2014-01-01

    Additive manufacturing (AM) has the potential to create geometrically complex parts that require a high degree of customization, using less material and producing less waste. Recent studies have shown that AM can be an economically viable option for use by the industry, yet there are some inherent challenges associated with AM for wider acceptance. The lack of standards in AM impedes its use for parts production since industries primarily depend on established standards in processes and material selection to ensure the consistency and quality. Inability to compare AM performance against traditional manufacturing methods can be a barrier for implementing AM processes. AM process sustainability has become a driver due to growing environmental concerns for manufacturing. This has reinforced the importance to understand and characterize AM processes for sustainability. Process characterization for sustainability will help close the gaps for comparing AM performance to traditional manufacturing methods. Based on a literature review, this paper first examines the potential environmental impacts of AM. A methodology for sustainability characterization of AM is then proposed to serve as a resource for the community to benchmark AM processes for sustainability. Next, research perspectives are discussed along with relevant standardization efforts.

  6. Closeout of JOYO-1 Specimen Fabrication Efforts

    International Nuclear Information System (INIS)

    ME Petrichek; JL Bump; RF Luther

    2005-01-01

    Fabrication was well under way for the JOYO biaxial creep and tensile specimens when the NR Space program was canceled. Tubes of FS-85, ASTAR-811C, and T-111 for biaxial creep specimens had been drawn at True Tube (Paso Robles, CA), while tubes of Mo-47.5 Re were being drawn at Rhenium Alloys (Cleveland, OH). The Mo-47.5 Re tubes are now approximately 95% complete. Their fabrication and the quantities produced will be documented at a later date. End cap material for FS-85, ASTAR-811C, and T-111 had been swaged at Pittsburgh Materials Technology, Inc. (PMTI) (Large, PA) and machined at Vangura (Clairton, PA). Cutting of tubes, pickling, annealing, and laser engraving were in process at PMTI. Several biaxial creep specimen sets of FS-85, ASTAR-811C, and T-111 had already been sent to Pacific Northwest National Laboratory (PNNL) for weld development. In addition, tensile specimens of FS-85, ASTAR-811C, T-111, and Mo-47.5 Re had been machined at Kin-Tech (North Huntington, PA). Actual machining of the other specimen types had not been initiated. Flowcharts 1-3 detail the major processing steps each piece of material has experienced. A more detailed description of processing will be provided in a separate document [B-MT(SRME)-51]. Table 1 lists the in-process materials and finished specimens. Also included are current metallurgical condition of these materials and specimens. The available chemical analyses for these alloys at various points in the process are provided in Table 2

  7. BWR zinc addition Sourcebook

    International Nuclear Information System (INIS)

    Garcia, Susan E.; Giannelli, Joseph F.; Jarvis, Alfred J.

    2014-01-01

    Boiling Water Reactors (BWRs) have been injecting zinc into the primary coolant via the reactor feedwater system for over 25 years for the purpose of controlling primary system radiation fields. The BWR zinc injection process has evolved since the initial application at the Hope Creek Nuclear Station in 1986. Key transitions were from the original natural zinc oxide (NZO) to depleted zinc oxide (DZO), and from active zinc injection of a powdered zinc oxide slurry (pumped systems) to passive injection systems (zinc pellet beds). Zinc addition has continued through various chemistry regimes changes, from normal water chemistry (NWC) to hydrogen water chemistry (HWC) and HWC with noble metals (NobleChem™) for mitigation of intergranular stress corrosion cracking (IGSCC) of reactor internals and primary system piping. While past reports published by the Electric Power Research Institute (EPRI) document specific industry experience related to these topics, the Zinc Sourcebook was prepared to consolidate all of the experience gained over the past 25 years. The Zinc Sourcebook will benefit experienced BWR Chemistry, Operations, Radiation Protection and Engineering personnel as well as new people entering the nuclear power industry. While all North American BWRs implement feedwater zinc injection, a number of other BWRs do not inject zinc. This Sourcebook will also be a valuable resource to plants considering the benefits of zinc addition process implementation, and to gain insights on industry experience related to zinc process control and best practices. This paper presents some of the highlights from the Sourcebook. (author)

  8. The effects of savings on reservation wages and search effort

    NARCIS (Netherlands)

    Lammers, M.

    2014-01-01

    This paper discusses the interrelations among wealth, reservation wages and search effort. A theoretical job search model predicts wealth to affect reservation wages positively, and search effort negatively. Subsequently, reduced form equations for reservation wages and search intensity take these

  9. Effort levels of the partners in networked manufacturing

    Science.gov (United States)

    Chai, G. R.; Cai, Z.; Su, Y. N.; Zong, S. L.; Zhai, G. Y.; Jia, J. H.

    2017-08-01

    Compared with the traditional manufacturing mode, could networked manufacturing improve the effort levels of the partners? What factors affect the effort levels of the partners? How can the partners be encouraged to improve their effort levels? To answer these questions, we introduce a network effect coefficient to build an effort-level model of the partners in networked manufacturing. The results show that (1) with the increase of the network effect in networked manufacturing, the actual effort level can exceed the ideal level of traditional manufacturing; (2) profit allocation based on the marginal contribution rate would help improve the effort levels of the partners in networked manufacturing; (3) partners in networked manufacturing who wish to have a larger distribution ratio must make a higher effort, and enterprises with insufficient effort should be terminated in networked manufacturing.

  10. Perceived effort for motor control and decision-making.

    Directory of Open Access Journals (Sweden)

    Ignasi Cos

    2017-08-01

    Full Text Available How effort is internally quantified and how it influences both movement generation and decisions between potential movements are 2 difficult questions to answer. Physical costs are known to influence motor control and decision-making, yet we lack a general, principled characterization of how the perception of effort operates across tasks and conditions. Morel and colleagues introduce an insightful approach to that end, assessing effort indifference points and presenting a quadratic law between perceived effort and force production.
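
    Schematically, a quadratic law of this kind can be written as

```latex
E(F) \;=\; \alpha + \beta F^{2}, \qquad \beta > 0,
```

    where E is perceived effort and F the produced force; the actual coefficients, and any dependence on movement duration, are established in the original study rather than assumed here.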

  11. SU-F-J-219: Predicting Ventilation Change Due to Radiation Therapy: Dependency On Pre-RT Ventilation and Effort Correction

    Energy Technology Data Exchange (ETDEWEB)

    Patton, T; Du, K; Bayouth, J [University of Wisconsin, Madison, WI (United States); Christensen, G; Reinhardt, J [University of Iowa, Iowa City, IA (United States)

    2016-06-15

    Purpose: Ventilation change caused by radiation therapy (RT) can be predicted using four-dimensional computed tomography (4DCT) and image registration. This study tested the dependency of predicted post-RT ventilation on effort correction and pre-RT lung function. Methods: Pre-RT and 3 month post-RT 4DCT images were obtained for 13 patients. The 4DCT images were used to create ventilation maps using a deformable image registration based Jacobian expansion calculation. The post-RT ventilation maps were predicted in four different ways using the dose delivered, pre-RT ventilation, and effort correction. The pre-RT ventilation and effort correction were toggled to determine dependency. The four different predicted ventilation maps were compared to the post-RT ventilation map calculated from image registration to establish the best prediction method. Gamma pass rates were used to compare the different maps with the criteria of 2 mm distance-to-agreement and 6% ventilation difference. Paired t-tests of gamma pass rates were used to determine significant differences between the maps. Additional gamma pass rates were calculated using only voxels receiving over 20 Gy. Results: The predicted post-RT ventilation maps were in agreement with the actual post-RT maps in the following percentage of voxels averaged over all subjects: 71% with pre-RT ventilation and effort correction, 69% with no pre-RT ventilation and effort correction, 60% with pre-RT ventilation and no effort correction, and 58% with no pre-RT ventilation and no effort correction. When analyzing only voxels receiving over 20 Gy, the gamma pass rates were respectively 74%, 69%, 65%, and 55%. The prediction including both pre-RT ventilation and effort correction was the only prediction with significant improvement over using no prediction (p<0.02). Conclusion: Post-RT ventilation is best predicted using both pre-RT ventilation and effort correction. This is the only prediction that provided a significant
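
    The Jacobian expansion calculation mentioned above reduces, per voxel, to the determinant of the deformation gradient obtained from the registration displacement field. A hedged sketch (the array layout and sign convention are assumptions of this example, not the study's pipeline):

```python
import numpy as np

def jacobian_ventilation(u):
    """Voxel-wise specific-volume change from a registration displacement field.

    u : displacement field of shape (3, nz, ny, nx) in voxel units.
    Returns det(F) - 1 with F = I + grad(u); positive values indicate local
    expansion, the usual ventilation surrogate.
    """
    grads = [np.gradient(u[i]) for i in range(3)]   # d u_i / d axis_j
    nz, ny, nx = u.shape[1:]
    F = np.zeros((nz, ny, nx, 3, 3))
    for i in range(3):
        for j in range(3):
            F[..., i, j] = (i == j) + grads[i][j]
    return np.linalg.det(F) - 1.0

# Tiny synthetic check: a uniform 1% stretch along the first axis.
shape = (8, 8, 8)
zz = np.arange(shape[0], dtype=float).reshape(-1, 1, 1) * np.ones(shape)
u = np.stack([0.01 * zz, np.zeros(shape), np.zeros(shape)])
print(jacobian_ventilation(u).mean())   # ~0.01
```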

  12. Efforts to Consolidate Chalcogels with Adsorbed Iodine

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Brian J.; Pierce, David A.; Chun, Jaehun

    2013-08-28

    This document discusses ongoing work with non-oxide aerogels, called chalcogels, that are under development at the Pacific Northwest National Laboratory as sorbents for gaseous iodine. Work was conducted in fiscal year 2012 to demonstrate the feasibility of converting Sn2S3 chalcogel without iodine into a glass. This current document summarizes the work conducted in fiscal year 2013 to assess the consolidation potential of non-oxide aerogels with adsorbed iodine. The Sn2S3 and Sb13.5Sn5S20 chalcogels were selected for study. The first step in the process for these experiments was to load them with iodine (I2). The I2 uptake was ~68 mass% for Sn2S3 and ~50 mass% for Sb13.5Sn5S20 chalcogels. X-ray diffraction (XRD) of both sets of sorbents showed that metal-iodide complexes were formed during adsorption, i.e., SnI4 for Sn2S3 and SbI3 for Sb13.5Sn5S20. Additionally, metal-sulfide-iodide complexes were formed, i.e., SnSI for Sn2S3 and SbSI for Sb13.5Sn5S20. No XRD evidence for unreacted iodine was found in any of these samples. Once the chalcogels had reached maximum adsorption, the consolidation potential was assessed. Here, the sorbents were heated for consolidation in vacuum-sealed quartz vessels. The Sb13.5Sn5S20 chalcogel was heated both (1) in a glassy carbon crucible within a fused quartz tube and (2) in a single-containment fused quartz tube. The Sn2S3 chalcogel was only heated in a single-containment fused quartz tube. In both cases with the single-containment fused quartz experiments, the material consolidated nicely. However, in both cases, there were small fractions of metal iodides not incorporated into the final product as well as fused quartz particles within the melt due to the sample attacking the quartz wall during the heat treatment. The Sb13.5Sn5S20 did not appear to attack the glassy carbon crucible so, for future experiments, it would be ideal to apply a coating, such as pyrolytic graphite, to the inner walls of the fused quartz vessel to prevent

  13. Joint Efforts Towards European HF Radar Integration

    Science.gov (United States)

    Rubio, A.; Mader, J.; Griffa, A.; Mantovani, C.; Corgnati, L.; Novellino, A.; Schulz-Stellenfleth, J.; Quentin, C.; Wyatt, L.; Ruiz, M. I.; Lorente, P.; Hartnett, M.; Gorringe, P.

    2016-12-01

    During the past two years, significant steps have been made in Europe for achieving the needed accessibility to High Frequency Radar (HFR) data for a pan-European use. Since 2015, EuroGOOS Ocean Observing Task Teams (TT), such as HFR TT, are operational networks of observing platforms. The main goal is on the harmonization of systems requirements, systems design, data quality, improvement and proof of the readiness and standardization of HFR data access and tools. Particular attention is being paid by HFR TT to converge from different projects and programs toward those common objectives. First, JERICO-NEXT (Joint European Research Infrastructure network for Coastal Observatory - Novel European eXpertise for coastal observaTories, H2020 2015 Programme) will contribute on describing the status of the European network, on seeking harmonization through exchange of best practices and standardization, on developing and giving access to quality control procedures and new products, and finally on demonstrating the use of such technology in the general scientific strategy focused by the Coastal Observatory. Then, EMODnet (European Marine Observation and Data Network) Physics started to assemble HF radar metadata and data products within Europe in a uniform way. This long term program is providing a combined array of services and functionalities to users for obtaining free of charge data, meta-data and data products on the physical conditions of European sea basins and oceans. Additionally, the Copernicus Marine Environment Monitoring Service (CMEMS) delivers from 2015 a core information service to any user related to 4 areas of benefits: Maritime Safety, Coastal and Marine Environment, Marine Resources, and Weather, Seasonal Forecasting and Climate activities. INCREASE (Innovation and Networking for the integration of Coastal Radars into EuropeAn marine SErvices - CMEMS Service Evolution 2016) will set the necessary developments towards the integration of existing European

  14. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... path. A special computer program processes this large volume of data to create two-dimensional cross-sectional ... time, resulting in more detail and additional view capabilities. Modern CT scanners are so fast that they ...

  15. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... path. A special computer program processes this large volume of data to create two-dimensional cross-sectional ... time, resulting in more detail and additional view capabilities. Modern CT scanners are so fast that they ...

  16. Zero Effort Technologies Considerations, Challenges, and Use in Health, Wellness, and Rehabilitation

    CERN Document Server

    Mihailidis, Alex; Hoey, Jesse

    2011-01-01

    This book introduces zero-effort technologies (ZETs), an emerging class of technology that requires little or no effort from the people who use it. ZETs use advanced techniques, such as computer vision, sensor fusion, decision-making and planning, and machine learning to autonomously operate through the collection, analysis, and application of data about the user and his/her context. This book gives an overview of ZETs, presents concepts in the development of pervasive intelligent technologies and environments for health and rehabilitation, along with an in-depth discussion of the design princ

  17. Goal Setting and Expectancy Theory Predictions of Effort and Performance.

    Science.gov (United States)

    Dossett, Dennis L.; Luce, Helen E.

    Neither expectancy (VIE) theory nor goal setting alone is an effective determinant of individual effort and task performance. To test the combined ability of VIE and goal setting to predict effort and performance, 44 real estate agents and their managers completed questionnaires. Quarterly income goals predicted managers' ratings of agents' effort,…

  18. Sewage sludge additive

    Science.gov (United States)

    Kalvinskas, J. J.; Mueller, W. A.; Ingham, J. D. (Inventor)

    1980-01-01

    The additive is for a raw sewage treatment process of the type in which settling tanks are used to allow the suspended matter in the raw sewage to settle and to permit adsorption of the dissolved contaminants in the sewage water. The sludge, which settles to the bottom of the settling tank, is extracted, pyrolyzed and activated to form activated carbon and ash, which is mixed with the sewage prior to its introduction into the settling tank. The sludge does not provide all of the activated carbon and ash required for adequate treatment of the raw sewage. It is therefore necessary to add carbon to the process, and instead of expensive commercial carbon, coal is used to provide the carbon supplement.

  19. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer groups has been reorganized to take charge for the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  20. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  1. Parallel quantum computing in a single ensemble quantum computer

    International Nuclear Information System (INIS)

    Long Guilu; Xiao, L.

    2004-01-01

    We propose a parallel quantum computing mode for ensemble quantum computers. In this mode, some qubits are in pure states while other qubits are in mixed states. It enables a single ensemble quantum computer to perform 'single-instruction multiple-data' (SIMD) parallel computation. Parallel quantum computing can provide additional speedup in Grover's algorithm and Shor's algorithm. In addition, it also makes fuller use of qubit resources in an ensemble quantum computer. As a result, some qubits discarded in the preparation of an effective pure state in the Schulman-Vazirani and the Cleve-DiVincenzo algorithms can be reutilized

  2. Additive manufacturing: From implants to organs

    African Journals Online (AJOL)

    Additive manufacturing (AM) constructs 3D objects layer by layer under computer control from 3D models. 3D printing is one ... anatomical models for surgery planning, and design and construction ... production of implants, particularly to replace bony structures, is ... Manufactured organs are, however, an elusive goal.

  3. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  4. Chair Talk: Resources to Maximize Administrative Efforts

    Science.gov (United States)

    MacDonald, H.; Chan, M. A.; Bierly, E. W.; Manduca, C. A.; Ormand, C. J.

    2009-12-01

    , dealing with difficult situations, working with alumni). Through the Association for Women Geoscientists, we have offered annual one-hour lunch discussions at AGU and GSA meetings on issues facing women chairs and deans. Focusing on a different topic each year, these discussions include sharing good solutions, problem solving on various case scenarios, and so forth. In addition, the Building Strong Geoscience Departments program has offered workshops on different aspects of building strong geoscience departments, distributed reports, and made a variety of materials that would be useful to geoscience chairs available on their website. These programs and resources should continue and build to provide more continuity within departments and to increase a broader experience base of faculty. One of the greatest resources for chairs is to have personal connections with other chairs (via these programs), who can be called upon for advice, ideas, or general support. The sense of collective community could act in a powerful way to inspire and encourage more innovations and creative solutions to promote stronger departments.

  5. Site Protection Efforts at the AURA Observatory in Chile

    Science.gov (United States)

    Smith, R. Chris; Smith, Malcolm G.; Sanhueza, Pedro

    2015-08-01

    The AURA Observatory (AURA-O) was the first of the major international observatories to be established in northern Chile to exploit the optimal astronomical conditions available there. The site was originally established in 1962 to host the Cerro Tololo Inter-American Observatory (CTIO). It now hosts more than 20 operational telescopes, including some of the leading U.S. and international astronomical facilities in the southern hemisphere, such as the Blanco 4m telescope on Cerro Tololo and the Gemini-South and SOAR telescopes on Cerro Pachón. Construction of the next generation facility, the Large Synoptic Survey Telescope (LSST), has recently begun on Cerro Pachón, while additional smaller telescopes continue to be added to the complement on Cerro Tololo.While the site has become a major platform for international astronomical facilities over the last 50 years, development in the region has led to an ever-increasing threat of light pollution around the site. AURA-O has worked closely with local, regional, and national authorities and institutions (in particular with the Chilean Ministries of Environment and Foreign Relations) in an effort to protect the site so that future generations of telescopes, as well as future generations of Chileans, can benefit from the dark skies in the region. We will summarize our efforts over the past 15 years to highlight the importance of dark sky protection through education and public outreach as well as through more recent promotion of IDA certifications in the region and support for the World Heritage initiatives described by others in this conference.

  6. Military efforts in nanosensors, 3D printing, and imaging detection

    Science.gov (United States)

    Edwards, Eugene; Booth, Janice C.; Roberts, J. Keith; Brantley, Christina L.; Crutcher, Sihon H.; Whitley, Michael; Kranz, Michael; Seif, Mohamed; Ruffin, Paul

    2017-04-01

    A team of researchers and support organizations, affiliated with the Army Aviation and Missile Research, Development, and Engineering Center (AMRDEC), has initiated multidiscipline efforts to develop nano-based structures and components for advanced weaponry, aviation, and autonomous air/ground systems applications. The main objective of this research is to exploit unique phenomena for the development of novel technology to enhance warfighter capabilities and produce precision weaponry. The key technology areas that the authors are exploring include nano-based sensors, analysis of 3D printing constituents, and nano-based components for imaging detection. By integrating nano-based devices, structures, and materials into weaponry, the Army can revolutionize existing (and future) weaponry systems by significantly reducing the size, weight, and cost. The major research thrust areas include the development of carbon nanotube sensors to detect rocket motor off-gassing; the application of current methodologies to assess materials used for 3D printing; and the assessment of components to improve imaging seekers. The status of current activities, associated with these key areas and their implementation into AMRDEC's research, is outlined in this paper. Section #2 outlines output data, graphs, and overall evaluations of carbon nanotube sensors placed on a 16 element chip and exposed to various environmental conditions. Section #3 summarizes the experimental results of testing various materials and resulting components that are supplementary to additive manufacturing/fused deposition modeling (FDM). Section #4 recapitulates a preliminary assessment of the optical and electromechanical components of seekers in an effort to propose components and materials that can work more effectively.

  7. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  8. Additive lattice kirigami.

    Science.gov (United States)

    Castle, Toen; Sussman, Daniel M; Tanis, Michael; Kamien, Randall D

    2016-09-01

    Kirigami uses bending, folding, cutting, and pasting to create complex three-dimensional (3D) structures from a flat sheet. In the case of lattice kirigami, this cutting and rejoining introduces defects into an underlying 2D lattice in the form of points of nonzero Gaussian curvature. A set of simple rules was previously used to generate a wide variety of stepped structures; we now pare back these rules to their minimum. This allows us to describe a set of techniques that unify a wide variety of cut-and-paste actions under the rubric of lattice kirigami, including adding new material and rejoining material across arbitrary cuts in the sheet. We also explore the use of more complex lattices and the different structures that consequently arise. Regardless of the choice of lattice, creating complex structures may require multiple overlapping kirigami cuts, where subsequent cuts are not performed on a locally flat lattice. Our additive kirigami method describes such cuts, providing a simple methodology and a set of techniques to build a huge variety of complex 3D shapes.

  9. Additive Manufactured Superconducting Cavities

    Science.gov (United States)

    Holland, Eric; Rosen, Yaniv; Woolleet, Nathan; Materise, Nicholas; Voisin, Thomas; Wang, Morris; Mireles, Jorge; Carosi, Gianpaolo; Dubois, Jonathan

    Superconducting radio frequency cavities provide an ultra-low dissipative environment, which has enabled fundamental investigations in quantum mechanics, materials properties, and the search for new particles in and beyond the standard model. However, resonator designs are constrained by limitations in conventional machining techniques. For example, current through a seam is a limiting factor in performance for many waveguide cavities. Development of highly reproducible methods for metallic parts through additive manufacturing, referred to colloquially as "3D printing", opens the possibility for novel cavity designs which cannot be implemented through conventional methods. We present preliminary investigations of superconducting cavities made through a selective laser melting process, which compacts a granular powder via a high-power laser according to a digitally defined geometry. Initial work suggests that assuming a loss model and numerically optimizing a geometry to minimize dissipation results in modest improvements in device performance. Furthermore, a subset of titanium alloys, particularly a titanium-aluminum-vanadium alloy (Ti-6Al-4V), exhibits properties indicative of a high kinetic inductance material. This work is supported by LDRD 16-SI-004.

  10. Present state of the SOURCES computer code

    International Nuclear Information System (INIS)

    Shores, Erik F.

    2002-01-01

    In various stages of development for over two decades, the SOURCES computer code continues to calculate neutron production rates and spectra from four types of problems: homogeneous media, two-region interfaces, three-region interfaces and that of a monoenergetic alpha particle beam incident on a slab of target material. Graduate work at the University of Missouri - Rolla, in addition to user feedback from a tutorial course, provided the impetus for a variety of code improvements. Recently upgraded to version 4B, initial modifications to SOURCES focused on updates to the 'tape5' decay data library. Shortly thereafter, efforts focused on development of a graphical user interface for the code. This paper documents the Los Alamos SOURCES Tape1 Creator and Library Link (LASTCALL) and describes additional library modifications in more detail. Minor improvements and planned enhancements are discussed.

  11. Preparing Future Secondary Computer Science Educators

    Science.gov (United States)

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  12. Computers in Schools: White Boys Only?

    Science.gov (United States)

    Hammett, Roberta F.

    1997-01-01

    Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)

  13. Neurodynamic evaluation of hearing aid features using EEG correlates of listening effort.

    Science.gov (United States)

    Bernarding, Corinna; Strauss, Daniel J; Hannemann, Ronny; Seidler, Harald; Corona-Strauss, Farah I

    2017-06-01

    In this study, we propose a novel estimate of listening effort using electroencephalographic data. This method is a translation of our past findings, gained from the evoked electroencephalographic activity, to the oscillatory EEG activity. To test this technique, electroencephalographic data from experienced hearing aid users with moderate hearing loss were recorded, wearing hearing aids. The investigated hearing aid settings were: a directional microphone combined with a noise reduction algorithm in a medium and a strong setting, the noise reduction setting turned off, and a setting using omnidirectional microphones without any noise reduction. The results suggest that the electroencephalographic estimate of listening effort seems to be a useful tool to map the exerted effort of the participants. In addition, the results indicate that a directional processing mode can reduce the listening effort in multitalker listening situations.

  14. Numerical computations with GPUs

    CERN Document Server

    Kindratenko, Volodymyr

    2014-01-01

    This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including solution of linear equations and FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to

  15. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  16. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  17. Context-dependent memory decay is evidence of effort minimization in motor learning: a computational study

    OpenAIRE

    Takiyama, Ken

    2015-01-01

    Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this t...
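
    The two processes described here, error minimization and trial-by-trial memory decay, are often written as a linear state-space model of adaptation. The sketch below is a minimal illustration of that generic model, not the specific context-dependent model of the paper; the retention factor, learning rate, and perturbation size are assumed values chosen for demonstration.

```python
import numpy as np

# Minimal state-space model of trial-by-trial motor adaptation:
#   x[t+1] = a * x[t] + b * e[t],   e[t] = p - x[t]
# where x is the motor memory, e the movement error, p the perturbation,
# a < 1 a retention (memory decay) factor, and b a learning rate.
# Parameter values are illustrative, not taken from the paper.

a, b = 0.99, 0.1      # retention factor and error-based learning rate
p = 30.0              # constant perturbation (e.g., degrees of visuomotor rotation)
n_trials = 200

x = np.zeros(n_trials + 1)        # motor memory across trials
for t in range(n_trials):
    e = p - x[t]                  # movement error on trial t
    x[t + 1] = a * x[t] + b * e   # partial forgetting plus error correction

# Steady-state memory predicted by this model: b*p / (1 - a + b)
print("final memory:", x[-1])
print("predicted asymptote:", b * p / (1 - a + b))
```

    Because a is below one, the memory never fully reaches the perturbation; that gap is the decay-induced asymptote the two-process account predicts.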

  18. A Review of Computational Spinal Injury Biomechanics Research and Recommendations for Future Efforts

    Science.gov (United States)

    2011-09-01

    Hongo et al. found high compressive and tensile strains at the base of the pedicle of T10, L1, and L4, indicating that the base of the pedicle is the... fracture process using a combined experimental and finite element approach. European Spine Journal 2004, 13, 481-488. Hongo, M.; Abe, E.; Shimada...

  19. Computational Intelligence in Software Cost Estimation: Evolving Conditional Sets of Effort Value Ranges

    OpenAIRE

    Papatheocharous, Efi; Andreou, Andreas S.

    2008-01-01

    In this approach we aimed at addressing the problem of large variances found in available historical data that are used in software cost estimation. Project data is expensive to collect, manage and maintain. Therefore, if we wish to lower the dependence of the estimation to

  20. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  1. Effort and accuracy during language resource generation: a pronunciation prediction case study

    CSIR Research Space (South Africa)

    Davel, M

    2006-11-01

    Full Text Available pronunciation dictionary as case study. We show that the amount of effort required to validate a 20,000-word pronunciation dictionary can be reduced substantially by employing appropriate computational tools, when compared to both a fully manual validation... and correcting errors found, and finally, manually verifying a further portion of the resource in order to estimate its current accuracy. We apply this general approach to the task of developing pronunciation dictionaries. We demonstrate how the validation...
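
    The last step mentioned above, manually verifying a further portion of the resource to estimate its current accuracy, can be framed as estimating a proportion from a random sample. The following sketch is not from the paper; it uses a Wilson score interval to show how a modest verification sample bounds the accuracy of a large dictionary, with the sample size and error count invented for illustration.

```python
import math

def wilson_interval(correct: int, n: int, z: float = 1.96):
    """Wilson score confidence interval for a proportion (default ~95%)."""
    p_hat = correct / n
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical verification pass: 1,000 entries sampled at random from a
# 20,000-word dictionary, 978 of them judged correct.
sample_size, correct = 1000, 978
low, high = wilson_interval(correct, sample_size)
print(f"estimated accuracy: {correct / sample_size:.1%}")
print(f"95% interval: [{low:.1%}, {high:.1%}]")
```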

  2. Solving Ratio-Dependent Predator-Prey System with Constant Effort Harvesting Using Homotopy Perturbation Method

    Directory of Open Access Journals (Sweden)

    Abdoul R. Ghotbi

    2008-01-01

    Full Text Available Due to the wide range of interest in the use of bioeconomic models to gain insight into the scientific management of renewable resources like fisheries and forestry, the homotopy perturbation method is employed to approximate the solution of the ratio-dependent predator-prey system with constant effort prey harvesting. The results are compared with those obtained by the Adomian decomposition method, and show that, in the new model, fewer computations are needed than with the Adomian decomposition method.
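
    For readers who want a numerical reference solution against which to compare a series approximation such as the homotopy perturbation or Adomian decomposition result, the sketch below integrates one common form of the ratio-dependent predator-prey system with constant-effort prey harvesting. The exact equations and parameter values of the paper may differ; the functional form, coefficients, and harvesting effort E here are assumptions for illustration only.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One common ratio-dependent predator-prey model with constant-effort
# prey harvesting (all parameters are illustrative, not from the paper):
#   dx/dt = r*x*(1 - x/K) - a*x*y/(x + y) - E*x     (prey, harvested at effort E)
#   dy/dt = -d*y + b*x*y/(x + y)                    (predator)
r, K, a, b, d, E = 1.0, 1.0, 0.8, 0.6, 0.3, 0.2

def rhs(t, z):
    x, y = z
    ratio = x * y / (x + y) if (x + y) > 1e-12 else 0.0
    return [r * x * (1 - x / K) - a * ratio - E * x,
            -d * y + b * ratio]

sol = solve_ivp(rhs, (0.0, 50.0), [0.5, 0.3], t_eval=np.linspace(0, 50, 501))
print("prey, predator at t = 50:", sol.y[0, -1], sol.y[1, -1])
```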

  3. Safety enhancement efforts after Fukushima accident in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Lee, U.C., E-mail: uclee@nssc.go.kr [Nuclear Safety & Security Commission, Seoul (Korea, Republic of)

    2014-07-01

    The Nuclear Safety and Security Commission (NSSC) was established in 2011. The regulatory function was hence completely separated from the promotion and utilization of nuclear power. Since its establishment, the NSSC has continuously put effort into strengthening regulatory practice and bringing the regulatory system up to international standards. In particular, it has been proceeding with statute revisions that make severe accident evaluation obligatory, enhance periodic safety evaluation, and introduce safety evaluation measures for extreme disaster situations. Additionally, it is revising the bill to expand the regulatory scope to include operators and design, manufacture, supply, and qualification-test companies throughout the life cycle of NPPs. Furthermore, a 'Coordination Committee on Nuclear Safety Policy' (tentative) is to be established for the purpose of supporting and promoting consistency in nuclear safety related policies such as nuclear safety research, accident-failure information, safety of food and medical equipment, and radiation in agriculture, livestock, marine products and ground water, which are under the jurisdiction of different Ministries. One of the most important lessons learned from the Fukushima accident is communication with the public. The NSSC has been emphasizing active and transparent disclosure of information through websites, blogs, SNS, etc., in order to relieve anxiety and restore public confidence. Other efforts include securing a constant communication channel by organizing regional conferences to disclose information, discuss issues, and receive feedback. Apart from the lessons learned from the Fukushima accident, the Korean Government is expanding its efforts to ensure nuclear safety in other areas such as CFSI issues. It is pushing forward to broaden the regulatory scope to include operator, design, manufacture, and supply companies as well as investigation agencies. As for the management of performance verification agencies, they are to be placed under the direct jurisdiction of the regulatory body. (author)

  4. Discounting the value of safety: effects of perceived risk and effort.

    Science.gov (United States)

    Sigurdsson, Sigurdur O; Taylor, Matthew A; Wirth, Oliver

    2013-09-01

    Although falls from heights remain the most prevalent cause of fatalities in the construction industry, factors impacting safety-related choices associated with work at heights are not completely understood. Better tools are needed to identify and study the factors influencing safety-related choices and decision making. Using a computer-based task within a behavioral economics paradigm, college students were presented with a choice between two hypothetical scenarios that differed in working height and effort associated with retrieving and donning a safety harness. Participants were instructed to choose the scenario in which they were more likely to wear the safety harness. Based on choice patterns, switch points were identified, indicating when the perceived risk in both scenarios was equivalent. Switch points were a systematic function of working height and effort, and the quantified relation between perceived risk and effort was described well by a hyperbolic equation. Choice patterns revealed that the perceived risk of working at heights decreased as the effort to retrieve and don a safety harness increased. Results contribute to the development of a computer-based procedure for assessing risk discounting within a behavioral economics framework. Such a procedure can be used as a research tool to study factors that influence safety-related decision making with a goal of informing more effective prevention and intervention strategies. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
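
    The hyperbolic relationship reported here between perceived risk and harness-retrieval effort has the same form as hyperbolic delay discounting, V = V0 / (1 + k*E). The sketch below uses fabricated switch-point data purely for illustration and shows how such a discounting parameter k could be estimated with a simple grid search; the paper's actual data and fitting procedure are not reproduced.

```python
import numpy as np

# Hypothetical switch points: the working height (m) at which a scenario with a
# given harness-retrieval effort (minutes) is judged as risky as a zero-effort one.
effort = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])       # minutes of effort (made up)
switch_height = np.array([6.0, 5.3, 4.8, 3.6, 2.6, 1.6])  # metres (made up)

def hyperbolic(e, v0, k):
    # Discounted value of safety: V = V0 / (1 + k*E)
    return v0 / (1.0 + k * e)

# Simple grid search over (V0, k) minimizing squared error.
v0_grid = np.linspace(1.0, 10.0, 181)
k_grid = np.linspace(0.01, 1.0, 199)
best = (np.inf, None, None)
for v0 in v0_grid:
    for k in k_grid:
        sse = np.sum((switch_height - hyperbolic(effort, v0, k)) ** 2)
        if sse < best[0]:
            best = (sse, v0, k)

sse, v0_hat, k_hat = best
print(f"fitted V0 = {v0_hat:.2f} m, k = {k_hat:.3f} per minute, SSE = {sse:.3f}")
```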

  5. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Besides its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.]

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  7. Characterization of infiltration rates from landfills: supporting groundwater modeling efforts.

    Science.gov (United States)

    Moo-Young, Horace; Johnson, Barnes; Johnson, Ann; Carson, David; Lew, Christine; Liu, Salley; Hancocks, Katherine

    2004-01-01

    The purpose of this paper is to review the literature to characterize infiltration rates from landfill liners to support groundwater modeling efforts. The focus of this investigation was on collecting studies that describe the performance of liners 'as installed' or 'as operated'. This document reviews the state of the science and practice on the infiltration rate through compacted clay liner (CCL) for 149 sites and geosynthetic clay liner (GCL) for 1 site. In addition, it reviews the leakage rate through geomembrane (GM) liners and composite liners for 259 sites. For compacted clay liners (CCL), there was limited information on infiltration rates (i.e., only 9 sites reported infiltration rates); thus, it was difficult to develop a national distribution. The field hydraulic conductivities for natural clay liners range from 1 x 10(-9) cm s(-1) to 1 x 10(-4) cm s(-1), with an average of 6.5 x 10(-8) cm s(-1). There was limited information on geosynthetic clay liners. For composite lined and geomembrane systems, the leak detection system flow rates were utilized. The average monthly flow rate for composite liners ranged from 0 to 32 lphd for geomembrane and GCL systems to 0 to 1410 lphd for geomembrane and CCL systems. The increased infiltration for the geomembrane and CCL system may be attributed to consolidation water from the clay.
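
    To relate the reported hydraulic conductivities to leakage rates quoted in litres per hectare per day (lphd), a back-of-the-envelope Darcy calculation is sometimes used: q = K * i, with a unit hydraulic gradient as a simplifying assumption. The sketch below performs only that unit conversion and is not the methodology of the paper; the gradient value and example conductivity are assumptions.

```python
# Convert a liner hydraulic conductivity to an equivalent infiltration rate in
# litres per hectare per day (lphd), assuming Darcy flow with unit gradient.
# The unit-gradient assumption and the example value are simplifications.

SECONDS_PER_DAY = 86_400
M2_PER_HECTARE = 10_000
LITRES_PER_M3 = 1_000

def conductivity_to_lphd(k_cm_per_s: float, gradient: float = 1.0) -> float:
    k_m_per_s = k_cm_per_s / 100.0          # cm/s -> m/s
    darcy_flux = k_m_per_s * gradient       # m/s (volumetric flux per unit area)
    m3_per_ha_per_day = darcy_flux * M2_PER_HECTARE * SECONDS_PER_DAY
    return m3_per_ha_per_day * LITRES_PER_M3

# Average CCL conductivity reported above: 6.5e-8 cm/s.
print(f"{conductivity_to_lphd(6.5e-8):,.0f} lphd")   # roughly 560 lphd under these assumptions
```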

  8. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  9. Effort-Based Career Opportunities and Working Time

    OpenAIRE

    Bratti, M.; Staffolani, S.

    2005-01-01

    The authors evaluate the economic effects of the hypothesis of effort-based career opportunities, described as a situation in which a firm creates incentives for employees to work longer hours than bargained (or desired), by making career prospects depend on relative working hours. Firms' personnel management policies may tend to increase working time (or workers' effort) in order to maximize profits. Effort-based career opportunities raise working time, production and output per worker, and ...

  10. The Role of Cognitive Effort in Framing Effects

    OpenAIRE

    Krzysztof Przybyszewski; Dorota Rutkowska

    2013-01-01

    Framing effects are a common bias in people making risky decisions. The account for this bias is found in the loss aversion derived from Prospect Theory. Most often in the decision-making literature, it is effortful processes that are claimed to reduce framing effects in risky choice tasks, i.e., investing mental effort should de-bias decision makers. However, in goal framing studies, effortful mental processes may produce those effects. In our experiment participants were primed wi...

  11. Greater effort increases perceived value in an invertebrate.

    Science.gov (United States)

    Czaczkes, Tomer J; Brandstetter, Birgit; di Stefano, Isabella; Heinze, Jürgen

    2018-05-01

    Expending effort is generally considered to be undesirable. However, both humans and vertebrates will work for a reward they could also get for free. Moreover, cues associated with high-effort rewards are preferred to low-effort associated cues. Many explanations for these counterintuitive findings have been suggested, including cognitive dissonance (self-justification) or a greater contrast in state (e.g., energy or frustration level) before and after an effort-linked reward. Here, we test whether effort expenditure also increases perceived value in ants, using both classical cue-association methods and pheromone deposition, which correlates with perceived value. In 2 separate experimental setups, we show that pheromone deposition is higher toward the reward that requires more effort: 47% more pheromone deposition was performed for rewards reached via a vertical runway (high effort) compared with ones reached via a horizontal runway (low effort), and deposition rates were 28% higher on rough (high effort) versus smooth (low effort) runways. Using traditional cue-association methods, 63% of ants trained on different surface roughness, and 70% of ants trained on different runway elevations, preferred the high-effort related cues on a Y maze. Finally, pheromone deposition to feeders requiring memorization of one path bifurcation was up to 29% higher than to an identical feeder requiring no learning. Our results suggest that effort affects value perception in ants. This effect may stem from a cognitive process, which monitors the change in a generalized hedonic state before and after reward. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. Evaluation of Advanced Polymers for Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Rios, Orlando [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carter, William G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kutchko, Cindy [PPG Industries, Pittsburgh, PA (United States); Fenn, David [PPG Industries, Pittsburgh, PA (United States); Olson, Kurt [PPG Industries, Pittsburgh, PA (United States)

    2017-09-08

    The goal of this Manufacturing Demonstration Facility (MDF) technical collaboration project between Oak Ridge National Laboratory (ORNL) and PPG Industries, Inc. (PPG) was to evaluate the feasibility of using conventional coatings chemistry and technology to build up material layer-by-layer. The PPG-ORNL study successfully demonstrated that polymeric coatings formulations may overcome many limitations of common thermoplastics used in additive manufacturing (AM), allow lightweight nozzle design for material deposition, and increase build rate. The materials effort focused on layer-by-layer deposition of coatings with each layer fusing together. The combination of materials and deposition results in an additively manufactured build that has sufficient mechanical properties to bear the load of additional layers, yet is capable of bonding across the z-layers to improve build direction strength. The formulation properties were tuned to enable a novel, high-throughput deposition method that is highly scalable, compatible with high loading of reinforcing fillers, and inherently low-cost.

  13. Sharing Information among Various Organizations in Relief Efforts

    National Research Council Canada - National Science Library

    Costur, Gurkan

    2005-01-01

    .... An analysis is presented of the December 2004 Indian Ocean tsunami relief effort; specifically, how different organizations such as the military, United Nations, and non-governmental organizations...

  14. Stochastic evolutionary dynamics in minimum-effort coordination games

    Science.gov (United States)

    Li, Kun; Cong, Rui; Wang, Long

    2016-08-01

    The minimum-effort coordination game has recently drawn more attention for the fact that human behavior in this social dilemma is often inconsistent with the predictions of classical game theory. Here, we combine evolutionary game theory and coalescence theory to investigate this game in finite populations. Both analytic results and individual-based simulations show that effort costs play a key role in the evolution of contribution levels, which is in good agreement with what is observed experimentally. Besides well-mixed populations, set-structured populations have also been taken into consideration. Therein we find that a large number of sets and a moderate migration rate greatly promote effort levels, especially for high effort costs.
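
    As a concrete illustration of the game studied here, the sketch below simulates imitation (pairwise-comparison) dynamics for the minimum-effort coordination game in a finite, well-mixed population. The payoff function, effort levels, cost parameter, and Fermi update rule are generic textbook choices, not the exact model or parameter values of the paper.

```python
import math
import random

# Minimum-effort coordination game: when two players with efforts e_i, e_j meet,
# player i earns  a * min(e_i, e_j) - c * e_i.  A higher effort cost c makes high
# coordination levels harder to sustain. All numbers below are illustrative.

a, c = 1.0, 0.6             # benefit of the group minimum, cost of own effort
effort_levels = list(range(1, 8))
N = 200                     # population size
beta = 5.0                  # selection intensity in the Fermi update
mu = 0.01                   # mutation (random effort switch) probability
rounds = 20_000

def payoff(own, pop):
    """Average payoff of an individual with effort `own` against the population."""
    return sum(a * min(own, e) - c * own for e in pop) / len(pop)

pop = [random.choice(effort_levels) for _ in range(N)]
for _ in range(rounds):
    i, j = random.sample(range(N), 2)
    if random.random() < mu:
        pop[i] = random.choice(effort_levels)
        continue
    pi, pj = payoff(pop[i], pop), payoff(pop[j], pop)
    # Fermi rule: i imitates j with probability 1 / (1 + exp(-beta * (pj - pi)))
    if random.random() < 1.0 / (1.0 + math.exp(-beta * (pj - pi))):
        pop[i] = pop[j]

print("mean effort after", rounds, "updates:", sum(pop) / N)
```

    Rerunning the sketch with a smaller cost c tends to stabilize higher effort levels, which mirrors the qualitative role effort costs play in the analytic results summarized above.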

  15. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools

  16. Xenon plasma with caesium as additive

    International Nuclear Information System (INIS)

    Stojilkovic, S.M.; Novakovic, N.V.; Zivkovic, L.M.

    1986-01-01

    The concentration dependence of xenon plasma with cesium as additive in the temperature range of 2000 K to 20,000 K is analyzed. Plasma is considered as weakly nonideal in complete local thermodynamic equilibrium, and the interaction between plasma and vessel walls is not taken into account. The values of some of the parameters for nonideality of plasma with 1% of cesium (γ = 0.01010) and 10% of cesium (γ = 0.11111) are computed, for an initial pressure in plasma of p_0 = 13,000 Pa and initial temperature T_0 = 1000 K. The ratio of the electric conductivity of plasma computed by Lorentz's formula to the electric conductivity computed by Spitzer's formula in the same temperature interval is also analyzed. (author) 5 figs., 2 tabs., 16 refs

  17. Xenon plasma with caesium as additive

    Energy Technology Data Exchange (ETDEWEB)

    Stojilkovic, S M; Novakovic, N V; Zivkovic, L M

    1986-01-01

    The concentration dependence of xenon plasma with cesium as additive in the temperature range of 2000 K to 20,000 K is analyzed. Plasma is considered as weakly nonideal in complete local thermodynamic equilibrium, and the interaction between plasma and vessel walls is not taken into account. The values of some of the parameters for nonideality of plasma with 1% of cesium (γ = 0.01010) and 10% of cesium (γ = 0.11111) are computed, for an initial pressure in plasma of p_0 = 13,000 Pa and initial temperature T_0 = 1000 K. The ratio of the electric conductivity of plasma computed by Lorentz's formula to the electric conductivity computed by Spitzer's formula in the same temperature interval is also analyzed. (author) 5 figs., 2 tabs., 16 refs.

  18. The Need for Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  19. "I put in effort, therefore I am passionate": Investigating the path from effort to passion in entrepreneurship

    OpenAIRE

    Gielnik, Michael Marcus; Spitzmuller, Matthias; Schmitt, Antje; Klemann, Katharina; Frese, Michael

    2015-01-01

    Most theoretical frameworks in entrepreneurship emphasize that entrepreneurial passion drives entrepreneurial effort. We hypothesize that the reverse effect is also true, and investigate changes in passion as an outcome of effort. Based on theories of self-regulation and self-perception, we hypothesize that making new venture progress and free choice are two factors that help to explain why and under which conditions entrepreneurial effort affects entrepreneurial passion. We undertook two stu...

  20. Effort-Reward Imbalance and Burnout Among ICU Nursing Staff: A Cross-Sectional Study.

    Science.gov (United States)

    Padilla Fortunatti, Cristobal; Palmeiro-Silva, Yasna K

    Occupational stress is commonly observed among staff in intensive care units (ICUs). Sociodemographic, organizational, and job-related factors may lead to burnout among ICU health workers. In addition, these factors could modify the balance between the effort expended and the rewards perceived by workers; consequently, this imbalance could increase levels of emotional exhaustion and depersonalization and decrease the sense of personal accomplishment. The purpose of this study was to analyze the relationship between effort-reward imbalance and burnout dimensions (emotional exhaustion, depersonalization, and personal accomplishment) among ICU nursing staff in a university hospital in Santiago, Chile. A convenience sample of 36 registered nurses and 46 nurse aides answered the Maslach Burnout Inventory and the Effort-Reward Imbalance Questionnaire and provided sociodemographic and work-related data. Age and effort-reward imbalance were significantly associated with emotional exhaustion in both registered nurses and nurse aides; age was negatively correlated with emotional exhaustion, whereas effort-reward imbalance was positively correlated. Age was negatively associated with depersonalization. None of the predictors were associated with personal accomplishment. This study adds valuable information about the relationships between sociodemographic factors and effort-reward imbalance and their impact on the dimensions of burnout, particularly on emotional exhaustion.
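
    The effort-reward imbalance measure used in studies like this one is typically summarized as Siegrist's ERI ratio: the effort score divided by the reward score, with a correction factor for the unequal number of effort and reward items. The helper below is a generic sketch of that computation; the exact item counts and cut-offs depend on the questionnaire version, so the defaults shown are assumptions rather than values taken from this study.

```python
def eri_ratio(effort_score: float, reward_score: float,
              n_effort_items: int = 6, n_reward_items: int = 11) -> float:
    """Effort-reward imbalance ratio: effort / (reward * correction factor).

    The correction factor (n_effort_items / n_reward_items) adjusts for the
    unequal number of items on the two scales; item counts vary by
    questionnaire version, so these defaults are only illustrative.
    A ratio above 1 suggests that perceived effort outweighs rewards.
    """
    correction = n_effort_items / n_reward_items
    return effort_score / (reward_score * correction)

# Hypothetical respondent: effort score 20 (6 items), reward score 33 (11 items).
print(round(eri_ratio(20, 33), 2))   # a value above 1 would indicate imbalance
```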

  1. Effect of social influence on effort-allocation for monetary rewards.

    Directory of Open Access Journals (Sweden)

    Jodi M Gilman

    Full Text Available Though decades of research have shown that people are highly influenced by peers, few studies have directly assessed how the value of social conformity is weighed against other types of costs and benefits. Using an effort-based decision-making paradigm with a novel social influence manipulation, we measured how social influence affected individuals' decisions to allocate effort for monetary rewards during trials with either high or low probability of receiving a reward. We found that information about the effort-allocation of peers modulated participant choices, specifically during conditions of low probability of obtaining a reward. This suggests that peer influence affects effort-based choices to obtain rewards especially under conditions of risk. This study provides evidence that people value social conformity in addition to other costs and benefits when allocating effort, and suggests that neuroeconomic studies that assess trade-offs between effort and reward should consider social environment as a factor that can influence decision-making.

  2. Allocating effort and anticipating pleasure in schizophrenia: Relationship with real world functioning.

    Science.gov (United States)

    Serper, M; Payne, E; Dill, C; Portillo, C; Taliercio, J

    2017-10-01

    Poor motivation to engage in goal-oriented behavior has been recognized as a hallmark feature of schizophrenia spectrum disorders (SZ). Low drive in SZ may be related to anticipating rewards as well as to poor working memory. However, few studies to date have examined beliefs about self-efficacy and satisfaction with future rewards (anticipatory pleasure). Additionally, few studies to date have examined how these deficits may impact SZ patients' real-world functioning. The present study examined SZ patients' (n=57) anticipatory pleasure, working memory, self-efficacy and real-world functioning in relation to their negative symptom severity. Results revealed that SZ patients' negative symptom severity was related to decisions in effort allocation and reward probability, working memory deficits, self-efficacy and anticipatory pleasure for future reward. Effort allocation deficits also predicted patients' daily functioning skills. SZ patients with high levels of negative symptoms are not merely effort averse, but have more difficulty effectively allocating effort and anticipating pleasure when engaging in effortful activities. It may be the case that continuously failing to achieve reinforcement from engagement and participation leads SZ patients to form negative beliefs about their abilities, which contributes to amotivation and cognitive deficits. Lastly, our findings provide further support for a link between SZ patients' functional daily living skills and their effort allocation. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  3. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide which model best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
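
    As a toy illustration of the kind of ranked retrieval described here, the sketch below scores a few mock model records (free text built from names and annotation keywords) against a query using TF-IDF weights and cosine similarity. It is not the actual BioModels Database implementation; the records, fields, and weighting scheme are invented for demonstration.

```python
import math
from collections import Counter

# Mock "model records": free text drawn from names and annotation keywords.
models = {
    "M1": "calcium signalling neuron ion channel",
    "M2": "glycolysis yeast metabolic pathway",
    "M3": "calcium dependent potassium channel neuron model",
}

def tf_idf_vectors(docs):
    n = len(docs)
    tokenized = {k: v.split() for k, v in docs.items()}
    df = Counter(t for toks in tokenized.values() for t in set(toks))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}       # smoothed idf
    vecs = {}
    for k, toks in tokenized.items():
        tf = Counter(toks)
        vecs[k] = {t: tf[t] * idf[t] for t in tf}
    return vecs, idf

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

vecs, idf = tf_idf_vectors(models)
query = "calcium channel neuron"
qtf = Counter(query.split())
qvec = {t: qtf[t] * idf.get(t, 1.0) for t in qtf}

# Rank model records by similarity to the query, highest first.
ranking = sorted(models, key=lambda m: cosine(qvec, vecs[m]), reverse=True)
for m in ranking:
    print(m, round(cosine(qvec, vecs[m]), 3))
```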

  4. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  5. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  6. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  7. Workshop on Computational Optimization

    CERN Document Server

    2015-01-01

    Our everyday life is unthinkable without optimization. We try to minimize our effort and to maximize the achieved profit. Many real-world and industrial problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2013. It presents recent advances in computational optimization. The volume includes important real-life problems like parameter settings for controlling processes in a bioreactor, resource-constrained project scheduling, problems arising in transport services, error-correcting codes, optimal system performance and energy consumption, and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others.

  8. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  9. Security in Computer Applications

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development. The last part of the lecture covers some miscellaneous issues like the use of cryptography, rules for networking applications, and social engineering threats. This lecture was first given on Thursd...

  10. Best-effort Support for a Virtual Seminar Room

    DEFF Research Database (Denmark)

    Sharp, Robin; Todirica, Edward Alexandru

    2002-01-01

    This paper describes the RTMM Virtual Seminar Room, an interactive distributed multimedia application based on a platform with a simple middleware architecture, using best effort scheduling and a best effort network service. Emphasis has been placed on achieving low latency in all parts...

  11. Development Efforts Of Oil Companies As Perceived By Rural ...

    African Journals Online (AJOL)

    ... that the host communities are highly satisfied with companies' efforts (projects and services) to them. Based on these findings, recommendations were made. Key words: Oil producing communities; oil exploration/production; company's development efforts; Journal of Agriculture and Social Research Vol.4(1) 2004: 60-71 ...

  12. Heuristics Made Easy: An Effort-Reduction Framework

    Science.gov (United States)

    Shah, Anuj K.; Oppenheimer, Daniel M.

    2008-01-01

    In this article, the authors propose a new framework for understanding and studying heuristics. The authors posit that heuristics primarily serve the purpose of reducing the effort associated with a task. As such, the authors propose that heuristics can be classified according to a small set of effort-reduction principles. The authors use this…

  13. 15 CFR 930.114 - Secretarial mediation efforts.

    Science.gov (United States)

    2010-01-01

    15 CFR 930.114 (Commerce and Foreign Trade, 2010): Federal Consistency with Approved Coastal Management Programs, Secretarial Mediation. § 930.114 Secretarial mediation efforts. (a) Following the close of the hearing, the hearing officer shall transmit the...

  14. Seasonal abundance, distribution, and catch per unit effort using gill ...

    African Journals Online (AJOL)

    Catch per unit effort was obtained for the fish of the Sundays ... Catch per unit effort (numbers and weight per net) of fish in the estuary was obtained from 55 ... CPUE (number and mass) of fish caught monthly using gill-nets over 12-h periods with 55 nettings at ... The abundance of some other species may ...

  15. Quantifying commercial catch and effort of monkfish Lophius ...

    African Journals Online (AJOL)

    Catch-per-unit-effort (cpue) data of vessels targeting monkfish and sole (the two ... analysed using two different methods to construct indices of abundance. ... in Namibia to all tail-weight classes is not appropriate for the current fishery and needs ... Keywords: catch per unit effort, Generalized Linear Model, Lophius vaillanti, ...
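
    Standardizing catch-per-unit-effort with a Generalized Linear Model, as mentioned in this record, usually means modelling log(CPUE) (or CPUE with a log link) against year plus nuisance factors such as season or vessel, and then reading the year effects off as a relative abundance index. The numpy-only sketch below fits the log-normal variant by least squares on synthetic data; the data, factors, and error model are assumptions, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic catch records: year, quarter and CPUE (e.g., kg per trawl hour).
years = np.repeat(np.arange(2000, 2005), 40)
quarters = np.tile(np.repeat(np.arange(4), 10), 5)
true_year_effect = np.array([1.0, 0.9, 0.7, 0.6, 0.55])
cpue = (true_year_effect[years - 2000]
        * np.exp(0.2 * quarters)                 # seasonal nuisance effect
        * rng.lognormal(mean=0.0, sigma=0.3, size=years.size))

# Log-linear model: log(cpue) ~ year + quarter (dummy coding, first level as base).
def dummies(codes, n_levels):
    d = np.zeros((codes.size, n_levels - 1))
    for level in range(1, n_levels):
        d[codes == level, level - 1] = 1.0
    return d

X = np.column_stack([np.ones(years.size),
                     dummies(years - 2000, 5),
                     dummies(quarters, 4)])
beta, *_ = np.linalg.lstsq(X, np.log(cpue), rcond=None)

# Standardized index: exp(year coefficients), with the first year fixed at 1.
index = np.concatenate([[1.0], np.exp(beta[1:5])])
print("standardized CPUE index by year:", np.round(index, 2))
```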

  16. Determinants of Tourists Information Search Effort: The Case of ...

    African Journals Online (AJOL)

    The paper examines tourist information search effort prior to the visit to a selected destination. The focus was on identifying the key variables that influence the information search effort of Ghana's international visitors from the United States of America, the United Kingdom and Germany. The Dummy Multiple Regression ...

  17. Policies and Programmatic Efforts Pertaining to Fatherhood: Commentary

    Science.gov (United States)

    Raikes, Helen; Bellotti, Jeanne

    2007-01-01

    The articles in this section focus attention on (1) the historical shift in policies that affect the young men of this nation (2) how fatherhood policies and programmatic efforts are expanding and (3) how fatherhood practices and policies could and perhaps should be expanded and elaborated further. These efforts are linked to a growing body of…

  18. Money Laundering and International Efforts to Fight It

    OpenAIRE

    David Scott

    1996-01-01

    According to one estimate, US$300 billion to US$500 billion in proceeds from serious crime is laundered each year. Left unchecked, money laundering could criminalize the financial system and undermine development efforts in emerging markets. The author reviews efforts by international bodies to fight it.

  19. Polio eradication efforts in regions of geopolitical strife: the Boko ...

    African Journals Online (AJOL)

    Polio eradication efforts in regions of geopolitical strife: the Boko Haram threat to efforts in sub-Saharan Africa. ... Targets of Boko Haram aggression in these zones include violence against polio workers, disruption of polio immunization campaigns, with consequent reduced access to health care and immunization.

  20. Effort reward imbalance, and salivary cortisol in the morning

    DEFF Research Database (Denmark)

    Eller, Nanna Hurwitz; Nielsen, Søren Feodor; Blønd, Morten

    2012-01-01

    Effort reward imbalance (ERI) is suggested to increase risk for stress and is hypothesized to increase cortisol levels, especially the awakening cortisol response (ACR).

  1. Men's Work Efforts and the Transition to Fatherhood.

    Science.gov (United States)

    Astone, Nan Marie; Dariotis, Jacinda; Sonenstein, Freya; Pleck, Joseph H; Hynes, Kathryn

    2010-03-01

    In this paper we tested three hypotheses: (a) the transition to fatherhood is associated with an increase in work effort; (b) the positive association (if any) between the transition to fatherhood and work effort is greater for fathers who are married at the time of the transition; and (c) the association (if any) is greater for men who make the transition at younger ages. The data are from the National Longitudinal Survey of Youth 1979 Cohort. The transition to fatherhood was associated with an increase in work effort among young unmarried men, but not for married men. Among married men who were on-time fathers, work effort decreased. Among childless men, the marriage transition was associated with increased work effort.

  2. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  3. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  4. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  5. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  6. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  7. Quantum Computation--The Ultimate Frontier

    OpenAIRE

    Adami, Chris; Dowling, Jonathan P.

    2002-01-01

    The discovery of an algorithm for factoring which runs in polynomial time on a quantum computer has given rise to a concerted effort to understand the principles, advantages, and limitations of quantum computing. At the same time, many different quantum systems are being explored for their suitability to serve as a physical substrate for the quantum computer of the future. I discuss some of the theoretical foundations of quantum computer science, including algorithms and error correction, and...

  8. The Computer Backgrounds of Soldiers in Army Units: FY01

    National Research Council Canada - National Science Library

    Singh, Harnam

    2002-01-01

    A multi-year research effort was instituted in FY99 to examine soldiers' experiences with computers, self-perceptions of their computer skill, and their ability to identify frequently used, Windows-based icons...

  9. Effect of effort-reward imbalance and burnout on infection control among Ecuadorian nurses.

    Science.gov (United States)

    Colindres, C V; Bryce, E; Coral-Rosero, P; Ramos-Soto, R M; Bonilla, F; Yassi, A

    2018-06-01

    Nurses are frequently exposed to transmissible infections, yet adherence to infection control measures is suboptimal. There has been inadequate research into how the psychosocial work environment affects compliance with infection control measures, especially in low- and middle-income countries. To examine the association between effort-reward imbalance, burnout and adherence to infection control measures among nurses in Ecuador. A cross-sectional study linking psychosocial work environment indicators to infection control adherence. The study was conducted among 333 nurses in four Ecuadorian hospitals. Self-administered questionnaires assessed demographic variables, perceived infection risk, effort-reward imbalance, burnout and infection control adherence. Increased effort-reward imbalance was found to be a unique incremental predictor of burnout, and burnout was a negative unique incremental predictor of nurses' self-reported adherence to infection control measures. The results suggest an effort-reward imbalance-burnout continuum which, at higher levels, contributes to reduced adherence to infection control. The Ecuadorean government has made large efforts to improve universal access to health care, yet this study suggests that workplace demands on nurses remain problematic. This study highlights the contribution of the effort-reward imbalance-burnout continuum to the chain of infection through decreased adherence to infection control among nurses. Health authorities should closely monitor the effect of new policies on the psychosocial work environment, especially when expanding services and increasing public accessibility with limited resources. Additionally, organizational and psychosocial interventions targeting effort-reward imbalance and burnout in nurses should be considered part of a complete infection prevention and control strategy. Further study is warranted to identify interventions that best ameliorate effort-reward imbalance and burnout in low- and middle

  10. Effort-Based Decision Making: A Novel Approach for Assessing Motivation in Schizophrenia.

    Science.gov (United States)

    Green, Michael F; Horan, William P; Barch, Deanna M; Gold, James M

    2015-09-01

    Because negative symptoms, including motivational deficits, are a critical unmet need in schizophrenia, there are many ongoing efforts to develop new pharmacological and psychosocial interventions for these impairments. A common challenge of these studies involves how to evaluate and select optimal endpoints. Currently, all studies of negative symptoms in schizophrenia depend on ratings from clinician-conducted interviews. Effort-based decision-making tasks may provide a more objective, and perhaps more sensitive, endpoint for trials of motivational negative symptoms. These tasks assess how much effort a person is willing to exert for a given level of reward. This area has been well-studied with animal models of effort and motivation, and effort-based decision-making tasks have been adapted for use in humans. Very recently, several studies have examined physical and cognitive types of effort-based decision-making tasks in cross-sectional studies of schizophrenia, providing evidence for effort-related impairment in this illness. This article covers the theoretical background on effort-based decision-making tasks to provide a context for the subsequent articles in this theme section. In addition, we review the existing literature of studies using these tasks in schizophrenia, consider some practical challenges in adapting them for use in clinical trials in schizophrenia, and discuss interpretive challenges that are central to these types of tasks. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
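
    A common way to analyze choices in the physical and cognitive effort tasks discussed here is to assume each option has a subjective value that discounts reward by effort, then model choice probabilities with a softmax rule. The sketch below is a generic version of that analysis on made-up trial data; the discounting form (linear here), parameter values, and task structure are assumptions rather than any specific published task.

```python
import math

# One trial: choose between an easy option and a hard option.
# Subjective value is assumed to discount reward linearly by effort:
#   SV = probability_of_reward * reward - k * effort
# and the hard option is chosen with softmax probability.

def subjective_value(reward, prob, effort, k):
    return prob * reward - k * effort

def p_choose_hard(trial, k, temperature=1.0):
    sv_hard = subjective_value(trial["hard_reward"], trial["prob"],
                               trial["hard_effort"], k)
    sv_easy = subjective_value(trial["easy_reward"], trial["prob"],
                               trial["easy_effort"], k)
    return 1.0 / (1.0 + math.exp(-(sv_hard - sv_easy) / temperature))

# Hypothetical trials: fixed easy option, varying hard reward and reward probability.
trials = [
    {"easy_reward": 1.0, "easy_effort": 1.0, "hard_effort": 5.0,
     "hard_reward": r, "prob": p}
    for r in (2.0, 4.0, 6.0) for p in (0.12, 0.50, 0.88)
]

k = 0.4   # effort-discounting parameter (illustrative; fit per participant in practice)
for t in trials:
    print(f"reward={t['hard_reward']:.0f}, p={t['prob']:.2f} -> "
          f"P(choose hard) = {p_choose_hard(t, k):.2f}")
```

    In practice the discounting parameter k (and often the temperature) is fit to each participant's choices, and a larger k is read as greater effort aversion or reduced motivation.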

  11. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and so requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. It also introduces the successful application of soft computing techniques to solve many hard problems encountered during the design of embedded hardware. Reconfigurable em...
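
    The algorithms named above are population-based, iterative searches, which is precisely why they are computationally heavy in software and attractive candidates for dedicated or reconfigurable hardware. As a rough illustration only, and not taken from the book, the following Python sketch of a minimal Particle Swarm Optimization loop shows the kind of per-particle update that such hardware typically accelerates; the objective function, swarm size and coefficient values are all illustrative assumptions.

      import random

      def pso(objective, dim=2, n_particles=20, iters=100,
              w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
          """Minimal Particle Swarm Optimization sketch (all parameters illustrative)."""
          lo, hi = bounds
          # Random initial positions, zero initial velocities.
          pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
          vel = [[0.0] * dim for _ in range(n_particles)]
          pbest = [p[:] for p in pos]                  # personal best positions
          pbest_val = [objective(p) for p in pos]      # personal best values
          g = min(range(n_particles), key=lambda i: pbest_val[i])
          gbest_pos, gbest_val = pbest[g][:], pbest_val[g]
          for _ in range(iters):
              for i in range(n_particles):
                  for d in range(dim):
                      r1, r2 = random.random(), random.random()
                      # Inertia + cognitive pull (own best) + social pull (swarm best).
                      vel[i][d] = (w * vel[i][d]
                                   + c1 * r1 * (pbest[i][d] - pos[i][d])
                                   + c2 * r2 * (gbest_pos[d] - pos[i][d]))
                      pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                  val = objective(pos[i])
                  if val < pbest_val[i]:
                      pbest[i], pbest_val[i] = pos[i][:], val
                      if val < gbest_val:
                          gbest_pos, gbest_val = pos[i][:], val
          return gbest_pos, gbest_val

      # Example: minimize the sphere function (purely illustrative objective).
      best_x, best_f = pso(lambda x: sum(v * v for v in x))
      print(best_x, best_f)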

  12. Innovative Partnerships Assist Community College Computing Programs.

    Science.gov (United States)

    O'Banion, Terry

    1987-01-01

    Relates efforts of major corporations in providing assistance to community college computing programs. Explains the goals of the League for Innovation in the Community College, a consortium of 19 community colleges, and cites examples of collaborative projects. (ML)

  13. Noise Characterization of Devices for Optical Computing

    National Research Council Canada - National Science Library

    Walkup, John

    1998-01-01

    The major objective of the research effort is to investigate the noise characteristics of advanced optical Sources, spatial light modulators, and other devices which are candidates for applications in optical computers...

  14. Clinical impact of (11)C-Pittsburgh compound-B positron emission tomography carried out in addition to magnetic resonance imaging and single-photon emission computed tomography on the diagnosis of Alzheimer's disease in patients with dementia and mild cognitive impairment.

    Science.gov (United States)

    Omachi, Yoshie; Ito, Kimiteru; Arima, Kunimasa; Matsuda, Hiroshi; Nakata, Yasuhiro; Sakata, Masuhiro; Sato, Noriko; Nakagome, Kazuyuki; Motohashi, Nobutaka

    2015-12-01

    The purpose of this study was to evaluate the clinical impact of addition of [(11)C]Pittsburgh compound-B positron emission tomography ((11)C-PiB PET) on routine clinical diagnosis of Alzheimer's disease (AD) dementia and mild cognitive impairment (MCI), and to assess diagnostic agreement between clinical criteria and research criteria of the National Institute on Aging-Alzheimer's Association. The diagnosis in 85 patients was made according to clinical criteria. Imaging examinations, including both magnetic resonance imaging and single-photon emission computed tomography/computed tomography to identify neuronal injury (NI), and (11)C-PiB PET to identify amyloid were performed, and all subjects were re-categorized according to the research criteria. Among 40 patients with probable AD dementia (ProAD), 37 were NI-positive, 29 were (11)C-PiB-positive, and 27 who were both NI- and (11)C-PiB-positive were categorized as having 'ProAD dementia with a high level of evidence of the AD pathophysiological process'. Among 20 patients with possible AD dementia (PosAD), 17 were NI-positive, and six who were both NI- and (11)C-PiB-positive were categorized as having 'PosAD with evidence of the AD pathophysiological process'. Among 25 patients with MCI, 18 were NI-positive, 13 were (11)C-PiB-positive, and 10 who were both NI- and (11)C-PiB-positive were categorized as having 'MCI due to AD-high likelihood'. Diagnostic concordance between clinical criteria and research criteria may not be high in this study. (11)C-PiB PET may be of value in making the diagnosis of dementia and MCI in cases with high diagnostic uncertainty. © 2015 The Authors. Psychiatry and Clinical Neurosciences © 2015 Japanese Society of Psychiatry and Neurology.

  15. Large scale computing in the Energy Research Programs

    International Nuclear Information System (INIS)

    1991-05-01

    The Energy Research Supercomputer Users Group (ERSUG) comprises all investigators using resources of the Department of Energy Office of Energy Research supercomputers. At the December 1989 meeting held at Florida State University (FSU), the ERSUG executive committee determined that the continuing rapid advances in computational sciences and computer technology demanded a reassessment of the role computational science should play in meeting DOE's commitments. Initial studies were to be performed for four subdivisions: (1) Basic Energy Sciences (BES) and Applied Mathematical Sciences (AMS), (2) Fusion Energy, (3) High Energy and Nuclear Physics, and (4) Health and Environmental Research. The first two subgroups produced formal subreports that provided a basis for several sections of this report. Additional information provided in the AMS/BES subreport is included as Appendix C in an abridged form that eliminates most duplication. Additionally, each member of the executive committee was asked to contribute area-specific assessments; these assessments are included in the next section. In the following sections, brief assessments are given for specific areas, a conceptual model is proposed in which the entire computational effort for energy research is viewed as one giant nation-wide computer, and then specific recommendations are made for the appropriate evolution of the system

  16. How Consumer Trust in Financial Institutions Influences Relationships Between Knowledge, Cognitive Effort and Financial Healthiness

    DEFF Research Database (Denmark)

    Hansen, Torben

    2014-01-01

    Trust not only relates to customer trust in individual financial companies (i.e., narrow-scope trust) but also relates to the broader business context in which consumers carry out their financial decisions (i.e., broad-scope trust). Based on two surveys comprising 1,155 bank consumers and 764 pension consumers, respectively, the results of this study indicate that broad-scope trust negatively moderates relations between knowledge and financial healthiness and between cognitive effort and financial healthiness. In addition, it is demonstrated that broad-scope trust negatively influences cognitive effort and positively influences financial healthiness.

  17. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for the Wolsong and Qinshan called the I A and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures

  18. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  19. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  20. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in the future in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  1. Efforts to Promote Surakarta and Makassar as Children Friendly Towns

    Directory of Open Access Journals (Sweden)

    Moh Ilham A. Hamudy

    2015-05-01

    Full Text Available This study concerns the child-friendly city (KLA). The research is motivated by the limited attention local governments pay to protecting children and by the issuance of Law No. 35 of 2014 on Protection of Children, which mandates local government obligations in the care of the child. The study sought to describe the various efforts made by the governments of Surakarta and Makassar in realizing the KLA, together with the supporting factors and obstacles surrounding its realization. Using a descriptive method combined with a qualitative approach, this study found several important points about the efforts of local governments in realizing the KLA. In Surakarta, for example, there are several child-friendly community health centers (puskesmas). These puskesmas are equipped with a private lounge complete with a children’s playground. In addition, services for children such as a nutrition garden, a breastfeeding corner, a pediatrician, child counseling services and services for victims of child abuse continue to be added, along with many other programs. In the assessment of the Ministry of Women Empowerment and Child Protection of the Republic of Indonesia, Surakarta scored 713 across the 31 KLA indicators. Makassar City, by contrast, has implemented fewer programs, because it proclaimed the KLA relatively recently and is still focusing on reform. Among the new programs being implemented by the Government of Makassar are free birth certificates, the construction of flats in slum areas, and the designation of two villages as KLA pilot projects. The factor that most affects the realization of the KLA is commitment, not only the commitment of the regional head but of all relevant parties. As a cross-cutting issue, the KLA also requires institutional capacity: not only the capacity of the Women Empowerment and Child Protection Agency as the leading sector for the KLA, but also that of all other related work units

  2. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Science.gov (United States)

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  3. Introduction to computer data representation

    CERN Document Server

    Fenwick, Peter

    2014-01-01

    Introduction to Computer Data Representation introduces readers to the representation of data within computers. Starting from basic principles of number representation in computers, the book covers the representation of both integer and floating point numbers, and characters or text. It comprehensively explains the main techniques of computer arithmetic and logical manipulation. The book also features chapters covering the less usual topics of basic checksums and 'universal' or variable length representations for integers, with additional coverage of Gray Codes, BCD codes and logarithmic repre
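
    As a small illustration of one of the less usual topics mentioned, and not reproduced from the book itself, the following Python sketch shows the standard binary-reflected Gray code conversions, in which successive code words differ in exactly one bit; the demonstration values are arbitrary.

      def to_gray(n: int) -> int:
          """Binary-reflected Gray code of a non-negative integer."""
          return n ^ (n >> 1)

      def from_gray(g: int) -> int:
          """Inverse conversion: cumulative XOR of the shifted code word."""
          n = 0
          while g:
              n ^= g
              g >>= 1
          return n

      # Successive code words differ in exactly one bit:
      print([format(to_gray(i), "03b") for i in range(8)])
      # -> ['000', '001', '011', '010', '110', '111', '101', '100']
      assert all(from_gray(to_gray(i)) == i for i in range(256))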

  4. Goal striving strategies and effort mobilization: When implementation intentions reduce effort-related cardiac activity during task performance.

    Science.gov (United States)

    Freydefont, Laure; Gollwitzer, Peter M; Oettingen, Gabriele

    2016-09-01

    Two experiments investigate the influence of goal and implementation intentions on effort mobilization during task performance. Although numerous studies have demonstrated the beneficial effects of setting goals and making plans on performance, the effects of goals and plans on effort-related cardiac activity and especially the cardiac preejection period (PEP) during goal striving have not yet been addressed. According to the Motivational Intensity Theory, participants should increase effort mobilization proportionally to task difficulty as long as success is possible and justified. Forming goals and making plans should allow for reduced effort mobilization when participants perform an easy task. However, when the task is difficult, goals and plans should differ in their effect on effort mobilization. Participants who set goals should disengage, whereas participants who made if-then plans should stay in the field showing high effort mobilization during task performance. As expected, using an easy task in Experiment 1, we observed a lower cardiac PEP in both the implementation intention and the goal intention condition than in the control condition. In Experiment 2, we varied task difficulty and demonstrated that while participants with a mere goal intention disengaged from difficult tasks, participants with an implementation intention increased effort mobilization proportionally with task difficulty. These findings demonstrate the influence of goal striving strategies (i.e., mere goals vs. if-then plans) on effort mobilization during task performance. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Estimation of total Effort and Effort Elapsed in Each Step of Software Development Using Optimal Bayesian Belief Network

    Directory of Open Access Journals (Sweden)

    Fatemeh Zare Baghiabad

    2017-09-01

    Full Text Available The difficulty of accurately estimating the effort needed for software development makes software effort estimation a challenging issue. Besides estimating total effort, determining the effort elapsed in each software development step is very important, because any mistake in enterprise resource planning can lead to project failure. In this paper, a Bayesian belief network is proposed based on effective components and the software development process. In this model, feedback loops between development steps are considered, with return rates that differ from project to project. The different return rates make it possible to determine the percentage of elapsed effort in each software development step distinctly. Moreover, the measurement error resulting from the optimized effort estimation and the optimal coefficients for modifying the model are sought. A comparison between the proposed model and other models showed that the model can estimate the total effort with high accuracy (a marginal error of about 0.114) and can estimate the effort elapsed in each software development step.
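
    The abstract does not give the network structure, so the following Python sketch only illustrates the general idea that return (rework) rates between development steps determine how total effort is split across steps: if each step can send work back to an earlier step with some probability, the expected number of visits to each step, and hence its effort share, follows from the fundamental matrix of an absorbing Markov chain. The step names, return rates and per-visit efforts below are assumptions for illustration, not values from the paper.

      import numpy as np

      # Illustrative development steps and assumed effort per visit (person-days).
      steps = ["requirements", "design", "implementation", "testing"]
      effort_per_visit = np.array([10.0, 15.0, 40.0, 20.0])

      # Q[i, j]: assumed probability that work finishing step i flows to step j
      # (forward flow plus project-specific return rates; row sums < 1 allow completion).
      Q = np.array([
          [0.0, 1.0, 0.0, 0.0],   # requirements -> design
          [0.1, 0.0, 0.9, 0.0],   # design: 10% returned to requirements
          [0.0, 0.2, 0.0, 0.8],   # implementation: 20% returned to design
          [0.0, 0.0, 0.3, 0.0],   # testing: 30% returned, 70% of passes finish
      ])

      # Expected number of visits to each step, starting from "requirements",
      # is row 0 of the fundamental matrix N = (I - Q)^-1.
      N = np.linalg.inv(np.eye(len(steps)) - Q)
      visits = N[0]

      total = visits * effort_per_visit
      for step, share in zip(steps, total / total.sum()):
          print(f"{step:15s} {share:6.1%} of total effort")
      print("estimated total effort:", round(total.sum(), 1), "person-days")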

  6. Work ability, effort-reward imbalance and disability pension claims.

    Science.gov (United States)

    Wienert, J; Spanier, K; Radoschewski, F M; Bethge, M

    2017-12-30

    Effort-reward imbalance (ERI) and self-rated work ability are known independent correlates and predictors of intended disability pension claims. However, little research has focused on the interrelationship between the three and whether self-rated work ability mediates the relationship between ERI and intended disability pension claims. To investigate whether self-rated work ability mediates the association between ERI and intended disability pension claims. Baseline data from participants of the Third German Sociomedical Panel of Employees, a 5-year cohort study that investigates determinants of work ability, rehabilitation utilization and disability pensions in employees who have previously received sickness benefits, were analysed. We tested direct associations between ERI with intended disability pension claims (Model 1) and self-rated work ability (Model 2). Additionally, we tested whether work ability mediates the association between ERI and intended disability pension claims (Model 3). There were 2585 participants. Model 1 indicated a significant association between ERI and intended disability pension claims. Model 2 showed a significant association between ERI and self-rated work ability. The mediation in Model 3 revealed a significant indirect association between ERI and intended disability pension claims via self-rated work ability. There was no significant direct association between ERI and intended disability pension claims. Our results support the adverse health-related impact of ERI on self-rated work ability and intended disability pension claims. © The Author 2017. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  7. The moderating effects of school climate on bullying prevention efforts.

    Science.gov (United States)

    Low, Sabina; Van Ryzin, Mark

    2014-09-01

    Bullying prevention efforts have yielded mixed effects over the last 20 years. Program effectiveness is driven by a number of factors (e.g., program elements and implementation), but there remains a dearth of understanding regarding the role of school climate on the impact of bullying prevention programs. This gap is surprising, given research suggesting that bullying problems and climate are strongly related. The current study examines the moderating role of school climate on the impacts of a stand-alone bullying prevention curriculum. In addition, the current study examined 2 different dimensions of school climate across both student and staff perceptions. Data for this study were derived from a Steps to Respect (STR) randomized efficacy trial that was conducted in 33 elementary schools over a 1-year period. Schools were randomly assigned to intervention or wait-listed control condition. Outcome measures (pre-to-post) were obtained from (a) all school staff, (b) a randomly selected subset of 3rd-5th grade teachers in each school, and (c) all students in classrooms of selected teachers. Multilevel analyses revealed that psychosocial climate was strongly related to reductions in bullying-related attitudes and behaviors. Intervention status yielded only 1 significant main effect, although STR schools with positive psychosocial climate at baseline had less victimization at posttest. Policies and administrative commitment to addressing bullying were related to reduced perpetration among all schools. Findings suggest positive psychosocial climate (from both staff and student perspective) plays a foundational role in bullying prevention, and can optimize effects of stand-alone programs. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  8. International efforts to cope with 'brain-drain' issues

    International Nuclear Information System (INIS)

    Boden, D.

    1992-01-01

    Regional arms limitation is realistically a function of the relationships among the four great Pacific Powers and of global disarmament efforts. It reflects the political and security balance among the regional States, many of which are striving to modernize their armed forces. In addition, there are ongoing developments, particularly in Russia and China, which impact on the political-security situation in the Western Pacific. The United States and Japan are also in the process of redefining their relationship, but it is assessed that the United States-Japan security treaty is unlikely to be scrapped, although it may be modified in the light of changing realities. In order to assist the United States in bearing the burdens of maintaining regional security, the other regional States may wish to explore new security architecture, with due regard to regional sensitivities about access and exclusion, through such initiatives as the Australian proposed APEC summit or regional discussions on security such as the First Asia-Pacific Defence Conference, held in Singapore in March 1992. The United Nations could certainly play a useful role, if invited, on such difficult issues as the Spratleys dispute and the Korean issues. Finally, there is a growing awareness that regional security has broadened to include more than just political-military aspects. Of particular importance are the regional economic cooperation programmes organized by ASEAN, ESCAP and APEC. Regional States have become much more aware that military power is not as usable in the post-cold war era and that economic development is just as important to overall security. War and conflict over resources may have thus become less important than the search for market access, investments and high technology

  9. International efforts to cope with `brain-drain` issues

    Energy Technology Data Exchange (ETDEWEB)

    Boden, D [Disarmament Affairs, Ministry of Foreign Affairs, Bonn (Germany)

    1993-12-31

    Regional arms limitation is realistically a function of the relationships among the four great Pacific Powers and of global disarmament efforts. It reflects the political and security balance among the regional States, many of which are striving to modernize their armed forces. In addition, there are ongoing developments, particularly in Russia and China, which impact on the political-security situation in the Western Pacific. The United States and Japan are also in the process of redefining their relationship, but it is assessed that the United States-Japan security treaty is unlikely to be scrapped, although it may be modified in the light of changing realities. In order to assist the United States in bearing the burdens of maintaining regional security, the other regional States may wish to explore new security architecture, with due regard to regional sensitivities about access and exclusion, through such initiatives as the Australian proposed APEC summit or regional discussions on security such as the First Asia-Pacific Defence Conference, held in Singapore in March 1992. The United Nations could certainly play a useful role, if invited, on such difficult issues as the Spratleys dispute and the Korean issues. Finally, there is a growing awareness that regional security has broadened to include more than just political-military aspects. Of particular importance are the regional economic cooperation programmes organized by ASEAN, ESCAP and APEC. Regional States have become much more aware that military power is not as usable in the post-cold war era and that economic development is just as important to overall security. War and conflict over resources may have thus become less important than the search for market access, investments and high technology

  10. Reminder: Mandatory Computer Security Course

    CERN Multimedia

    IT Department

    2011-01-01

    Just like any other organization, CERN is permanently under attack – even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to keep pace with attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Therefore, a new dedicated basic computer security course has been designed to inform you about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course is mandatory for all persons owning a CERN computer account and must be followed once every three years. Users who have never done the course, or whose course needs to be renewe...

  11. New Mandatory Computer Security Course

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Just like any other organization, CERN is permanently under attack - even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to keep pace with attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Since 2007, newcomers have had to follow a dedicated basic computer security course informing them about the “Do’s” and “Don’ts” when using CERN’s computing facilities. This course has recently been redesigned. It is now mandatory for all CERN members (users and staff) owning a CERN computer account and must be followed once every three years. Members who...

  12. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  13. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  14. Cloud Computing Governance Lifecycle

    OpenAIRE

    Soňa Karkošková; George Feuerlicht

    2016-01-01

    Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges such as need of business process redefinition, establishment of specialized governance and management, organizational structures and relationships with external providers and managing new types of risk arising from dependency on external providers. There is a general consensus that cloud computing in addition to challenges brings many benefits but it is uncle...

  15. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  16. Dissociating variability and effort as determinants of coordination.

    Directory of Open Access Journals (Sweden)

    Ian O'Sullivan

    2009-04-01

    Full Text Available When coordinating movements, the nervous system often has to decide how to distribute work across a number of redundant effectors. Here, we show that humans solve this problem by trying to minimize both the variability of motor output and the effort involved. In previous studies that investigated the temporal shape of movements, these two selective pressures, despite having very different theoretical implications, could not be distinguished; because noise in the motor system increases with the motor commands, minimization of effort or variability leads to very similar predictions. When multiple effectors with different noise and effort characteristics have to be combined, however, these two cost terms can be dissociated. Here, we measure the importance of variability and effort in coordination by studying how humans share force production between two fingers. To capture variability, we identified the coefficient of variation of the index and little fingers. For effort, we used the sum of squared forces and the sum of squared forces normalized by the maximum strength of each effector. These terms were then used to predict the optimal force distribution for a task in which participants had to produce a target total force of 4-16 N, by pressing onto two isometric transducers using different combinations of fingers. By comparing the predicted distribution across fingers to the actual distribution chosen by participants, we were able to estimate the relative importance of variability and effort as 1:7, with the unnormalized effort being most important. Our results indicate that the nervous system uses multi-effector redundancy to minimize both the variability of the produced output and effort, although effort costs clearly outweighed variability costs.
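
    The cost structure described in the abstract can be made concrete with a small numerical sketch. The code below is not the authors' analysis; it simply assumes a weighted sum of a variability cost (signal-dependent noise scaled by each finger's coefficient of variation) and an unnormalized squared-force effort cost, mirroring the 1:7 weighting reported above, and searches for the force split between two fingers that minimizes it for a given target total force. The coefficients of variation and the grid search are illustrative assumptions.

      import numpy as np

      def optimal_force_split(target_force, cv_index=0.05, cv_little=0.12,
                              w_variability=1.0, w_effort=7.0):
          """Grid search over index/little-finger force splits, minimizing a
          weighted sum of variability and squared-force effort (all values assumed)."""
          f_index = np.linspace(0.0, target_force, 1001)
          f_little = target_force - f_index
          # Signal-dependent noise: each finger's standard deviation grows with its force.
          variability = np.sqrt((cv_index * f_index) ** 2 + (cv_little * f_little) ** 2)
          # Unnormalized effort: sum of squared forces.
          effort = f_index ** 2 + f_little ** 2
          cost = w_variability * variability + w_effort * effort
          best = np.argmin(cost)
          return f_index[best], f_little[best]

      for target in (4.0, 8.0, 16.0):
          fi, fl = optimal_force_split(target)
          print(f"target {target:4.1f} N -> index {fi:.2f} N, little {fl:.2f} N")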

  17. What makes a reach movement effortful? Physical effort discounting supports common minimization principles in decision making and motor control.

    Directory of Open Access Journals (Sweden)

    Pierre Morel

    2017-06-01

    Full Text Available When deciding between alternative options, a rational agent chooses on the basis of the desirability of each outcome, including associated costs. As different options typically result in different actions, the effort associated with each action is an essential cost parameter. How do humans discount physical effort when deciding between movements? We used an action-selection task to characterize how subjective effort depends on the parameters of arm transport movements and controlled for potential confounding factors such as delay discounting and performance. First, by repeatedly asking subjects to choose between 2 arm movements of different amplitudes or durations, performed against different levels of force, we identified parameter combinations that subjects experienced as identical in effort (isoeffort curves. Movements with a long duration were judged more effortful than short-duration movements against the same force, while movement amplitudes did not influence effort. Biomechanics of the movements also affected effort, as movements towards the body midline were preferred to movements away from it. Second, by introducing movement repetitions, we further determined that the cost function for choosing between effortful movements had a quadratic relationship with force, while choices were made on the basis of the logarithm of these costs. Our results show that effort-based action selection during reaching cannot easily be explained by metabolic costs. Instead, force-loaded reaches, a widely occurring natural behavior, imposed an effort cost for decision making similar to cost functions in motor control. Our results thereby support the idea that motor control and economic choice are governed by partly overlapping optimization principles.
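
    To illustrate the shape of the reported cost function, and not the authors' fitted model or parameter values, the sketch below assumes a subjective effort cost that grows quadratically with force and increases with duration and the number of repetitions, and compares two movement options on the logarithm of these costs, as described in the abstract; the coefficients and the softmax decision rule are assumptions.

      import math

      def subjective_cost(force_newtons, duration_s, repetitions=1,
                          k_force=1.0, k_duration=1.0):
          """Illustrative effort cost: quadratic in force, scaled by duration
          and number of repetitions (coefficients are assumptions)."""
          return repetitions * k_duration * duration_s * k_force * force_newtons ** 2

      def p_choose_first(option_a, option_b, beta=3.0):
          """Softmax choice on the logarithm of the costs (beta is an assumed
          decision-noise parameter); lower log-cost options are chosen more often."""
          la = math.log(subjective_cost(*option_a))
          lb = math.log(subjective_cost(*option_b))
          return 1.0 / (1.0 + math.exp(beta * (la - lb)))

      # (force in N, duration in s): short vs. long movement against the same force,
      # then low vs. high force for the same duration.
      print(p_choose_first((2.0, 0.5), (2.0, 1.5)))
      print(p_choose_first((2.0, 1.0), (6.0, 1.0)))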

  18. Dopamine does double duty in motivating cognitive effort

    Science.gov (United States)

    Westbrook, Andrew; Braver, Todd S.

    2015-01-01

    Cognitive control is subjectively costly, suggesting that engagement is modulated in relationship to incentive state. Dopamine appears to play key roles. In particular, dopamine may mediate cognitive effort by two broad classes of functions: 1) modulating the functional parameters of working memory circuits subserving effortful cognition, and 2) mediating value-learning and decision-making about effortful cognitive action. Here we tie together these two lines of research, proposing how dopamine serves “double duty”, translating incentive information into cognitive motivation. PMID:26889810

  19. Worldwide collaborative efforts in plasma control software development

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Walker, M.L.; Humphreys, D.A.; Leuer, J.A.; Piglowski, D.A.; Johnson, R.D.; Xiao, B.J.; Hahn, S.H.; Gates, D.A.

    2008-01-01

    This presentation will describe the DIII-D collaborations with various tokamak experiments throughout the world which have adapted custom versions of the DIII-D plasma control system (PCS) software for their own use. Originally developed by General Atomics for use on the DIII-D tokamak, the PCS has been successfully installed and used for the NSTX experiment in Princeton, the MAST experiment in Culham UK, the EAST experiment in China, and the Pegasus experiment in the University of Wisconsin. In addition to these sites, a version of the PCS is currently being developed for use by the KSTAR tokamak in Korea. A well-defined and robust PCS software infrastructure has been developed to provide a common foundation for implementing the real-time data acquisition and feedback control codes. The PCS infrastructure provides a flexible framework that has allowed the PCS to be easily adapted to fulfill the unique needs of each site. The software has also demonstrated great flexibility in allowing for different computing, data acquisition and real-time networking hardware to be used. A description of the current PCS software architecture will be given along with experiences in developing and supporting the various PCS installations throughout the world

  20. Power plant process computer

    International Nuclear Information System (INIS)

    Koch, R.

    1982-01-01

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line with respect to real-time requirements but are not closed-loop control tasks. The general scope of tasks includes alarm annunciation on CRTs, data logging, data recording for post-trip reviews and plant behaviour analysis, nuclear data computation, and graphic displays. Process computers are additionally used for dedicated tasks such as the aeroball measuring system and the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)

  1. Efficient computation of hashes

    International Nuclear Information System (INIS)

    Lopes, Raul H C; Franqueira, Virginia N L; Hobson, Peter R

    2014-01-01

    The sequential computation of hashes at the core of many distributed storage systems and found, for example, in grid services can hinder efficiency in service quality and even pose security challenges that can only be addressed by the use of parallel hash tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle-Damgard engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype for a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
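
    As a rough sketch of the parallel alternative discussed, and not the paper's Keccak prototype, the following Python fragment hashes fixed-size chunks of a message independently, so the leaf hashes can in principle be computed in parallel, and then combines them pairwise into a binary Merkle tree; hashlib.sha3_256 is used simply because it is the standard-library SHA-3, and the chunk size, padding and domain-separation bytes are assumptions.

      import hashlib
      from concurrent.futures import ThreadPoolExecutor

      def leaf_hash(chunk: bytes) -> bytes:
          return hashlib.sha3_256(b"\x00" + chunk).digest()          # domain-separated leaf

      def node_hash(left: bytes, right: bytes) -> bytes:
          return hashlib.sha3_256(b"\x01" + left + right).digest()   # internal node

      def merkle_root(data: bytes, chunk_size: int = 1 << 16) -> bytes:
          chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)] or [b""]
          # Leaf hashes are independent of one another, so they can run in parallel.
          with ThreadPoolExecutor() as pool:
              level = list(pool.map(leaf_hash, chunks))
          # Combine pairwise until a single root remains.
          while len(level) > 1:
              if len(level) % 2:
                  level.append(level[-1])   # duplicate the last node on odd levels
              level = [node_hash(level[i], level[i + 1]) for i in range(0, len(level), 2)]
          return level[0]

      print(merkle_root(b"example payload " * 100000).hex())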

  2. [Effort-reward imbalance at work and depression: current research evidence].

    Science.gov (United States)

    Siegrist, J

    2013-01-01

    In view of highly prevalent stressful conditions in modern working life, in particular increasing work pressure and job insecurity, it is of interest to know whether specific constellations of an adverse psychosocial work environment increase the risk of depressive disorder among employed people. This contribution gives a short overview of current research evidence based on an internationally established work stress model of effort-reward imbalance. Taken together, results from seven prospective epidemiological investigations demonstrate a two-fold elevated relative risk of incident depressive disorder over a mean observation period of 2.7 years among exposed versus non-exposed employees. Additional findings from experimental and quasi-experimental studies point to robust associations of effort-reward imbalance at work with proinflammatory cytokines and markers of reduced immune competence. These latter markers may indicate potential psychobiological pathways. In conclusion, incorporating this new knowledge into medical treatment and preventive efforts seems well justified.

  3. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  4. Computational Modeling for Enhancing Soft Tissue Image Guided Surgery: An Application in Neurosurgery.

    Science.gov (United States)

    Miga, Michael I

    2016-01-01

    With the recent advances in computing, the opportunities to translate computational models to more integrated roles in patient treatment are expanding at an exciting rate. One area of considerable development has been directed towards correcting soft tissue deformation within image guided neurosurgery applications. This review captures the efforts that have been undertaken towards enhancing neuronavigation by the integration of soft tissue biomechanical models, imaging and sensing technologies, and algorithmic developments. In addition, the review speaks to the evolving role of modeling frameworks within surgery and concludes with some future directions beyond neurosurgical applications.

  5. Effects of sulphur addition on ... and mechanical properties ...

    African Journals Online (AJOL)

    User

    Effects of sulphur addition on ... and mechanical properties of ...; C. W. Onyia et al., Dept. of Metallurgical and Materials ...

  6. Students' Academic Performance: Academic Effort Is an Intervening ...

    African Journals Online (AJOL)

    This study was designed to seek explanations for differences in academic performance among junior ...

  7. A critical evaluation of internal revenue generating efforts of some ...

    African Journals Online (AJOL)

    Their bad financial state has robbed our rural areas of ... electricity, food production, staff welfare and the general condition of living in the rural areas. This ugly ... In spite of these efforts by successive governments, the problem persists.

  8. Reviewing efforts in global forest conservation for sustainable forest ...

    African Journals Online (AJOL)

    Reviewing efforts in global forest conservation for sustainable forest management: The World Wide Fund (WWF) case study. ... Global Journal of Pure and Applied Sciences.

  9. NUCLEAR NONPROLIFERATION: U.S. Efforts to Combat Nuclear Smuggling

    National Research Council Canada - National Science Library

    2002-01-01

    ... information about efforts to combat nuclear smuggling at U.S. borders. My statement today is based on the results of our May 16, 2002, report on this subject and information we obtained from the U.S...

  10. AMRDEC's HWIL Synthetic Environment Development Efforts for LADAR Sensors

    National Research Council Canada - National Science Library

    Kim, Hajin J; Cornell, Michael C; Naumann, Charles B

    2004-01-01

    ... With emerging sensor/electronics technology, LADAR sensors are becoming a more viable option as an integral part of weapon systems, and AMCOM has been expending efforts to develop the capabilities...

  11. Cognitive dissonance in children: justification of effort or contrast?

    Science.gov (United States)

    Alessandri, Jérôme; Darcheville, Jean-Claude; Zentall, Thomas R

    2008-06-01

    Justification of effort is a form of cognitive dissonance in which the subjective value of an outcome is directly related to the effort that went into obtaining it. However, it is likely that in social contexts (such as the requirements for joining a group) an inference can be made (perhaps incorrectly) that an outcome that requires greater effort to obtain in fact has greater value. Here we present evidence that a cognitive dissonance effect can be found in children under conditions that offer better control for the social value of the outcome. This effect is quite similar to contrast effects that recently have been studied in animals. We suggest that contrast between the effort required to obtain the outcome and the outcome itself provides a more parsimonious account of this phenomenon and perhaps other related cognitive dissonance phenomena as well. Research will be needed to identify cognitive dissonance processes that are different from contrast effects of this kind.

  12. Students Collaborating to Undertake Tracking Efforts for Sturgeon(SCUTES)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Students Collaborating to Undertake Tracking Efforts for Sturgeon (SCUTES) is a collaboration between NOAA Fisheries, sturgeon researchers, and teachers/educators in...

  13. A technique for estimating maximum harvesting effort in a stochastic ...

    Indian Academy of Sciences (India)

    Unknown

    Estimation of maximum harvesting effort has a great impact on the ... fluctuating environment has been developed in a two-species competitive system, which shows that under realistic .... The existence and local stability properties of the equi-.

  14. Book Promotion Efforts in Select Nigerian Newspapers Okere ...

    African Journals Online (AJOL)

    Mrs Afam

    them make informed purchase decisions. Hitherto, the ... for product promotion compared to the efforts of manufacturers of consumer goods and other ... The extent of promotion done by a publisher greatly affects the rate of orders placed.

  15. Effort-Based Decision-Making in Schizophrenia.

    Science.gov (United States)

    Culbreth, Adam J; Moran, Erin K; Barch, Deanna M

    2018-08-01

    Motivational impairment has long been associated with schizophrenia but the underlying mechanisms are not clearly understood. Recently, a small but growing literature has suggested that aberrant effort-based decision-making may be a potential contributory mechanism for motivational impairments in psychosis. Specifically, multiple reports have consistently demonstrated that individuals with schizophrenia are less willing than healthy controls to expend effort to obtain rewards. Further, this effort-based decision-making deficit has been shown to correlate with severity of negative symptoms and level of functioning, in many but not all studies. In the current review, we summarize this literature and discuss several factors that may underlie aberrant effort-based decision-making in schizophrenia.

  16. Muscle strength, working capacity and effort in patients with fibromyalgia

    DEFF Research Database (Denmark)

    Nørregaard, J; Bülow, P M; Lykkegaard, J J

    1997-01-01

    exercise capacity, work status and psychometric scoring (SCL-90-R) were correlated. The fibromyalgia patients exhibited a significant reduction in voluntary muscle strength of the knee and elbow flexors and extensors, in the order of 20-30%. However, the coefficient of variation was higher among patients, thus indicating lower effort. The physical performance during an ergometer test corresponded to a maximal oxygen consumption of 21 ml/kg/min. The maximal increase in heart rate was only 63% (44-90%) of the predicted increase. Degree of effort or physical capacity did not correlate with psychometric scores. Work status was related to psychometric scoring, but not to physical capacity or effort. In conclusion, we found a low degree of effort but near normal physical capacity in the fibromyalgia patients.

  17. Does preoperative measurement of cerebral blood flow with acetazolamide challenge in addition to preoperative measurement of cerebral blood flow at the resting state increase the predictive accuracy of development of cerebral hyperperfusion after carotid endarterectomy? Results from 500 cases with brain perfusion single-photon emission computed tomography study.

    Science.gov (United States)

    Oshida, Sotaro; Ogasawara, Kuniaki; Saura, Hiroaki; Yoshida, Koji; Fujiwara, Shunro; Kojima, Daigo; Kobayashi, Masakazu; Yoshida, Kenji; Kubo, Yoshitaka; Ogawa, Akira

    2015-01-01

    The purpose of the present study was to determine whether preoperative measurement of cerebral blood flow (CBF) with acetazolamide in addition to preoperative measurement of CBF at the resting state increases the predictive accuracy of development of cerebral hyperperfusion after carotid endarterectomy (CEA). CBF at the resting state and cerebrovascular reactivity (CVR) to acetazolamide were quantitatively assessed using the N-isopropyl-p-[(123)I]-iodoamphetamine (IMP) autoradiography method with single-photon emission computed tomography (SPECT) before CEA in 500 patients with ipsilateral internal carotid artery stenosis (≥ 70%). CBF measurement using (123)I-IMP SPECT was also performed immediately and 3 days after CEA. A region of interest (ROI) was automatically placed in the middle cerebral artery territory in the affected cerebral hemisphere using a three-dimensional stereotactic ROI template. Preoperative decreases in CBF at the resting state [95% confidence intervals (CIs), 0.855 to 0.967; P = 0.0023] and preoperative decreases in CVR to acetazolamide (95% CIs, 0.844 to 0.912) were associated with the development of cerebral hyperperfusion after CEA, and prediction was more accurate when preoperative CVR to acetazolamide was considered in addition to CBF at the resting state (difference between areas, 0.173). These results suggest that preoperative measurement of CBF with acetazolamide in addition to preoperative measurement of CBF at the resting state increases the predictive accuracy of the development of post-CEA hyperperfusion.

  18. Self-managed working time and employee effort: Microeconometric evidence

    OpenAIRE

    Beckmann, Michael; Cornelissen, Thomas

    2014-01-01

    Based on German individual-level panel data, this paper empirically examines the impact of self-managed working time (SMWT) on employee effort. Theoretically, workers may respond positively or negatively to having control over their own working hours, depending on whether SMWT increases work morale, induces reciprocal work intensification, or encourages employee shirking. We find that SMWT employees exert higher effort levels than employees with fixed working hours, but after accounting for o...

  19. VIOLENCE IN MARKETING COMMUNICATION EFFORTS OF FASHION BRANDS: SHOCKVERTISING

    OpenAIRE

    BAYAZIT, Zeynep; PANAYIRCI, Uğur Cevdet

    2016-01-01

    Contemporary social and technological changes inevitably affect consumer behaviour. Today's customers are savvy, short of time and hard to persuade. This new relationship between customers and brands has a deeper impact on competitive industries such as fashion. Fashion brands are eager to adopt shocking themes in their marketing communication efforts in order to emotionally affect and challenge consumers. The aim of this study is to examine, from a critical perspective, the advertising efforts of fa...

  20. Communication Breakdown: Unraveling the Islamic State's Media Efforts

    Science.gov (United States)

    2016-10-01

    Communication Breakdown: Unraveling the Islamic State’s Media Efforts, by Daniel Milton. ... production arm of the central media office). The high level of communication between the central media office and the satellite offices illustrates the tension ... and discussed by the mass media. Those products are likely important to the group’s recruitment efforts, but clearly it is trying to portray itself

  1. Shell Inspection History and Current CMM Inspection Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Montano, Joshua Daniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-26

    The following report provides a review of past and current CMM Shell Inspection efforts. Calibration of the Sheffield rotary contour gauge has expired and the primary inspector, Matthew Naranjo, has retired. Efforts within the Inspection team are transitioning from maintaining and training new inspectors on Sheffield to off-the-shelf CMM technology. Although inspection of a shell has many requirements, the scope of the data presented in this report focuses on the inner contour, outer contour, radial wall thickness and mass comparisons.

  2. Motivation and effort in individuals with social anhedonia.

    Science.gov (United States)

    McCarthy, Julie M; Treadway, Michael T; Blanchard, Jack J

    2015-06-01

    It has been proposed that anhedonia may, in part, reflect difficulties in reward processing and effortful decision making. The current study aimed to replicate previous findings of effortful decision making deficits associated with elevated anhedonia and expand upon these findings by investigating whether these decision making deficits are specific to elevated social anhedonia or are also associated with elevated positive schizotypy characteristics. The current study compared controls (n=40) to individuals elevated on social anhedonia (n=30), and individuals elevated on perceptual aberration/magical ideation (n=30) on the Effort Expenditure for Rewards Task (EEfRT). Across groups, participants chose a higher proportion of hard tasks with increasing probability of reward and reward magnitude, demonstrating sensitivity to probability and reward values. Contrary to our expectations, when the probability of reward was most uncertain (50% probability), at low and medium reward values, the social anhedonia group demonstrated more effortful decision making than either individuals high in positive schizotypy or controls. The positive schizotypy group only differed from controls (making less effortful choices than controls) when reward probability was lowest (12%) and the magnitude of reward was the smallest. Our results suggest that social anhedonia is related to intact motivation and effort for monetary rewards, but that individuals with this characteristic display a unique and perhaps inefficient pattern of effort allocation when the probability of reward is most uncertain. Future research is needed to better understand effortful decision making and the processing of reward across a range of individual difference characteristics. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Working from Home - What is the Effect on Employees' Effort?

    OpenAIRE

    Rupietta, Kira; Beckmann, Michael

    2016-01-01

    This paper investigates how working from home affects employees' work effort. Employees, who have the possibility to work from home, have a high autonomy in scheduling their work and therefore are assumed to have a higher intrinsic motivation. Thus, we expect working from home to positively influence work effort of employees. For the empirical analysis we use the German Socio-Economic Panel (SOEP). To account for self-selection into working locations we use an instrumental variable (IV) estim...

  4. Goddard Technology Efforts to Improve Space Borne Laser Reliability

    Science.gov (United States)

    Heaps, William S.

    2006-01-01

    In an effort to reduce the risk, perceived and actual, of employing instruments containing space borne lasers, NASA initiated the Laser Risk Reduction Program (LRRP) in 2001. This program, managed jointly by NASA Langley and NASA Goddard and employing laser researchers from government, university and industrial labs, is nearing the conclusion of its planned 5-year duration. This paper will describe some of the efforts and results obtained by the Goddard half of the program.

  5. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  6. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  7. Job Satisfaction, Effort, and Performance: A Reasoned Action Perspective

    Directory of Open Access Journals (Sweden)

    Icek Ajzen

    2011-12-01

    Full Text Available In this article the author takes issue with the recurrent reliance on job satisfaction to explain job-related effort and performance.  The disappointing findings in this tradition are explained by lack of compatibility between job satisfaction (a very broad attitude) and the more specific effort and performance criteria.  Moreover, attempts to apply the expectancy-value model of attitude to explore the determinants of effort and performance suffer from reliance on unrepresentative sets of beliefs about the likely consequences of these behaviors.  The theory of planned behavior (Ajzen, 1991, 2012), with its emphasis on the proximal antecedents of job effort and performance, is offered as an alternative.  According to the theory, intentions to exert effort and to attain a certain performance level are determined by attitudes, subjective norms, and perceptions of control in relation to these behaviors; and these variables, in turn, are a function of readily accessible beliefs about the likely outcomes of effort and performance, about the normative expectations of important others, and about factors that facilitate or hinder effective performance.

  8. The RBANS Effort Index: base rates in geriatric samples.

    Science.gov (United States)

    Duff, Kevin; Spering, Cynthia C; O'Bryant, Sid E; Beglinger, Leigh J; Moser, David J; Bayless, John D; Culp, Kennith R; Mold, James W; Adams, Russell L; Scott, James G

    2011-01-01

    The Effort Index (EI) of the RBANS was developed to assist clinicians in discriminating patients who demonstrate good effort from those with poor effort. However, there are concerns that older adults might be unfairly penalized by this index, which uses uncorrected raw scores. Using five independent samples of geriatric patients with a broad range of cognitive functioning (e.g., cognitively intact, nursing home residents, probable Alzheimer's disease), base rates of failure on the EI were calculated. In cognitively intact and mildly impaired samples, few older individuals were classified as demonstrating poor effort (e.g., 3% in cognitively intact). However, in the more severely impaired geriatric patients, over one third had EI scores that fell above suggested cutoff scores (e.g., 37% in nursing home residents, 33% in probable Alzheimer's disease). In the cognitively intact sample, older and less educated patients were more likely to have scores suggestive of poor effort. Education effects were observed in three of the four clinical samples. Overall cognitive functioning was significantly correlated with EI scores, with poorer cognition being associated with greater suspicion of low effort. The current results suggest that age, education, and level of cognitive functioning should be taken into consideration when interpreting EI results and that significant caution is warranted when examining EI scores in elders suspected of having dementia.

  9. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Full Text Available Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multimodal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  10. Effects of Transformational and Transactional Leadership on Cognitive Effort and Outcomes during Collaborative Learning within a Virtual World

    Science.gov (United States)

    Kahai, Surinder; Jestire, Rebecca; Huang, Rui

    2013-01-01

    Computer-supported collaborative learning is a common e-learning activity. Instructors have to create appropriate social and instructional interventions in order to promote effective learning. We performed a study that examined the effects of two popular leadership interventions, transformational and transactional, on cognitive effort and outcomes…

  11. Non-additive measures theory and applications

    CERN Document Server

    Narukawa, Yasuo; Sugeno, Michio; 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012)

    2014-01-01

    This book provides a comprehensive and timely report in the area of non-additive measures and integrals. It is based on a panel session on fuzzy measures, fuzzy integrals and aggregation operators held during the 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012) in Girona, Spain, November 21-23, 2012. The book complements the MDAI 2012 proceedings book, published in Lecture Notes in Computer Science (LNCS) in 2012. The individual chapters, written by key researchers in the field, cover fundamental concepts and important definitions (e.g. the Sugeno integral, definition of entropy for non-additive measures) as well as some important applications (e.g. to economics and game theory) of non-additive measures and integrals. The book addresses students, researchers and practitioners working at the forefront of their field.
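
    For orientation, the discrete Sugeno integral mentioned in the record can be stated in standard notation (this formulation is not quoted from the book): given a finite set X = {x_1, ..., x_n}, a monotone non-additive measure \mu with \mu(\emptyset) = 0 and \mu(X) = 1, and a function f from X to [0,1],

        \operatorname{Su}_\mu(f) \;=\; \max_{i=1,\dots,n} \min\bigl( f(x_{(i)}),\; \mu(A_{(i)}) \bigr),

    where the elements are reordered so that f(x_{(1)}) \le \dots \le f(x_{(n)}) and A_{(i)} = \{x_{(i)}, \dots, x_{(n)}\}. Replacing min/max by product/sum would recover an additive expectation, which is what makes the measure "non-additive".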

  12. A hardness result for core stability in additive hedonic games

    NARCIS (Netherlands)

    Woeginger, G.J.

    2013-01-01

    We investigate the computational complexity of a decision problem in hedonic coalition formation games. We prove that core stability in additive hedonic games is complete for the second level of the polynomial hierarchy.
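
    For reference, the standard definitions behind this result (stated here in common notation, not quoted from the paper) are: in an additively separable hedonic game, each player i assigns a value v_i(j) to every other player j, and the utility of player i in a coalition S containing i is

        u_i(S) \;=\; \sum_{j \in S \setminus \{i\}} v_i(j).

    A partition \pi of the players is core stable if there is no non-empty coalition S with u_i(S) > u_i(\pi(i)) for every i in S, where \pi(i) denotes the coalition containing i in \pi. The completeness result above concerns deciding whether such a core stable partition exists.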

  13. Biomechanical Comparison of Three Perceived Effort Set Shots in Team Handball Players.

    Science.gov (United States)

    Plummer, Hillary A; Gascon, Sarah S; Oliver, Gretchen D

    2017-01-01

    Plummer, HA, Gascon, SS, and Oliver, GD. Biomechanical comparison of three perceived effort set shots in team handball players. J Strength Cond Res 31(1): 80-87, 2017-Shoulder injuries are prevalent in the sport of team handball; however, no guidelines currently exist for the implementation of an interval throwing protocol for players returning from an upper extremity injury. These guidelines exist for the sport of baseball, but team handball may present additional challenges due to the greater ball mass that must be accounted for. The purpose of this study was to examine kinematic differences in the team handball set shot at 50, 75, and 100% effort, which are common throwing intensities in throwing protocols. Eleven male team handball players (23.09 ± 3.05 years; 185.12 ± 8.33 cm; 89.65 ± 12.17 kg) volunteered. An electromagnetic tracking system was used to collect kinematic data at the pelvis, trunk, scapula, and shoulder. Kinematic differences at the shoulder, trunk, and pelvis were observed across effort levels throughout the set shot, with most occurring at ball release and maximum internal rotation. Significant differences in ball speed were observed between all 3 effort level shots. Team handball players are able to gauge the effort at which they shoot; however, it cannot be assumed that these speeds will be at a certain percentage of their maximum. The results of this study provide valuable evidence that can be used to prepare a team handball player to return to throwing activities.

  14. Why don't you try harder? An investigation of effort production in major depression.

    Directory of Open Access Journals (Sweden)

    Marie-Laure Cléry-Melin

    Full Text Available Depression is mainly characterized as an emotional disorder, associated with reduced approach behavior. It remains unclear whether the difficulty in energising behavior relates to abnormal emotional states or to a flattened response to potential rewards, as suggested by several neuroimaging studies. Here, we aimed to demonstrate a specific incentive motivation deficit in major depression, independent of patients' emotional state. We employed a behavioral paradigm designed to measure physical effort in response to both emotional modulation and incentive motivation. Patients did exert more effort following emotionally arousing pictures (whether positive or negative) but not for higher monetary incentives, contrary to healthy controls. These results show that emotional and motivational sources of effort production are dissociable in pathological conditions. In addition, patients' ratings of perceived effort increased for high incentives, whereas controls' ratings decreased. Thus, depressed patients objectively behave as if they do not want to gain larger rewards, but subjectively feel that they try harder. We suggest that incentive motivation impairment is a core deficit of major depression, which may render everyday tasks abnormally effortful for patients.

  15. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  16. Cerebral blood flow, fatigue, mental effort, and task performance in offices with two different pollution loads

    DEFF Research Database (Denmark)

    Nishihara, Naoe; Wargocki, Pawel; Tanabe, Shin-ichi

    2014-01-01

    The effects of indoor air quality on symptoms, perceptions, task performance, cerebral blood flow, fatigue, and mental effort of individuals working in an office were investigated. Twenty-four right-handed Danish female subjects in an office were exposed in groups of two at a time to two air...... pollution levels created by placing or removing a pollution source (i.e. a used carpet) behind a screen. During the exposure, the subjects performed four different office tasks presented on a computer monitor. The tasks were performed at two paces: normal and maximum. When the pollution source was present...... any effects caused by modifying pollution exposure, they were well correlated with increased mental effort when the tasks were performed at maximum pace and subjectively reported fatigue, which increased during the course of exposure, respectively....

  17. ADDITIVES USED TO OBTAIN FOOD

    Directory of Open Access Journals (Sweden)

    Dorina Ardelean

    2012-01-01

    Full Text Available The use of food additives is driven by the growing food needs of the world's population. Additives used in food, both natural and artificial, help to improve organoleptic characteristics and to preserve food for longer, but it must not be forgotten that these additives do not occur naturally in food products. Some of these additives are not harmful to humans in small quantities, but others may have harmful effects on health.

  18. Appointment scheduling on computer.

    Science.gov (United States)

    Mercando, A D

    1997-07-01

    The program is well-written, intuitive, and easy to use once initial data, such as the available appointment slots, has been entered. While the effort may not seem worthwhile initially, the ability to access an office appointment book from several locations simultaneously and the reporting capabilities of the software make MEDSched a useful addition to any busy office practice or clinic. Please send your comments and suggestions to me at adm4@columbia.edu.

  19. Audit Report "Department of Energy Efforts to Manage Information Technology Resources in an Energy-Efficient and Environmentally Responsible Manner"

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-05-01

    The American Recovery and Reinvestment Act of 2009 emphasizes energy efficiency and conservation as critical to the Nation's economic vitality; its goal of reducing dependence on foreign energy sources; and related efforts to improve the environment. The Act highlights the significant use of various forms of energy in the Federal sector and promotes efforts to improve the energy efficiency of Federal operations. One specific area of interest is the increasing demand for Federal sector computing resources and the corresponding increase in energy use, with both cost and environmental implications. The U.S. Environmental Protection Agency reported that, without aggressive conservation measures, data center energy consumption alone is expected to double over the next five years. In our report on Management of the Department's Data Centers at Contractor Sites (DOE/IG-0803, October 2008) we concluded that the Department of Energy had not always improved the efficiency of its contractor data centers even when such modifications were possible and practical. Despite its recognized energy conservation leadership role, the Department had not always taken advantage of opportunities to reduce energy consumption associated with its information technology resources. Nor had it ensured that resources were managed in a way that minimized impact on the environment. In particular: (1) The seven Federal and contractor sites included in our review had not fully reduced energy consumption through implementation of power management settings on their desktop and laptop computers; and, as a consequence, spent $1.6 million more on energy costs than necessary in Fiscal Year 2008; (2) None of the sites reviewed had taken advantage of opportunities to reduce energy consumption, enhance cyber security, and reduce costs available through the use of techniques such as 'thin-client computing' in their unclassified environments; and, (3) Sites had not always taken the

  20. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, which tries to unify the GPGPU computing models.
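
    As a minimal illustration of the CUDA programming model described above (an independent sketch, not code from the paper; all identifiers are illustrative), the following program adds two vectors by launching one GPU thread per element:

        // vecadd.cu -- compile with: nvcc vecadd.cu -o vecadd
        #include <cstdio>
        #include <cuda_runtime.h>

        // Kernel: each thread handles one element (data-based parallelism).
        __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) c[i] = a[i] + b[i];
        }

        int main() {
            const int n = 1 << 20;
            const size_t bytes = n * sizeof(float);

            // Allocate and fill host buffers.
            float *ha = (float *)malloc(bytes);
            float *hb = (float *)malloc(bytes);
            float *hc = (float *)malloc(bytes);
            for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

            // Allocate device buffers and copy inputs to the GPU.
            float *da, *db, *dc;
            cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
            cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
            cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

            // Launch enough 256-thread blocks to cover all n elements.
            const int threads = 256;
            const int blocks = (n + threads - 1) / threads;
            vecAdd<<<blocks, threads>>>(da, db, dc, n);
            cudaDeviceSynchronize();

            // Copy the result back and spot-check one element.
            cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
            printf("c[0] = %f (expected 3.0)\n", hc[0]);

            cudaFree(da); cudaFree(db); cudaFree(dc);
            free(ha); free(hb); free(hc);
            return 0;
        }

    The same data-parallel pattern carries over to OpenCL: the kernel body is essentially unchanged, while the host code uses the OpenCL runtime API instead of the CUDA runtime.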