WorldWideScience

Sample records for previous computer experience

  1. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

Full Text Available Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that, despite many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potential of cloud computing and its implementation on the market. The purpose of this research was to identify individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify differences in opinion regarding the importance of business model factors for cloud computing adoption according to companies’ previous experiences with cloud computing services.

  2. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  3. Specific Previous Experience Affects Perception of Harmony and Meter

    Science.gov (United States)

    Creel, Sarah C.

    2011-01-01

    Prior knowledge shapes our experiences, but which prior knowledge shapes which experiences? This question is addressed in the domain of music perception. Three experiments were used to determine whether listeners activate specific musical memories during music listening. Each experiment provided listeners with one of two musical contexts that was…

  4. COMPUTER CONTROL OF BEHAVIORAL EXPERIMENTS.

    Science.gov (United States)

    SIEGEL, LOUIS

The LINC computer provides a particular schedule of reinforcement for behavioral experiments by executing a sequence of computer operations in conjunction with a specially designed interface. The interface is the means of communication between the experimental chamber and the computer. The program and interface of an experiment involving a pigeon…

  5. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic...... of configurations that can be tested is limited. Papers B and C introduce a novel experimental plan for simulation models having two types of input factors. The plan differentiates between factors that can be controlled in both the simulation model and the physical system and factors that are only controllable...... in the simulation model but simply observed in the physical system. Factors that are only controllable in the simulation model are called uncontrollable factors and they correspond to the environmental factors influencing the physical system. Applying the experimental framework on the simulation model in Paper...

  6. Computed radiography - our experience

    International Nuclear Information System (INIS)

    Williams, C.

    1997-01-01

Computed Radiography (CR) is the digital acquisition of plain X-ray images using phosphor plate technology. This allows post-processing and transmission of images to remote locations. St. Vincent's Public Hospital in Melbourne has had the benefit of two separate CR systems which have been implemented over the past three years. CR is a significant advance in radiographic imaging and is evolving continuously. The last few years have been a period of change and development for all staff which has proved both challenging and rewarding. Further development is required before the system is implemented completely. (author)

  7. The Role of Previous Experience and Attitudes toward Statistics in Statistics Assessment Outcomes among Undergraduate Psychology Students

    Science.gov (United States)

    Dempster, Martin; McCorry, Noleen K.

    2009-01-01

    Previous research has demonstrated that students' cognitions about statistics are related to their performance in statistics assessments. The purpose of this research is to examine the nature of the relationships between undergraduate psychology students' previous experiences of maths, statistics and computing; their attitudes toward statistics;…

  8. Computer loss experience and predictions

    Science.gov (United States)

    Parker, Donn B.

    1996-03-01

The types of losses organizations must anticipate have become more difficult to predict because of the eclectic nature of computers and data communications and the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause. On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code secretly inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of

  9. The role of previous experience and attitudes toward statistics in statistics assessment outcomes among undergraduate psychology students

    OpenAIRE

    Dempster, Martin; McCorry, Noleen

    2009-01-01

    Previous research has demonstrated that students’ cognitions about statistics are related to their performance in statistics assessments. The purpose of this research is to examine the nature of the relationships between undergraduate psychology students’ previous experiences of maths, statistics and computing; their attitudes toward statistics; and assessment on a statistics course. Of the variables examined, the strongest predictor of assessment outcome was students’ attitude about their in...

  10. Analysis of previous perceptual and motor experience in breaststroke kick learning

    Directory of Open Access Journals (Sweden)

    Ried Bettina

    2015-12-01

Full Text Available One of the variables that influence motor learning is the learner’s previous experience, which may provide perceptual and motor elements to be transferred to a novel motor skill. For swimming skills, several motor experiences may prove effective. Purpose. The aim was to analyse the influence of previous experience in playing in water, swimming lessons, and music or dance lessons on learning the breaststroke kick. Methods. The study involved 39 Physical Education students who possessed basic swimming skills, but not the breaststroke, and who performed 400 acquisition trials followed by 50 retention and 50 transfer trials, during which the stroke index as well as rhythmic and spatial configuration indices were mapped; participants also answered a yes/no questionnaire regarding previous experience. Data were analysed by ANOVA (p = 0.05), with effect size measured by Cohen’s d (d ≥ 0.8 indicating a large effect). Results. The whole sample improved their stroke index and spatial configuration index, but not their rhythmic configuration index. Although differences between groups were not significant, two effects were practically large: experience of playing in water during childhood had clearly positive effects on learning, whereas no experience in any of the three fields hampered the learning process. Conclusions. The results point towards a diverse impact of previous experience with rhythmic activities, swimming lessons, and especially playing in water during childhood, on learning the breaststroke kick.
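
    The effect-size criterion this record reports (Cohen's d ≥ 0.8 read as a large effect) can be illustrated with a minimal sketch. The function below uses the standard pooled-standard-deviation form of d; the score values are hypothetical stand-ins, not data from the study.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Cohen's d with a pooled standard deviation; d >= 0.8 is the
    conventional threshold for a large effect, as the study adopts."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Hypothetical stroke-index scores before and after the acquisition trials
pre = [1.10, 1.25, 0.98, 1.30, 1.12]
post = [1.45, 1.60, 1.38, 1.52, 1.41]
print(f"d = {cohens_d(post, pre):.2f}")
```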

  11. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

Full Text Available Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing, 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and the chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been studied well. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had more severe (p=0.045) and longer history (p=0.008) of dyspnea, more exacerbations in the previous year (p=0.011), and more positive culture of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, higher severity score (p=0.028), higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  12. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    Science.gov (United States)

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

Reports of poor nursing care have focused attention on values-based selection of candidates onto nursing programmes. Values-based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance; it is well conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence, and 3) social connection scores on performance and retention in a cohort of first year nursing and midwifery students in Scotland. A longitudinal, quasi-experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish University. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-short form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish University, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 did not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p<0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. Lower scores on the social connection factor were associated
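
    A minimal sketch of the kind of independent-samples t-test reported above. The group sizes, means and standard deviations are taken from the abstract, but the simulated marks and the normal-approximation confidence interval are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical year marks shaped like the reported groups:
# no caring experience (n=277, ~57.33±11.38) vs caring experience (n=315, ~54.87±11.19)
no_exp = rng.normal(57.33, 11.38, 277)
with_exp = rng.normal(54.87, 11.19, 315)

t, p = stats.ttest_ind(no_exp, with_exp)  # independent-samples t-test
diff = no_exp.mean() - with_exp.mean()
se = np.sqrt(no_exp.var(ddof=1) / len(no_exp) + with_exp.var(ddof=1) / len(with_exp))
ci = (diff - 1.96 * se, diff + 1.96 * se)  # approximate 95% CI for the difference
print(f"t = {t:.2f}, p = {p:.3f}, diff = {diff:.2f}, "
      f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```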

  13. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail, along with the operational optimization of resource usage. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies, and their impact on the delivery of physics results are discussed, and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  14. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail, along with the operational optimization of resource usage. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies, and their impact on the delivery of physics results are discussed, and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  15. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean had antenatal CT pelvimetry for assessment of the pelvis. One hundred and nineteen women did not have CT pelvimetry and served as control. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section. Twenty-three women (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 women (28%) underwent emergency cesarean section after trial of labor. In the group who did not have CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight and Apgar scores between the groups. There was no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)

  16. Long-term effects of previous experience determine nutrient discrimination abilities in birds

    Directory of Open Access Journals (Sweden)

    Spitzer Kathrin

    2008-02-01

Full Text Available Abstract Background Foraging behaviour is an essential ecological process linking different trophic levels. A central assumption of foraging theory is that food selection maximises the fitness of the consumer. It remains unknown, however, whether animals use innate or learned behaviour to discriminate food rewards. While many studies demonstrated that previous experience is a strong determinant of complex food choices such as diet mixing, the response to simple nutritional stimuli, such as sugar concentrations, is often believed to be innate. Results Here we show that previous experience determines the ability to track changes in sugar composition in same-aged individuals of a short-lived migratory songbird, the garden warbler (Sylvia borin). Although birds received identical foods for seven months prior to the experiment, wild-caught birds achieved higher sugar intake rates than hand-raised birds when confronted with alternative, differently coloured, novel food types. Hand-raised and wild birds did not differ in their initial colour selection or overall food intake, but wild birds were quicker to adjust food choice to varying sugar intake. Conclusion Over a period of at least seven months, broader previous experience translates into a higher plasticity of food choice leading to higher nutrient intake. Our results thus highlight the need to address previous long-term experience in foraging experiments. Furthermore, they show that hand-raised animals are often poor surrogates for testing the foraging behaviour of wild animals.

  17. Low-dose computed tomography image restoration using previous normal-dose scan

    Science.gov (United States)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of a significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (or delivering less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and the noise would propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a normal-dose high diagnostic CT image scanned previously may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, the nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further exploits ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties. The gain by the use of
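
    A minimal sketch of the nonlocal-means idea this abstract builds on, with the search window taken over the previous normal-dose scan rather than the low-dose image itself. Parameter names and values are illustrative; this is not the published ndiNLM implementation, which additionally optimizes the nonlocal weights and adaptively estimates the smoothing parameter.

```python
import numpy as np

def prior_guided_nlm(low_dose, prior, patch=3, search=7, h=0.05):
    """Restore a low-dose image by averaging pixels of a previous
    normal-dose scan whose local patches resemble the patch around
    each low-dose pixel (a simplified prior-guided NLM)."""
    pad_p, pad_s = patch // 2, search // 2
    pad = pad_p + pad_s
    lo = np.pad(low_dose, pad, mode="reflect")
    pr = np.pad(prior, pad, mode="reflect")
    out = np.zeros_like(low_dose, dtype=float)
    rows, cols = low_dose.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + pad, j + pad
            ref = lo[ci - pad_p:ci + pad_p + 1, cj - pad_p:cj + pad_p + 1]
            weights, values = [], []
            for di in range(-pad_s, pad_s + 1):
                for dj in range(-pad_s, pad_s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = pr[ni - pad_p:ni + pad_p + 1, nj - pad_p:nj + pad_p + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch dissimilarity
                    weights.append(np.exp(-d2 / h**2))  # h sets smoothing strength
                    values.append(pr[ni, nj])
            out[i, j] = np.dot(weights, values) / np.sum(weights)
    return out
```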

  18. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  19. Impact of vocational interests, previous academic experience, gender and age on Situational Judgement Test performance

    NARCIS (Netherlands)

    Schripsema, Nienke R.; Trigt, van Anke M.; Borleffs, Jan C. C.; Cohen-Schotanus, Janke

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree

  20. Impact of Vocational Interests, Previous Academic Experience, Gender and Age on Situational Judgement Test Performance

    Science.gov (United States)

    Schripsema, Nienke R.; van Trigt, Anke M.; Borleffs, Jan C. C.; Cohen-Schotanus, Janke

    2017-01-01

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree programme in Medicine at University of Groningen, the…

  1. Simulated in vivo Electrophysiology Experiments Provide Previously Inaccessible Insights into Visual Physiology.

    Science.gov (United States)

    Quiroga, Maria; Price, Nicholas SC

    2016-01-01

    Lecture content and practical laboratory classes are ideally complementary. However, the types of experiments that have led to our detailed understanding of sensory neuroscience are often not amenable to classroom experimentation as they require expensive equipment, time-consuming surgeries, specialized experimental techniques, and the use of animals. While sometimes feasible in small group teaching, these experiments are not suitable for large cohorts of students. Previous attempts to expose students to sensory neuroscience experiments include: the use of electrophysiology preparations in invertebrates, data-driven simulations that do not replicate the experience of conducting an experiment, or simply observing an experiment in a research laboratory. We developed an online simulation of a visual neuroscience experiment in which extracellular recordings are made from a motion sensitive neuron. Students have control over stimulation parameters (direction and contrast) and can see and hear the action potential responses to stimuli as they are presented. The simulation provides an intuitive way for students to gain insight into neurophysiology, including experimental design, data collection and data analysis. Our simulation allows large cohorts of students to cost-effectively "experience" the results of animal research without ethical concerns, to be exposed to realistic data variability, and to develop their understanding of how sensory neuroscience experiments are conducted.
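
    A sketch of how a simulation like the one described might generate responses of a motion-sensitive neuron to stimulus direction and contrast. The von Mises tuning curve, Naka-Rushton contrast response, Poisson spiking, and all parameter values are assumptions chosen for illustration, not the model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_rate(direction_deg, contrast, pref_deg=90.0, r_max=50.0,
              kappa=2.0, c50=0.2, baseline=2.0):
    """Hypothetical tuning model: von Mises direction tuning scaled by a
    saturating (Naka-Rushton) contrast response, plus baseline firing (spikes/s)."""
    theta = np.deg2rad(direction_deg - pref_deg)
    tuning = np.exp(kappa * (np.cos(theta) - 1.0))   # equals 1 at preferred direction
    gain = contrast**2 / (contrast**2 + c50**2)      # saturates with contrast
    return baseline + r_max * gain * tuning

def simulate_trial(direction_deg, contrast, duration_s=1.0):
    """Draw a Poisson spike count for one stimulus presentation."""
    return rng.poisson(mean_rate(direction_deg, contrast) * duration_s)

# Example: responses across directions at high and low contrast
for c in (0.8, 0.1):
    counts = [simulate_trial(d, c) for d in range(0, 360, 45)]
    print(f"contrast {c}: {counts}")
```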

  2. Impact of Previous Pharmacy Work Experience on Pharmacy School Academic Performance

    Science.gov (United States)

    Mar, Ellena; T-L Tang, Terrill; Sasaki-Hill, Debra; Kuperberg, James R.; Knapp, Katherine

    2010-01-01

    Objectives To determine whether students' previous pharmacy-related work experience was associated with their pharmacy school performance (academic and clinical). Methods The following measures of student academic performance were examined: pharmacy grade point average (GPA), scores on cumulative high-stakes examinations, and advanced pharmacy practice experience (APPE) grades. The quantity and type of pharmacy-related work experience each student performed prior to matriculation was solicited through a student survey instrument. Survey responses were correlated with academic measures, and demographic-based stratified analyses were conducted. Results No significant difference in academic or clinical performance between those students with prior pharmacy experience and those without was identified. Subanalyses by work setting, position type, and substantial pharmacy work experience did not reveal any association with student performance. A relationship was found, however, between age and work experience, ie, older students tended to have more work experience than younger students. Conclusions Prior pharmacy work experience did not affect students' overall academic or clinical performance in pharmacy school. The lack of significant findings may have been due to the inherent practice limitations of nonpharmacist positions, changes in pharmacy education, and the limitations of survey responses. PMID:20498735

  3. The relationship between emotional intelligence, previous caring experience and mindfulness in student nurses and midwives: a cross sectional analysis.

    Science.gov (United States)

    Snowden, Austyn; Stenhouse, Rosie; Young, Jenny; Carver, Hannah; Carver, Fiona; Brown, Norrie

    2015-01-01

Emotional Intelligence (EI), previous caring experience and mindfulness training may have a positive impact on nurse education. More evidence is needed to support the use of these variables in nurse recruitment and retention. To explore the relationship between EI, gender, age, programme of study, previous caring experience and mindfulness training. Cross-sectional element of a longitudinal study. 938 year-one nursing, midwifery and computing students at two Scottish Higher Education Institutes (HEIs) who entered their programme in September 2013. Participants completed a measure of 'trait' EI: Trait Emotional Intelligence Questionnaire Short Form (TEIQue-SF); and 'ability' EI: Schutte et al.'s (1998) Emotional Intelligence Scale (SEIS). Demographics, previous caring experience and previous training in mindfulness were recorded. Relationships between variables were tested using non-parametric tests. Emotional intelligence increased with age on both measures of EI [TEIQ-SF H(5)=15.157, p=0.001; SEIS H(5)=11.388, p=0.044]. Females (n=786) scored higher than males (n=149) on both measures [TEIQ-SF, U=44,931, z=-4.509, p<0.001]. Previous caring experience was associated with higher 'ability' emotional intelligence. Mindfulness training was associated with higher 'ability' emotional intelligence. Implications for recruitment, retention and further research are explored. Copyright © 2014. Published by Elsevier Ltd.

  4. Stress and blood donation: effects of music and previous donation experience.

    Science.gov (United States)

    Ferguson, E; Singh, A P; Cunningham-Snell, N

    1997-05-01

    Making a blood donation, especially for first-time donors, can be a stressful experience. These feelings of stress may inhibit donors from returning. This paper applies stress theory to this particular problem. The effects of a stress management intervention (the provision of music) and previous donor experience were examined in relation to pre- and post-donation mood, environmental appraisals and coping behaviour. Results indicated that the provision of music had detrimental effects on environmental appraisals for those who have donated up to two times previously, but beneficial effects for those who had donated three times before. These effects were, to an extent, moderated by coping processes but not perceived control. It is recommended that the provision of music is not used as a stress management technique in the context of blood donation.

  5. Important biological information uncovered in previously unaligned reads from chromatin immunoprecipitation experiments (ChIP-Seq)

    Science.gov (United States)

    Ouma, Wilberforce Zachary; Mejia-Guerra, Maria Katherine; Yilmaz, Alper; Pareja-Tobes, Pablo; Li, Wei; Doseff, Andrea I.; Grotewold, Erich

    2015-01-01

    Establishing the architecture of gene regulatory networks (GRNs) relies on chromatin immunoprecipitation followed by massively parallel sequencing (ChIP-Seq) methods that provide genome-wide transcription factor binding sites (TFBSs). ChIP-Seq furnishes millions of short reads that, after alignment, describe the genome-wide binding sites of a particular TF. However, in all organisms investigated an average of 40% of reads fail to align to the corresponding genome, with some datasets having as much as 80% of reads failing to align. We describe here the provenance of previously unaligned reads in ChIP-Seq experiments from animals and plants. We show that a substantial portion corresponds to sequences of bacterial and metazoan origin, irrespective of the ChIP-Seq chromatin source. Unforeseen was the finding that 30%–40% of unaligned reads were actually alignable. To validate these observations, we investigated the characteristics of the previously unaligned reads corresponding to TAL1, a human TF involved in lineage specification of hemopoietic cells. We show that, while unmapped ChIP-Seq read datasets contain foreign DNA sequences, additional TFBSs can be identified from the previously unaligned ChIP-Seq reads. Our results indicate that the re-evaluation of previously unaligned reads from ChIP-Seq experiments will significantly contribute to TF target identification and determination of emerging properties of GRNs. PMID:25727450
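
    A minimal sketch of how unaligned reads can be pulled out of an alignment file for the kind of re-analysis described above. It assumes the pysam library is available; the BAM file name is hypothetical.

```python
import pysam  # assumes pysam is installed; the BAM file name below is hypothetical

def unaligned_fraction(bam_path, fasta_out="unaligned.fa"):
    """Count reads flagged as unmapped (SAM flag 0x4) and write their
    sequences to FASTA for re-alignment or contamination screening."""
    total = unmapped = 0
    with pysam.AlignmentFile(bam_path, "rb") as bam, open(fasta_out, "w") as out:
        for read in bam.fetch(until_eof=True):
            total += 1
            if read.is_unmapped:
                unmapped += 1
                seq = read.query_sequence or ""
                out.write(f">{read.query_name}\n{seq}\n")
    return unmapped / total if total else 0.0

print(unaligned_fraction("chipseq_sample.bam"))
```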

  6. Impact of vocational interests, previous academic experience, gender and age on Situational Judgement Test performance.

    Science.gov (United States)

    Schripsema, Nienke R; van Trigt, Anke M; Borleffs, Jan C C; Cohen-Schotanus, Janke

    2017-05-01

Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree programme in Medicine at University of Groningen, the Netherlands. All applicants for the academic year 2015-2016 were included and had to choose between learning communities Global Health (n = 126), Sustainable Care (n = 149), Intramural Care (n = 225), or Molecular Medicine (n = 116). This choice was used as a proxy for vocational interest. In addition, all graduate-entry applicants for academic year 2015-2016 (n = 213) were included to examine the effect of previous academic experience on performance. We used MANCOVA analyses with Bonferroni post hoc multiple comparisons tests for applicant performance on a six-scenario SJT. The MANCOVA analyses showed that for all scenarios, the independent variables were significantly related to performance (Pillai's Trace: 0.02-0.47). Vocational interest was related to performance on three scenarios, and previous academic experience to performance on two. Gender and age were related to performance on SJT scenarios in different settings. Especially the first effect might be helpful in selecting appropriate candidates for areas of health care in which more professionals are needed.
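
    A minimal sketch of a MANCOVA-style multivariate test that reports Pillai's Trace, using statsmodels. This is not the authors' analysis; the data frame, group labels, covariate, and scenario scores are hypothetical stand-ins for the applicants' data.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical data: two SJT scenario scores plus group membership and a covariate
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "group": rng.choice(["GlobalHealth", "SustainableCare", "IntramuralCare"], n),
    "age": rng.integers(17, 30, n),
    "s1": rng.normal(50, 10, n),
    "s2": rng.normal(50, 10, n),
})

# mv_test() reports Pillai's Trace (among other multivariate statistics)
fit = MANOVA.from_formula("s1 + s2 ~ group + age", data=df)
print(fit.mv_test())
```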

  7. Computing challenges of the CMS experiment

    Science.gov (United States)

    Krammer, N.; Liko, D.

    2017-06-01

The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. For the increased data rate of future LHC operation, together with high pileup interactions, improvements in the usage of the current computing facilities and new technologies have become necessary. Especially for the challenge of the future HL-LHC, a more flexible and sophisticated computing model is needed. In this presentation, I will discuss the current computing system used in LHC Run 2 and future computing facilities for the HL-LHC runs using flexible computing technologies such as commercial and academic computing clouds. The cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  8. Gender, Computer Experience and Computer-Based Problem Solving.

    Science.gov (United States)

    Joiner, Richard; And Others

    1996-01-01

    Reports the results of a study of 65 United Kingdom primary school children that examined the effect of software type by comparing children's performance on a male stereotyped version of the software with a female stereotyped version. Topics include computer attitudes, computer experience, and software preferences. (Author/LRW)

  9. "My math and me": Nursing students' previous experiences in learning mathematics.

    Science.gov (United States)

    Røykenes, Kari

    2016-01-01

In this paper, 11 narratives written by nursing students about their former experiences of learning mathematics are thematically analyzed. Most students had a positive relationship with the subject in primary school, when they found mathematics fun and were able to master the subject. For some, a change occurred in the transition to lower secondary school. The reasons for this change were found in the subject (increased difficulty), the teachers (movement of teachers, numerous substitute teachers), the class environment and size (many pupils, noise), and the students themselves (silent and anonymous pupils). This change was also found in the transition from lower to upper secondary school. By contrast, some students had experienced changes that were positive, and their mathematics teacher was a significant factor in this positive change. The paper emphasizes the importance of nursing students' previous experiences of learning mathematics when they learn drug calculation. Copyright © 2015. Published by Elsevier Ltd.

  10. Differences between previously married and never married 'gay' men: family background, childhood experiences and current attitudes.

    Science.gov (United States)

    Higgins, Daryl J

    2004-01-01

Despite a large body of literature on the development of sexual orientation, little is known about why some gay men have been (or remain) married to a woman. In the current study, a self-selected sample of 43 never-married gay men ('never married') and 26 gay men who were married to a woman ('previously married') completed a self-report questionnaire. Hypotheses were based on five possible explanations for gay men's marriages: (a) differences in sexual orientation (i.e., bisexuality); (b) internalized homophobia; (c) religious intolerance; (d) confusion created because of childhood/adolescent sexual experiences; and/or (e) poor psychological adjustment. Previously married men described their families' religious beliefs as more fundamentalist than never-married men did. No differences were found between previously married and never-married men's ratings of their sexual orientation and identity, or their levels of homophobia and self-depreciation. Family adaptability, family cohesion and the degree to which respondents reported having experienced child maltreatment did not distinguish between previously married and never-married men. The results highlight how little is understood of the reasons why gay men marry, and the need to develop an adequate theoretical model.

  11. Democratizing Children's Computation: Learning Computational Science as Aesthetic Experience

    Science.gov (United States)

    Farris, Amy Voss; Sengupta, Pratim

    2016-01-01

    In this essay, Amy Voss Farris and Pratim Sengupta argue that a democratic approach to children's computing education in a science class must focus on the "aesthetics" of children's experience. In "Democracy and Education," Dewey links "democracy" with a distinctive understanding of "experience." For Dewey,…

  12. Reciprocity, culture and human cooperation: previous insights and a new cross-cultural experiment.

    Science.gov (United States)

    Gächter, Simon; Herrmann, Benedikt

    2009-03-27

Understanding the proximate and ultimate sources of human cooperation is a fundamental issue in all behavioural sciences. In this paper, we review the experimental evidence on how people solve cooperation problems. Existing studies show without doubt that direct and indirect reciprocity are important determinants of successful cooperation. We also discuss the insights from a large literature on the role of peer punishment in sustaining cooperation. The experiments demonstrate that many people are 'strong reciprocators' who are willing to cooperate and punish others even if there are no gains from future cooperation or any other reputational gains. We document this in new one-shot experiments, which we conducted in four cities in Russia and Switzerland. Our cross-cultural approach furthermore allows us to investigate how cultural background influences strong reciprocity. Our results show that culture has a strong influence on positive and, especially, on strong negative reciprocity. In particular, we find large cross-cultural differences in 'antisocial punishment' of pro-social cooperators. Further cross-cultural research and experiments involving different socio-demographic groups document that antisocial punishment is much more widespread than previously assumed. Understanding antisocial punishment is an important task for future research because antisocial punishment is a strong inhibitor of cooperation.

  13. The relationship of previous training and experience of journal peer reviewers to subsequent review quality.

    Directory of Open Access Journals (Sweden)

    Michael L Callaham

    2007-01-01

Full Text Available BACKGROUND: Peer review is considered crucial to the selection and publication of quality science, but very little is known about the previous experiences and training that might identify high-quality peer reviewers. The reviewer selection processes of most journals, and thus the qualifications of their reviewers, are ill defined. More objective selection of peer reviewers might improve the journal peer review process and thus the quality of published science. METHODS AND FINDINGS: 306 experienced reviewers (71% of all those associated with a specialty journal) completed a survey of past training and experiences postulated to improve peer review skills. Reviewers performed 2,856 reviews of 1,484 separate manuscripts during a four-year study period, all prospectively rated on a standardized quality scale by editors. Multivariable analysis revealed that most variables, including academic rank, formal training in critical appraisal or statistics, or status as principal investigator of a grant, failed to predict performance of higher-quality reviews. The only significant predictors of quality were working in a university-operated hospital versus other teaching environment and relative youth (under ten years of experience after finishing training). Being on an editorial board and doing formal grant (study section) review were each predictors for only one of our two comparisons. However, the predictive power of all variables was weak. CONCLUSIONS: Our study confirms that there are no easily identifiable types of formal training or experience that predict reviewer performance. Skill in scientific peer review may be as ill defined and hard to impart as is "common sense." Without a better understanding of those skills, it seems unlikely journals and editors will be successful in systematically improving their selection of reviewers. This inability to predict performance makes it imperative that all but the smallest journals implement routine review ratings

  14. Sexual Liberalism-Conservatism: the effect of human values, gender, and previous sexual experience.

    Science.gov (United States)

    Guerra, Valeschka M; Gouveia, Valdiney V; Sousa, Deliane M; Lima, Tiago J; Freires, Leogildo A

    2012-08-01

    Despite theoretical associations, there is a lack of empirical studies on the axiological basis of sexual liberalism-conservatism. Two studies demonstrated important associations between these constructs for young adults. In Study 1, participants were 353 undergraduate students with a mean age of 20.13 (SD = 1.84), who completed the Sexual Liberalism-Conservatism Scale and the Basic Values Survey. In Study 2, participants were 269 undergraduate students, with a mean age of 20.3 (SD = 1.82), who completed a social desirability scale in addition to Study 1 instruments. Results showed how values can predict sexual liberalism-conservatism after controlling for social desirability. Attitudes towards one's own sexual behavior were more conservative whereas attitudes towards other's sexual behavior were more liberal. Gender was not a significant predictor of sexual attitudes whereas previous sexual experience showed a significant association to this construct. In general, results corroborated previous findings, showing that participants with a tendency to present socially desirable answers also tended to present themselves as sexually conservative.

  15. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...

  16. Computing experiments on stellar systems

    CERN Document Server

    Bouvier, P

    1972-01-01

A stellar system is usually conceived, to a first approximation, as a group of point-like stars held together by their mutual gravitational attraction. One may distinguish three or four different lines of attack on the problem of the dynamical evolution of such a system: straightforward integration of the n-body problem, the statistical model description, the Monte Carlo technique, and the Boltzmann moment approach. Direct numerical integration can now be applied to the dynamical evolution of star clusters containing up to 500 stars, which includes small to medium open stellar clusters, while statistical and Monte Carlo descriptions are better suited for systems of at least several thousand stars. The overall dynamical evolution of an isolated star cluster is characterized by the formation of a dense core surrounded by an extended halo, with some stars escaping with positive energy. This general feature has been confirmed in all the numerical experiments carried out in the last ten y...
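
    A minimal sketch of the first line of attack, direct n-body integration, using a kick-drift-kick leapfrog scheme with gravitational softening. The units (G = 1), step size, and initial conditions are illustrative assumptions, not those of the cited experiments.

```python
import numpy as np

def leapfrog_nbody(pos, vel, mass, dt, steps, eps=1e-3):
    """Direct n-body integration (kick-drift-kick leapfrog), G = 1 units.
    eps is a softening length that avoids singular close encounters."""
    def accel(p):
        d = p[None, :, :] - p[:, None, :]          # d[i, j] = p_j - p_i
        r2 = (d ** 2).sum(-1) + eps**2
        np.fill_diagonal(r2, np.inf)               # no self-force
        return (d * (mass[None, :, None] / r2[:, :, None] ** 1.5)).sum(axis=1)

    a = accel(pos)
    for _ in range(steps):
        vel += 0.5 * dt * a                        # half kick
        pos += dt * vel                            # drift
        a = accel(pos)
        vel += 0.5 * dt * a                        # half kick
    return pos, vel

# Example: 100 equal-mass stars with random initial conditions
rng = np.random.default_rng(0)
n = 100
pos = rng.normal(size=(n, 3))
vel = 0.1 * rng.normal(size=(n, 3))
mass = np.full(n, 1.0 / n)
pos, vel = leapfrog_nbody(pos, vel, mass, dt=1e-3, steps=1000)
```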

  17. The Computer Game as a Somatic Experience

    DEFF Research Database (Denmark)

    Nielsen, Henrik Smed

    2010-01-01

    This article describes the experience of playing computer games. With a media archaeological outset the relation between human and machine is emphasised as the key to understand the experience. This relation is further explored by drawing on a phenomenological philosophy of technology which sketc...

  18. Computing and data handling recent experiences at Fermilab and SLAC

    International Nuclear Information System (INIS)

    Cooper, P.S.

    1990-01-01

Computing has become ever more central to the doing of high energy physics. There are now major second and third generation experiments for which the largest single cost is computing. At the same time the availability of "cheap" computing has made possible experiments which were previously considered infeasible. The result of this trend has been an explosion of computing and computing needs. I will review here the magnitude of the problem, as seen at Fermilab and SLAC, and the present methods for dealing with it. I will then undertake the dangerous assignment of projecting the needs and solutions forthcoming in the next few years at both laboratories. I will concentrate on the "offline" problem: the process of turning terabytes of data tapes into pages of physics journals. 5 refs., 4 figs., 4 tabs

  19. Decomposing experience-driven attention: opposite attentional effects of previously predictive cues

    Science.gov (United States)

    Lin, Zhicheng; Lu, Zhong-Lin; He, Sheng

    2016-01-01

A central function of the brain is to track the dynamic statistical regularities in the environment—such as what predicts what over time. How does this statistical learning process alter sensory and attentional processes? Drawing upon animal conditioning and predictive coding, we developed a learning procedure that revealed two distinct components through which prior learning experience controls attention. During learning, a visual search task was used in which the target randomly appeared at one of several locations but always inside an enclosure of a particular color—the learned color served to direct attention to the target location. During test, the color no longer predicted the target location. When the same search task was used in the subsequent test, we found that the learned color continued to attract attention despite the behavior being counterproductive for the task and despite the presence of a completely predictive cue. However, when tested with a flanker task that had minimal location uncertainty—the target appeared at fixation surrounded by a distractor—participants were better at ignoring distractors in the learned color than in other colors. Evidently, previously predictive cues capture attention in the same search task but can be better suppressed in a flanker task. These results demonstrate opposing components—capture and inhibition—in experience-driven attention, with their manifestations crucially dependent on task context. We conclude that associative learning enhances context-sensitive top-down modulation while reducing bottom-up sensory drive and facilitating suppression, supporting a learning-based predictive coding account. PMID:27068051

  20. Do previous sports experiences influence the effect of an enrichment programme in basketball skills?

    Science.gov (United States)

    Santos, Sara; Mateus, Nuno; Sampaio, Jaime; Leite, Nuno

    2017-09-01

The aim of this study was to examine the effect of an enrichment programme on motor, technical and tactical basketball skills, when accounting for the age of youth sport specialisation. Seventy-six college students (age: M = 20.4, SD = 1.9) were allocated to three different paths: (i) non-structured (n = 14), (ii) early specialisation (n = 34), and (iii) late specialisation (n = 28), according to information previously provided by the participants about the quantity and type of sporting activities performed throughout their sporting careers. The participants in each path were then randomly distributed across control and experimental groups. Variables under study included agility, a technical skills circuit, and tactical actions performed in a 4-on-4 full-court basketball game. The results indicated improvements in the early and late specialisation paths, namely in the experimental training groups. However, the late specialisation path revealed larger benefits, in contrast with the non-structured path, which showed less sensitivity to the enrichment programme, which was mostly grounded in physical literacy and differential learning. Higher improvements were observed in agility, and also in reducing the number of unsuccessful actions performed during the game. Overall, this study provided evidence of how early sports experiences affect basketball skill acquisition and the capacity to adapt to new contexts with motor and technical-tactical challenges. In addition, a path supported by late specialisation might present several advantages in sport performance achievement.

  1. TU-CD-BRD-01: Making Incident Learning Practical and Useful: Challenges and Previous Experiences

    International Nuclear Information System (INIS)

    Ezzell, G.

    2015-01-01

It has long been standard practice in radiation oncology to report internally when a patient’s treatment has not gone as planned and to report events to regulatory agencies when legally required. Most potential errors are caught early and never affect the patient. Quality assurance steps routinely prevent errors from reaching the patient, and these “near misses” are much more frequent than treatment errors. A growing number of radiation oncology facilities have implemented incident learning systems to report and analyze both errors and near misses. Using the term “incident learning” instead of “event reporting” emphasizes the need to use these experiences to change the practice and make future errors less likely, and promotes an educational, non-punitive environment. There are challenges in making such a system practical and effective. Speakers from institutions of different sizes and practice environments will share their experiences on how to make such a system work and what benefits their clinics have accrued. Questions that will be addressed include: how to create a system that is easy for front-line staff to access; how to motivate staff to report; how to promote the system as positive and educational, not punitive or demeaning; how to organize the team for reviewing and responding to reports; how to prioritize which reports to discuss in depth, and how not to dismiss the rest; how to identify underlying causes; how to design corrective actions and implement change; how to develop useful statistics and analysis tools; how to coordinate a departmental system with a larger risk management system; and how to do this without a dedicated quality manager. Some speakers’ experience is with in-house systems and some will share experience with the AAPM/ASTRO national Radiation Oncology Incident Learning System (RO-ILS). Reports intended to be of value nationally need to be comprehensible to outsiders; examples of useful reports will be shown. There will be ample time set

  2. Computer Controlled Photometer/Planck Curve Experiment

    Science.gov (United States)

    Dupuy, David L.; Peters, Philip B.

    2001-11-01

    Developed as a demo in our course for computer control of laboratory experiments, this experiment had two goals: to attempt to measure the output of a tungsten bulb over a wide range of wavelengths, and to test the use of LabVIEW as a programming language for teaching experiment control. The brightness readings were corrected for instrumental effects and fitted with a Planck curve. The experiment involved digital input, digital output to a microstepper controller to move the filter wheel, and analog input. Results will be shown for the Planck curve and the LabVIEW program.
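
    A sketch of the fitting step described above: corrected brightness readings fitted with a Planck curve, here via scipy's curve_fit. The wavelengths, synthetic readings, and the overall scale parameter absorbing detector gain and geometry are assumptions for illustration, not the course's data or code (which used LabVIEW).

```python
import numpy as np
from scipy.optimize import curve_fit

# Physical constants (SI)
h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck(wl, T, scale):
    """Planck spectral radiance vs wavelength (m); the scale factor
    absorbs detector gain and geometry."""
    return scale * (2 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k * T))

# Hypothetical corrected photometer readings at several wavelengths (nm -> m)
wl = np.array([450, 500, 550, 600, 650, 700, 750]) * 1e-9
noise = 1 + 0.03 * np.random.default_rng(0).normal(size=wl.size)
readings = planck(wl, 2800, 1e-12) * noise

(T_fit, s_fit), _ = curve_fit(planck, wl, readings, p0=(3000, 1e-12))
print(f"fitted filament temperature: {T_fit:.0f} K")
```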

  3. Previous experience of family violence and intimate partner violence in pregnancy

    Directory of Open Access Journals (Sweden)

    Ana Bernarda Ludermir

    2017-09-01

Full Text Available ABSTRACT OBJECTIVE To estimate differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. METHODS A nested case-control study was carried out within a cohort study with 1,120 pregnant women aged 18–49 years old, who were registered in the Family Health Strategy of the city of Recife, State of Pernambuco, Brazil, between 2005 and 2006. The cases were the 233 women who reported intimate partner violence in pregnancy and the controls were the 499 women who did not report it. Partner violence in pregnancy and previous experiences of violence committed by parents or other family members were assessed with a standardized questionnaire. Multivariate logistic regression analyses were modeled to identify differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. RESULTS Having seen the mother suffer intimate partner violence was associated with physical violence in childhood (OR = 2.62; 95%CI 1.89–3.63) and in adolescence (OR = 1.47; 95%CI 1.01–2.13), sexual violence in childhood (OR = 3.28; 95%CI 1.68–6.38) and intimate partner violence during pregnancy (OR = 1.47; 95%CI 1.01–2.12). The intimate partner violence during pregnancy was frequent in women who reported more episodes of physical violence in childhood (OR = 2.08; 95%CI 1.43–3.02) and adolescence (OR = 1.63; 95%CI 1.07–2.47), who suffered sexual violence in childhood (OR = 3.92; 95%CI 1.86–8.27), and who perpetrated violence against the partner (OR = 8.67; 95%CI 4.57–16.45). CONCLUSIONS Experiences of violence committed by parents or other family members emerge as strong risk factors for intimate partner violence in pregnancy. Identifying and understanding protective and risk factors for the emergence of intimate partner violence in pregnancy and its maintenance may help

  4. Previous experience of family violence and intimate partner violence in pregnancy.

    Science.gov (United States)

    Ludermir, Ana Bernarda; Araújo, Thália Velho Barreto de; Valongueiro, Sandra Alves; Muniz, Maria Luísa Corrêa; Silva, Elisabete Pereira

    2017-01-01

    To estimate differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. A nested case-control study was carried out within a cohort study with 1,120 pregnant women aged 18-49 years old, who were registered in the Family Health Strategy of the city of Recife, State of Pernambuco, Brazil, between 2005 and 2006. The cases were the 233 women who reported intimate partner violence in pregnancy and the controls were the 499 women who did not report it. Partner violence in pregnancy and previous experiences of violence committed by parents or other family members were assessed with a standardized questionnaire. Multivariate logistic regression analyses were modeled to identify differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. Having seen the mother suffer intimate partner violence was associated with physical violence in childhood (OR = 2.62; 95%CI 1.89-3.63) and in adolescence (OR = 1.47; 95%CI 1.01-2.13), sexual violence in childhood (OR = 3.28; 95%CI 1.68-6.38) and intimate partner violence during pregnancy (OR = 1.47; 95% CI 1.01 - 2.12). The intimate partner violence during pregnancy was frequent in women who reported more episodes of physical violence in childhood (OR = 2.08; 95%CI 1.43-3.02) and adolescence (OR = 1.63; 95%CI 1.07-2.47), who suffered sexual violence in childhood (OR = 3.92; 95%CI 1.86-8.27), and who perpetrated violence against the partner (OR = 8.67; 95%CI 4.57-16.45). Experiences of violence committed by parents or other family members emerge as strong risk factors for intimate partner violence in pregnancy. Identifying and understanding protective and risk factors for the emergence of intimate partner violence in pregnancy and its maintenance may help policymakers and health service managers to develop intervention strategies.
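
    For readers unfamiliar with how the odds ratios and confidence intervals above arise, the sketch below shows the standard computation (OR = exp(beta), with the CI exponentiated from the coefficient CI) using statsmodels on synthetic data; the exposures, prevalences, and effect sizes are invented for illustration and are not the study's dataset:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 732  # cases + controls, matching the study's sample size
    # Hypothetical binary exposures: physical violence in childhood,
    # in adolescence, and sexual violence in childhood
    X = rng.binomial(1, [0.30, 0.25, 0.10], size=(n, 3))
    logit = -1.0 + 0.73 * X[:, 0] + 0.49 * X[:, 1] + 1.37 * X[:, 2]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    print(np.exp(fit.params))      # odds ratios: OR = exp(beta)
    print(np.exp(fit.conf_int()))  # 95% CIs on the OR scale
    ```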

  5. The Impact of an International Cultural Experience on Previously Held Stereotypes by American Student Nurses.

    Science.gov (United States)

    Heuer, Loretta; Bengiamin, Marlene; Downey, Vicki Wessman

    2001-01-01

    Examined stereotypes held by U.S. student nurses before and after participating in an educational experience in Russia. The experience was intended to prepare them to be effective nurses in multicultural health care settings. Data from student interviews indicated that the experience changed students' stereotyped attitudes about Russian culture…

  6. RC Circuits: Some Computer-Interfaced Experiments.

    Science.gov (United States)

    Jolly, Pratibha; Verma, Mallika

    1994-01-01

    Describes a simple computer-interface experiment for recording the response of an RC network to an arbitrary input excitation. The setup is used to pose a variety of open-ended investigations in network modeling by varying the initial conditions, input signal waveform, and the circuit topology. (DDR)
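
    The response measurement this record describes is easy to prototype in software before wiring the interface. A minimal sketch with SciPy's linear-system tools follows; the component values and input waveform are arbitrary choices, not those of the article:

    ```python
    import numpy as np
    from scipy import signal

    R, C = 10e3, 100e-9                            # 10 kOhm, 100 nF -> tau = 1 ms
    rc = signal.TransferFunction([1], [R * C, 1])  # low-pass RC: H(s) = 1/(RCs + 1)

    t = np.linspace(0, 10e-3, 1000)
    u = signal.square(2 * np.pi * 200 * t)         # arbitrary input: 200 Hz square wave
    _, v_c, _ = signal.lsim(rc, U=u, T=t)          # capacitor voltage response
    print(v_c[-5:])
    ```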

  7. Volunteer computing experience with ATLAS@Home

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00068610; The ATLAS collaboration; Bianchi, Riccardo-Maria; Cameron, David; Filipčič, Andrej; Lançon, Eric; Wu, Wenjing

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  8. Volunteer Computing Experience with ATLAS@Home

    CERN Document Server

    Cameron, David; The ATLAS collaboration; Bourdarios, Claire; Lançon, Eric

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers' resources make up a sizable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one job to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  9. Volunteer Computing Experience with ATLAS@Home

    Science.gov (United States)

    Adam-Bourdarios, C.; Bianchi, R.; Cameron, D.; Filipčič, A.; Isacchini, G.; Lançon, E.; Wu, W.; ATLAS Collaboration

    2017-10-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  10. Computational Experiments for Science and Engineering Education

    Science.gov (United States)

    Xie, Charles

    2011-01-01

    How to integrate simulation-based engineering and science (SBES) into the science curriculum smoothly is a challenging question. For the importance of SBES to be appreciated, the core value of simulations, that they help people understand natural phenomena and solve engineering problems, must be taught. A strategy to achieve this goal is to introduce computational experiments to the science curriculum to replace or supplement textbook illustrations and exercises and to complement or frame hands-on or wet lab experiments. In this way, students will have an opportunity to learn about SBES without compromising other learning goals required by the standards, and teachers will welcome these tools as they strengthen what they are already teaching. This paper demonstrates this idea using a number of examples in physics, chemistry, and engineering. These exemplary computational experiments show that it is possible to create a curriculum that is both deeper and wider.

  11. Previous Experiences with Epilepsy and Effectiveness of Information to Change Public Perception of Epilepsy

    NARCIS (Netherlands)

    Gutteling, Jan M.; Seydel, E.R.; Wiegman, O.

    1986-01-01

    Differences with regard to the effectiveness of health information and attitude change are suggested between people with direct, behavioral experiences with a health topic and people with indirect, nonbehavioral experiences. The effects of three different methods of health education about epilepsy,

  12. Study of some physical aspects previous to design of an exponential experiment

    International Nuclear Information System (INIS)

    Caro, R.; Francisco, J. L. de

    1961-01-01

    This report presents the theoretical study of some physical aspects prior to the design of an exponential facility. These are: fast and slow flux distribution in the multiplicative medium and in the thermal column; slowing down in the thermal column; geometrical distribution and minimum needed intensity of sources; and access channels and the perturbations produced by possible variations in their position and intensity. (Author) 4 refs

  13. Bevacizumab plus chemotherapy in elderly patients with previously untreated metastatic colorectal cancer: single center experience

    Directory of Open Access Journals (Sweden)

    Ocvirk Janja

    2016-06-01

    Full Text Available Metastatic colorectal cancer (mCRC) is mainly a disease of the elderly; however, the geriatric population is underrepresented in clinical trials. Patient registries represent a tool to assess and follow treatment outcomes in this patient population. The aim of the study was to use a patient registry to determine the safety and efficacy of bevacizumab plus chemotherapy in elderly patients with previously untreated metastatic colorectal cancer.

  14. The Impact of Previous Online Course Experience RN Students' Perceptions of Quality

    Science.gov (United States)

    Hixon, Emily; Barczyk, Casimir; Ralston-Berg, Penny; Buckenmeyer, Janet

    2016-01-01

    The purpose of this paper is to explore whether experienced online students (who have completed seven or more online courses) perceive the quality of their courses differently than novice online students (who have completed three or fewer online courses) or students with an intermediate level of online course experience (those who have completed…

  15. Is the ability to perform transurethral resection of the prostate influenced by the surgeon's previous experience?

    Directory of Open Access Journals (Sweden)

    José Cury

    2008-01-01

    Full Text Available PURPOSE: To evaluate the influence of the urologist's experience on the surgical results and complications of transurethral resection of the prostate (TURP). PATIENTS AND METHODS: Sixty-seven patients undergoing transurethral resection of the prostate without the use of a video camera were randomly allocated into three groups according to the urologist's experience: a urologist having done 25 transurethral resections of the prostate (Group I - 24 patients); a urologist having done 50 transurethral resections of the prostate (Group II - 24 patients); a senior urologist with vast transurethral resection of the prostate experience (Group III - 19 patients). The following were recorded: the weight of resected tissue, the duration of the resection procedure, the volume of irrigation used, the amount of irrigation absorbed and the hemoglobin and sodium levels in the serum during the procedure. RESULTS: There were no differences between the groups in the amount of irrigation fluid used per operation, the amount of irrigation fluid absorbed or hematocrit and hemoglobin variation during the procedure. The weight of resected tissue per minute was approximately four times higher in group III than in groups I and II. The mean absorbed irrigation fluid was similar between the groups, with no statistical difference between them (p=0.24). Four patients (6%) presented with TUR syndrome, without a significant difference between the groups. CONCLUSION: The senior urologist was capable of resecting four times more tissue per time unit than the more inexperienced surgeons. Therefore, a surgeon's experience may be important to reduce the risk of secondary TURP due to recurring adenomas or adenomas that were incompletely resected. However, the incidence of complications was the same between the three groups.

  16. Computed tomography in the evaluation of abdominal fat distribution associated with a hyperlipidic diet in previously undernourished rats

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Carlos Alberto Soares da [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Faculdade de Ciencias Medicas. Program of Post-graduation in Clinical and Experimental Physiopathology; Alves, Erika Gomes; Gonzalez, Gabriele Paula; Barbosa, Thais Barcellos Cortez; Lima, Veronica Demarco; Nascimento, Renata; Moura, Egberto Gaspar de; Saba, Celly Cristina Alves do Nascimento [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Inst. de Biologia Roberto Alcantara Gomes. Dept. of Physiological Sciences]. E-mail: cellysaba@terra.com.br; Monteiro, Alexandra Maria Vieira [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Faculdade de Ciencias Medicas

    2007-09-15

    Objective: To study, by means of computed tomography, the repercussion of post-weaning dietary supplementation with soy oil or canola oil on the abdominal fat distribution in previously undernourished rats. Materials and methods: Dams were submitted to 50% food restriction (FR) and compared with dams receiving a standard diet (C). After weaning, undernourished rats received a diet supplemented with 19% soy oil (19% FR-soy) or 19% canola oil (19% FR-canola). Rats in the control group received a diet with 7% soy oil (7% C-soy) until the end of the experimental period. At the age of 60 days, the rats were submitted to computed tomography for evaluation of total abdominal and visceral fat areas. The rats' length and body mass were evaluated and, after sacrifice, the abdominal fat depots were excised and weighed. The data are reported as mean ± standard error of the mean, with p < 0.05 considered as the significance level. Results: Rats in the 19% FR groups presented similar length, body weight and visceral fat mass. Overall, the evaluations showed significantly lower results in relation to the control group (7% C-soy). However, computed tomography found significant differences in abdominal fat distribution between the 19% FR-soy and 19% FR-canola groups. Conclusion: Computed tomography demonstrated that the abdominal fat distribution may be dependent on the type of vegetable oil included in the diet. (author)

  17. Computer models experiences in radiological safety

    International Nuclear Information System (INIS)

    Ferreri, J.C.; Grandi, G.M.; Ventura, M.A.; Doval, A.S.

    1989-01-01

    A review of the formulation and use of numerical methods in fluid dynamics and heat and mass transfer in nuclear safety is presented. A wide range of applications is covered, namely: nuclear reactor thermohydraulics, natural circulation in closed loops, experiments for the validation of numerical methods, thermohydraulics of fractured-porous media and radionuclide migration. The accumulated experience has led to a research line dealing at present with moving grids in computational fluid dynamics and the use of artificial intelligence techniques. As a consequence, some recent experience in the development of expert systems, and the considerations that should be taken into account for their use in radiological safety, is also reviewed. (author)

  18. Performing quantum computing experiments in the cloud

    Science.gov (United States)

    Devitt, Simon J.

    2016-09-01

    Quantum computing technology has reached a second renaissance in the past five years. Increased interest from both the private and public sector combined with extraordinary theoretical and experimental progress has solidified this technology as a major advancement in the 21st century. As anticipated by many, some of the first realizations of quantum computing technology have occurred over the cloud, with users logging onto dedicated hardware over the classical internet. Recently, IBM has released the Quantum Experience, which allows users to access a five-qubit quantum processor. In this paper we take advantage of this online availability of actual quantum hardware and present four quantum information experiments. We utilize the IBM chip to realize protocols in quantum error correction, quantum arithmetic, quantum graph theory, and fault-tolerant quantum computation by accessing the device remotely through the cloud. While the results are subject to significant noise, the correct results are returned from the chip. This demonstrates the power of experimental groups opening up their technology to a wider audience and will hopefully allow for the next stage of development in quantum information technology.
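
    As a flavour of what such cloud experiments look like, here is a minimal Bell-state circuit in Qiskit run on a local simulator. This assumes the qiskit and qiskit-aer packages; submitting the same circuit to real IBM hardware additionally requires an account and the IBM provider, which is omitted here:

    ```python
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                      # Hadamard on qubit 0
    qc.cx(0, 1)                  # entangle qubits 0 and 1
    qc.measure([0, 1], [0, 1])

    sim = AerSimulator()
    counts = sim.run(qc, shots=1024).result().get_counts()
    print(counts)                # ideally only '00' and '11'; hardware adds noise
    ```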

  19. [A brief history of resuscitation - the influence of previous experience on modern techniques and methods].

    Science.gov (United States)

    Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam

    2015-02-01

    Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, the first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging the victims upside down and compressing the chest in order to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper regarding heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time, the guidelines have been modified and improved numerous times by the two leading world expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years. Currently, the 2010 guidelines apply. In this paper the authors attempt to present the history of the development of resuscitation techniques and methods and to assess the influence of previous lifesaving methods on today's technologies, equipment and guidelines, which help those women and men whose lives are in danger due to sudden cardiac arrest. © 2015 MEDPRESS.

  20. Bevacizumab plus chemotherapy in elderly patients with previously untreated metastatic colorectal cancer: single center experience

    International Nuclear Information System (INIS)

    Ocvirk, Janja; Moltara, Maja Ebert; Mesti, Tanja; Boc, Marko; Rebersek, Martina; Volk, Neva; Benedik, Jernej; Hlebanja, Zvezdana

    2016-01-01

    Metastatic colorectal cancer (mCRC) is mainly a disease of the elderly; however, the geriatric population is underrepresented in clinical trials. Patient registries represent a tool to assess and follow treatment outcomes in this patient population. The aim of the study was to use a patient registry to determine the safety and efficacy of bevacizumab plus chemotherapy in elderly patients with previously untreated metastatic colorectal cancer. The registry of patients with mCRC was designed to prospectively evaluate the safety and efficacy of bevacizumab-containing chemotherapy as well as selection of patients in routine clinical practice. Patient baseline clinical characteristics, pre-specified bevacizumab-related adverse events, and efficacy data were collected, evaluated and compared according to the age categories. Between January 2008 and December 2010, 210 patients with mCRC (median age 63, male 61.4%) started bevacizumab-containing therapy in the 1st-line setting. The majority of the 210 patients received irinotecan-based chemotherapy (68%) as 1st-line treatment and 105 patients (50%) received bevacizumab maintenance therapy. Elderly (≥ 70 years) patients presented 22.9% of all patients and they had worse performance status (PS 1/2, 62.4%) than patients in the < 70 years group (PS 1/2, 35.8%). The difference in disease control rate was mainly due to inability to assess response in the elderly group (64.6% in elderly and 77.8% in < 70 years group, p = 0.066). The median progression free survival was 10.2 (95% CI, 6.7–16.2) and 11.3 (95% CI, 10.2–12.6) months in the elderly and < 70 years group, respectively (p = 0.58). The median overall survival was 18.5 (95% CI, 12.4–28.9) and 27.4 (95% CI, 22.7–31.9) months for the elderly and < 70 years group, respectively (p = 0.03). The three-year survival rate was 26% and 37.6% in the elderly vs. < 70 years group (p = 0.03). Overall rates of bevacizumab-related adverse events were similar in both groups: proteinuria 21

  1. Frequency and clinical significance of previously undetected incidental findings detected on computed tomography simulation scans for breast cancer patients.

    Science.gov (United States)

    Nakamura, Naoki; Tsunoda, Hiroko; Takahashi, Osamu; Kikuchi, Mari; Honda, Satoshi; Shikama, Naoto; Akahane, Keiko; Sekiguchi, Kenji

    2012-11-01

    To determine the frequency and clinical significance of previously undetected incidental findings found on computed tomography (CT) simulation images for breast cancer patients. All CT simulation images were first interpreted prospectively by radiation oncologists and then double-checked by diagnostic radiologists. The official reports of CT simulation images for 881 consecutive postoperative breast cancer patients from 2009 to 2010 were retrospectively reviewed. Potentially important incidental findings (PIIFs) were defined as any previously undetected benign or malignancy-related findings requiring further medical follow-up or investigation. For all patients in whom a PIIF was detected, we reviewed the clinical records to determine the clinical significance of the PIIF. If the findings from the additional studies prompted by a PIIF required a change in management, the PIIF was also recorded as a clinically important incidental finding (CIIF). There were a total of 57 (6%) PIIFs. The 57 patients in whom a PIIF was detected were followed for a median of 17 months (range, 3-26). Six cases of CIIFs (0.7% of total) were detected. Of the six CIIFs, three (50%) cases had not been noted by the radiation oncologist until the diagnostic radiologist detected the finding. On multivariate analysis, previous CT examination was an independent predictor for PIIF (p = 0.04). Patients who had not previously received chest CT examinations within 1 year had a statistically significantly higher risk of PIIF than those who had received CT examinations within 6 months (odds ratio, 3.54; 95% confidence interval, 1.32-9.50; p = 0.01). The rate of incidental findings prompting a change in management was low. However, radiation oncologists appear to have some difficulty in detecting incidental findings that require a change in management. Considering cost, it may be reasonable that routine interpretations are given to those who have not received previous chest CT examinations within 1 year

  2. The relationship between emotional intelligence, previous caring experience and successful completion of a pre-registration nursing/midwifery degree.

    Science.gov (United States)

    Snowden, Austyn; Stenhouse, Rosie; Duers, Lorraine; Marshall, Sarah; Carver, Fiona; Brown, Norrie; Young, Jenny

    2018-02-01

    To examine the relationship between baseline emotional intelligence and prior caring experience with completion of pre-registration nurse and midwifery education. Selection and retention of nursing students is a global challenge. Emotional intelligence is well-conceptualized, measurable and an intuitive prerequisite to nursing values and so might be a useful selection criterion. Previous caring experience may also be associated with successful completion of nurse training. Prospective longitudinal study. Self-report trait and ability emotional intelligence scores were obtained from 876 student nurses from two Scottish Universities before they began training in 2013. Data on previous caring experience were recorded. Relationships between these metrics and successful completion of the course were calculated in SPSS version 23. Nurses completing their programme scored significantly higher on trait emotional intelligence than those that did not complete their programme. Nurses completing their programme also scored significantly higher on social connection scores than those that did not. There was no relationship between "ability" emotional intelligence and completion. Previous caring experience was not statistically significantly related to completion. Students with higher baseline trait emotional intelligence scores were statistically more likely to complete training than those with lower scores. This relationship also held using "Social connection" scores. At best, previous caring experience made no difference to students' chances of completing training. Caution is urged when interpreting these results because the headline findings mask considerable heterogeneity. Neither previous caring experience or global emotional intelligence measures should be used in isolation to recruit nurses. © 2017 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  3. Previous International Experience, Cross-Cultural Training, and Expatriates' Cross-Cultural Adjustment: Effects of Cultural Intelligence and Goal Orientation

    Science.gov (United States)

    Koo Moon, Hyoung; Kwon Choi, Byoung; Shik Jung, Jae

    2012-01-01

    Although various antecedents of expatriates' cross-cultural adjustment have been addressed, previous international experience, predeparture cross-cultural training, and cultural intelligence (CQ) have been most frequently examined. However, there are few attempts that explore the effects of these antecedents simultaneously or consider the possible…

  4. Pain related to mandibular block injections and its relationship with anxiety and previous experiences with dental anesthetics

    NARCIS (Netherlands)

    van Wijk, A.; Lindeboom, J.A.; de Jongh, A.; Tuk, J.G.; Hoogstraten, J.

    2012-01-01

    Objective. Anesthetic injections should reassure patients with the prospect of painless treatment, but for some patients it is the main source of their fear. We investigated pain resulting from mandibular block injections in relation to anxiety and previous experience with receiving injections.

  5. The Effect of Previous Co-Worker Experience on the Survival of Knowledge Intensive Start-Ups

    DEFF Research Database (Denmark)

    Timmermans, Bram

    The aim of the paper is to investigate the effect of previous co-worker experience on the survival of knowledge intensive start-ups. For the empirical analysis I use the Danish Integrated Database of Labor Market Research (IDA). This longitudinal employer-employee database allows me to identify co-worker experience among all members of the firm. In addition, I will make a distinction between ordinary start-ups and entrepreneurial spin-offs. The results show that previous co-worker experience has a positive effect on new firm survival. This effect appears to be valid predominantly for ordinary start-ups than

  6. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

    Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies, and more significantly, higher ASP values in the heads of wing bones than the shaft. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than those found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far, supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
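
    For an idealized hollow-cylinder cross-section, the conventional diameter-based estimate mentioned above reduces to a one-line formula, ASP = (inner radius / outer radius)^2. The sketch below uses made-up wing-phalanx dimensions purely for illustration:

    ```python
    def asp_tubular(outer_diameter: float, cortical_thickness: float) -> float:
        """Air Space Proportion of an idealized hollow-cylinder cross-section.

        ASP = air area / total area = (r_inner / r_outer) ** 2. Real bones are
        not perfect tubes, which is why whole-bone CT gives better estimates.
        """
        r_out = outer_diameter / 2.0
        r_in = r_out - cortical_thickness
        return (r_in / r_out) ** 2

    # Hypothetical pterosaur wing-phalanx numbers in mm: thin walls push ASP high
    print(asp_tubular(outer_diameter=20.0, cortical_thickness=0.7))  # ~0.86
    ```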

  7. Remote Access of Computer Controlled Experiments

    Directory of Open Access Journals (Sweden)

    Kristian Nilsson

    2008-11-01

    Full Text Available In this paper, we present a way for students to access and operate laboratory equipment, controlled by a laboratory computer, via a remote access program. In this way, the solution is not dependent on the specific laboratory equipment, as long as the equipment can be remotely controlled. The system can easily be altered to be used in another laboratory setup. Students are able to make reservations of experiment sessions through a web interface, which is administered by the system administrator. The solution proposed in this paper is one way to speed up the development of remotely accessible laboratories. Most of the proposed solution is based on open source software, and the hardware is built on ordinary consumer parts, which makes the proposed remote laboratory architecture cost effective.

  8. Amorphous nanoparticles — Experiments and computer simulations

    International Nuclear Information System (INIS)

    Hoang, Vo Van; Ganguli, Dibyendu

    2012-01-01

    The data obtained by both experiments and computer simulations concerning the amorphous nanoparticles for decades including methods of synthesis, characterization, structural properties, atomic mechanism of a glass formation in nanoparticles, crystallization of the amorphous nanoparticles, physico-chemical properties (i.e. catalytic, optical, thermodynamic, magnetic, bioactivity and other properties) and various applications in science and technology have been reviewed. Amorphous nanoparticles coated with different surfactants are also reviewed as an extension in this direction. Much attention is paid to the pressure-induced polyamorphism of the amorphous nanoparticles or amorphization of the nanocrystalline counterparts. We also introduce here nanocomposites and nanofluids containing amorphous nanoparticles. Overall, amorphous nanoparticles exhibit a disordered structure different from that of corresponding bulks or from that of the nanocrystalline counterparts. Therefore, amorphous nanoparticles can have unique physico-chemical properties differed from those of the crystalline counterparts leading to their potential applications in science and technology.

  9. Previous experiences and emotional baggage as barriers to lifestyle change - a qualitative study of Norwegian Healthy Life Centre participants.

    Science.gov (United States)

    Følling, Ingrid S; Solbjør, Marit; Helvik, Anne-S

    2015-06-23

    Changing lifestyle is challenging and difficult. The Norwegian Directorate of Health recommends that all municipalities establish Healthy Life Centres targeted to people with lifestyle issues. Little is known about the background, experiences and reflections of participants. More information is needed about participants to shape effective lifestyle interventions with lasting effect. This study explores how participants in a lifestyle intervention programme describe previous life experiences in relation to changing lifestyle. Semi-structured qualitative in-depth interviews were performed with 23 participants (16 women and 7 men) aged 18 - 70 years. The data were analysed using systematic text condensation searching for issues describing participants' responses, and looking for the essence, aiming to share the basis of life-world experiences as valid knowledge. Participants identified two main themes: being stuck in old habits, and being burdened with emotional baggage from their previous negative experiences. Participants expressed a wish to change their lifestyles, but were unable to act in accordance with the health knowledge they possessed. Previous experiences with lifestyle change kept them from initiating attempts without professional assistance. Participants also described being burdened by an emotional baggage with problems from childhood and/or with family, work and social life issues. Respondents said that they felt that emotional baggage was an important explanation for why they were stuck in old habits and that conversely, being stuck in old habits added load to their already emotional baggage and made it heavier. Behavioural change can be hard to perform as psychological distress from life baggage can influence the ability to change. The study participants' experience of being stuck in old habits and having substantial emotional baggage raises questions as to whether or not Healthy Life Centres are able to help participants who need to make a lifestyle

  10. Relationship between premature loss of primary teeth with oral hygiene, consumption of soft drinks, dental care, and previous caries experience.

    Science.gov (United States)

    López-Gómez, Sandra Aremy; Villalobos-Rodelo, Juan José; Ávila-Burgos, Leticia; Casanova-Rosado, Juan Fernando; Vallejos-Sánchez, Ana Alicia; Lucas-Rincón, Salvador Eduardo; Patiño-Marín, Nuria; Medina-Solís, Carlo Eduardo

    2016-02-26

    We determine the relationship between premature loss of primary teeth and oral hygiene, consumption of soft drinks, dental care and previous caries experience. This study focused on 833 Mexican schoolchildren aged 6-7. We performed an oral examination to determine caries experience and the simplified oral hygiene index. The dependent variable was the prevalence of at least one missing tooth (or indicated for extraction) of the primary dentition; this variable was coded as 0 = no loss of teeth and 1 = at least one lost primary tooth. The prevalence of at least one missing tooth was 24.7% (n = 206) (95% CI = 21.8-27.7). The variables that were associated with the prevalence of tooth loss (p < 0.05) were oral hygiene (OR = 3.24), a lower frequency of brushing (OR = 1.60), an increased consumption of soda (OR = 1.89) and use of dental care (curative: OR = 2.83, preventive: OR = 1.93). This study suggests that the premature loss of teeth in the primary dentition is associated with oral hygiene, consumption of soft drinks, dental care and previous caries experience in Mexican schoolchildren. These data provide relevant information for the design of preventive dentistry programs.

  11. A 20-year experience with liver transplantation for polycystic liver disease: does previous palliative surgical intervention affect outcomes?

    Science.gov (United States)

    Baber, John T; Hiatt, Jonathan R; Busuttil, Ronald W; Agopian, Vatche G

    2014-10-01

    Although it is the only curative treatment for polycystic liver disease (PLD), orthotopic liver transplantation (OLT) has been reserved for severely symptomatic, malnourished, or refractory patients who are not candidates for palliative disease-directed interventions (DDI). Data on the effect of previous DDIs on post-transplant morbidity and mortality are scarce. We analyzed the outcomes after OLT for PLD recipients, and determined the effects of previous palliative surgical intervention on post-transplantation morbidity and mortality. We performed a retrospective analysis of factors affecting perioperative outcomes after OLT for PLD between 1992 and 2013, including comparisons of recipients with previous major open DDIs (Open DDI, n = 12) with recipients with minimally invasive or no previous DDIs (minimal DDI, n = 16). Over the 20-year period, 28 recipients underwent OLT for PLD, with overall 30-day, 1-, and 5-year graft and patient survivals of 96%, 89%, 75%, and 96%, 93%, 79%, respectively. Compared with the minimal DDI group, open DDI recipients accounted for all 5 deaths, had inferior 90-day and 1- and 5-year survivals (83%, 83%, and 48% vs 100%, 100%, 100%; p = 0.009), and greater intraoperative (42% vs 0%; p = 0.003), total (58% vs 19%; p = 0.031), and Clavien grade IV or greater (50% vs 6%; p = 0.007) postoperative complications, more unplanned reoperations (50% vs 13%; p = 0.003), and longer total hospital (27 days vs 17 days; p = 0.035) and ICU (10 days vs 4 days; p = 0.045) stays. In one of the largest single-institution experiences of OLT for PLD, we report excellent long-term graft and patient survival. Previous open DDIs are associated with increased risks of perioperative morbidity and mortality. Improved identification of PLD patients bound for OLT may mitigate perioperative complications and potentially improve post-transplantation outcomes. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  12. Computational Certification under Limited Experiments (Preprint)

    Science.gov (United States)

    2017-04-26

    periodic microstructures," Computers & Structures, 82(7), pp. 593-606. [53] Smit, R., Brekelmans, W., and Meijer, H., 1998, "Prediction of the...55(11), pp. 1285-1322. [58] Zohdi, T. I., and Wriggers, P., 2008, An introduction to computational micromechanics, Springer Science &

  13. Does Previous Experience of Floods Stimulate the Adoption of Coping Strategies? Evidence from Cross Sectional Surveys in Nigeria and Tanzania

    Directory of Open Access Journals (Sweden)

    Sheila A. Boamah

    2015-11-01

    Full Text Available In sub-Saharan Africa, hydro-meteorological related disasters, such as floods, account for the majority of the total number of natural disasters. Over the past century, floods have affected 38 million people, claimed several lives and caused substantial economic losses in the region. The goal of this paper is to examine how personality disposition, social network, and socio-demographic factors mitigate the complex relationship between stressful life experiences of floods and ocean surges and the adoption of coping strategies among coastal communities in Nigeria and Tanzania. Generalized linear models (GLM were fitted to cross-sectional survey data on 1003 and 1253 individuals in three contiguous coastal areas in Nigeria and Tanzania, respectively. Marked differences in the type of coping strategies were observed across the two countries. In Tanzania, the zero-order relationships between adoption of coping strategies and age, employment and income disappeared at the multivariate level. Only experience of floods in the past year and social network resources were significant predictors of participants’ adoption of coping strategies, unlike in Nigeria, where a plethora of factors such as experience of ocean surges in the past one year, personality disposition, age, education, experience of flood in the past one year, ethnicity, income, housing quality and employment status were still statistically significant at the multivariate level. Our findings suggest that influence of previous experience on adoption of coping strategies is spatially ubiquitous. Consequently, context-specific policies aimed at encouraging the adoption of flood-related coping strategies in vulnerable locations should be designed based on local needs and orientation.

  14. Automated assessment of heart chamber volumes and function in patients with previous myocardial infarction using multidetector computed tomography

    DEFF Research Database (Denmark)

    Fuchs, Andreas; Kühl, Jørgen Tobias; Lønborg, Jacob

    2013-01-01

    Left ventricular (LV), right ventricular (RV), and left atrial (LA) volumes and functions contain important prognostic information in ischemic heart disease. Because multidetector computed tomography (MDCT) has high spatial resolution, this method may be optimal to obtain this information....

  15. [Analysis of single-photon emission computed tomography in patients with hypertensive encephalopathy complicated with previous hypertensive crisis].

    Science.gov (United States)

    Kustkova, H S

    2012-01-01

    In cerebrovascular disease, perfusion single-photon emission computed tomography with lipophilic amines is used for the diagnosis of functional disorders of cerebral blood flow. Quantitative calculations help clarify the nature of the vascular disease and assess the adequacy and effectiveness of treatment. Modern SPECT software makes it possible to calculate not only relative blood flow but also the absolute values of cerebral blood flow.

  16. Response Surface Model Building Using Orthogonal Arrays for Computer Experiments

    Science.gov (United States)

    Unal, Resit; Braun, Robert D.; Moore, Arlene A.; Lepsch, Roger A.

    1997-01-01

    This study investigates response surface methods for computer experiments and discusses some of the approaches available. Orthogonal arrays constructed for computer experiments are studied and an example application to a technology selection and optimization study for a reusable launch vehicle is presented.
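
    To make the approach concrete, the sketch below takes the first two columns of the standard Taguchi L9 orthogonal array as a design for a toy computer experiment and fits a full quadratic response surface by least squares. The simulator function is invented; this is a generic illustration, not the launch-vehicle study's actual procedure:

    ```python
    import numpy as np

    # Taguchi L9 orthogonal array (4 factors at 3 levels); use two columns here
    L9 = np.array([[1,1,1,1],[1,2,2,2],[1,3,3,3],
                   [2,1,2,3],[2,2,3,1],[2,3,1,2],
                   [3,1,3,2],[3,2,1,3],[3,3,2,1]])
    x = L9[:, :2] - 2                    # map levels {1,2,3} -> {-1,0,+1}

    def simulator(x1, x2):               # stand-in for an expensive computer code
        return 3 + 2*x1 - x2 + 0.5*x1*x2 + 0.8*x1**2

    y = simulator(x[:, 0], x[:, 1])

    # Full quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2
    X = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                         x[:, 0]**2, x[:, 1]**2, x[:, 0]*x[:, 1]])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(np.round(coef, 3))             # recovers [3, 2, -1, 0.8, 0, 0.5]
    ```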

  17. The Impact of Previous Action on Bargaining—An Experiment on the Emergence of Preferences for Fairness Norms

    Directory of Open Access Journals (Sweden)

    Thomas Neumann

    2017-08-01

    Full Text Available The communication of participants to identify an acceptable bargaining outcome in the Nash bargaining game is all about fairness norms. Participants introduce fairness norms which yield a better outcome for themselves in order to convince the other participant of their bargaining proposal. Typically, these fairness norms are in line with theoretical predictions, which support a wide variety of different but fair outcomes the participants can choose from. In this experiment, we play two treatments of the Nash bargaining game: in one treatment, the participants play a dictator game prior to bargaining, and in the other treatment they do not. We find that participants who have not played the dictator game intensively discuss the outcome of the game and come to solutions closer to the equal split of the pie the longer they chat. This effect vanishes as soon as the participants have previous experience from a dictator game: instead of chatting, they establish the fairness norm introduced in the dictator game. Remarkably, if the dictator is unfair in the dictator game, he also gets a higher share of the pie in the Nash bargaining game.

  18. The Affective Experience of Novice Computer Programmers

    Science.gov (United States)

    Bosch, Nigel; D'Mello, Sidney

    2017-01-01

    Novice students (N = 99) participated in a lab study in which they learned the fundamentals of computer programming in Python using a self-paced computerized learning environment involving a 25-min scaffolded learning phase and a 10-min unscaffolded fadeout phase. Students provided affect judgments at approximately 100 points (every 15 s) over the…

  19. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  20. Computing in support of experiments at LAMPF

    International Nuclear Information System (INIS)

    Thomas, R.F.; Amann, J.F.; Butler, H.S.

    1976-10-01

    This report documents the discussions and conclusions of a study, conducted in August 1976, of the requirements for computer support of the experimental program in medium-energy physics at the Clinton P. Anderson Meson Physics Facility. 1 figure, 1 table

  1. Electromagnetic Induction: A Computer-Assisted Experiment

    Science.gov (United States)

    Fredrickson, J. E.; Moreland, L.

    1972-01-01

    By using minimal equipment it is possible to demonstrate Faraday's Law. An electronic desk calculator enables sophomore students to solve a difficult mathematical expression for the induced EMF. Polaroid pictures of the plot of induced EMF, together with the computer facility, enable students to make comparisons. (PS)
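
    The comparison the desk calculator enabled can now be done numerically in a few lines. The sketch below, with arbitrary coil parameters, evaluates Faraday's law, EMF = -N dPhi/dt, by finite differences and checks it against the analytic derivative:

    ```python
    import numpy as np

    N, A, B0, f = 200, 1e-3, 0.05, 50.0    # turns, area m^2, field T, frequency Hz
    t = np.linspace(0, 0.04, 2000)
    flux = B0 * A * np.sin(2 * np.pi * f * t)          # flux through one turn

    emf_numeric = -N * np.gradient(flux, t)            # EMF = -N dPhi/dt
    emf_analytic = -N * B0 * A * 2 * np.pi * f * np.cos(2 * np.pi * f * t)
    print(np.abs(emf_numeric - emf_analytic).max())    # small discretization error
    ```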

  2. Influence of Previous Crop on Durum Wheat Yield and Yield Stability in a Long-term Experiment

    Directory of Open Access Journals (Sweden)

    Anna Maria Stellacci

    2011-02-01

    Full Text Available Long-term experiments are leading indicators of sustainability and serve as an early warning system to detect problems that may compromise future productivity. So the stability of yield is an important parameter to be considered when judging the value of a cropping system relative to others. In a long-term rotation experiment set up in 1972, the influence of different crop sequences on the yields and on the yield stability of durum wheat (Triticum durum Desf.) was studied. The complete field experiment is a split-split plot in a randomized complete block design with two replications; the whole experiment considers three crop sequences: 1) three-year crop rotation: sugar-beet, wheat + catch crop, wheat; 2) one-year crop rotation: wheat + catch crop; 3) wheat continuous crop; the split treatments are two different crop residue managements; the split-split plot treatments are 18 different fertilization formulas. Each phase of every crop rotation occurred every year. In this paper only one crop residue management and only one fertilization treatment have been analysed. Wheat crops in different rotations are coded as follows: F1: wheat after sugar-beet in three-year crop rotation; F2: wheat after wheat in three-year crop rotation; Fc+i: wheat in wheat + catch crop rotation; Fc: continuous wheat. The following two variables were analysed: grain yield and hectolitre weight. Repeated measures analyses of variance and stability analyses have been performed for the two variables. The stability analysis was conducted using: three variance methods, namely the coefficient of variability of Francis and Kannenberg, the ecovalence index of Wricke and the stability variance index of Shukla; the regression method of Eberhart and Russell; and a method, proposed by Piepho, that computes the probability of one system outperforming another system. It turned out that each of the stability methods used enriched the simple analysis of variance with additional information. The Piepho
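
    Two of the variance-based stability measures named above are short computations. The sketch below applies the Francis-Kannenberg coefficient of variability and Wricke's ecovalence to an invented yield table; the trial's real data are not reproduced here:

    ```python
    import numpy as np

    # Hypothetical grain yields (t/ha): rows = systems F1, F2, Fc+i, Fc; cols = years
    y = np.array([[4.1, 3.8, 4.5, 3.9, 4.2],
                  [3.6, 3.2, 4.0, 3.1, 3.7],
                  [3.4, 3.5, 3.3, 3.6, 3.2],
                  [2.9, 2.4, 3.3, 2.2, 3.0]])

    # Francis-Kannenberg: coefficient of variability per system (lower = more stable)
    cv = 100 * y.std(axis=1, ddof=1) / y.mean(axis=1)

    # Wricke ecovalence: each system's contribution to the G x E interaction
    resid = y - y.mean(axis=1, keepdims=True) - y.mean(axis=0) + y.mean()
    w = (resid ** 2).sum(axis=1)

    for name, c, wi in zip(["F1", "F2", "Fc+i", "Fc"], cv, w):
        print(f"{name}: CV = {c:.1f}%  W = {wi:.3f}")
    ```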

  3. Experiment Dashboard for Monitoring of the LHC Distributed Computing Systems

    International Nuclear Information System (INIS)

    Andreeva, J; Campos, M Devesas; Cros, J Tarragon; Gaidioz, B; Karavakis, E; Kokoszkiewicz, L; Lanciotti, E; Maier, G; Ollivier, W; Nowotka, M; Rocha, R; Sadykov, T; Saiz, P; Sargsyan, L; Sidorova, I; Tuckett, D

    2011-01-01

    LHC experiments are currently taking collision data. A distributed computing model chosen by the four main LHC experiments allows physicists to benefit from resources spread all over the world. The distributed model and the scale of LHC computing activities increase the level of complexity of middleware, and also the chances of possible failures or inefficiencies in the involved components. In order to ensure the required performance and functionality of the LHC computing system, monitoring the status of the distributed sites and services, as well as monitoring LHC computing activities, are among the key factors. Over the last years, the Experiment Dashboard team has been working on a number of applications that facilitate the monitoring of different activities, including following up jobs, transfers, and also site and service availabilities. This presentation describes the Experiment Dashboard applications used by the LHC experiments and the experience gained during the first months of data taking.

  4. Using sobol sequences for planning computer experiments

    Science.gov (United States)

    Statnikov, I. N.; Firsov, G. I.

    2017-12-01

    Discusses the use, for studying problems of multicriteria synthesis of dynamic systems, of the method of Planned LP-search (PLP-search), which not only allows one, on the basis of simulation-model experiments, to revise the parameter space within specified ranges of variation, but also, through the special randomized nature of the planning of these experiments, to apply quantitative statistical evaluation of the influence of the varied parameters and their pairwise combinations on the properties of the dynamic system.
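
    Although the abstract gives no implementation details, generating the kind of well-spread, randomized parameter plans that quasi-random sequences provide is a one-liner with SciPy's qmc module; the bounds and dimensionality below are hypothetical:

    ```python
    from scipy.stats import qmc

    # Three varied parameters of a hypothetical dynamic system, with their ranges
    l_bounds, u_bounds = [0.1, 1.0, 0.0], [2.0, 5.0, 10.0]

    sampler = qmc.Sobol(d=3, scramble=True, seed=42)
    unit_points = sampler.random_base2(m=6)          # 2**6 = 64 well-spread points
    design = qmc.scale(unit_points, l_bounds, u_bounds)

    print(design[:4])  # each row is one parameter vector for a simulation run
    ```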

  5. Model and Computing Experiment for Research and Aerosols Usage Management

    Directory of Open Access Journals (Sweden)

    Daler K. Sharipov

    2012-09-01

    Full Text Available The article deals with a mathematical model for the study and management of aerosols released into the atmosphere, as well as a numerical algorithm implemented in hardware and software systems for conducting computing experiments.
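
    The record does not specify the model equations, but a common core of such aerosol computing experiments is an advection-diffusion solver. The sketch below is a generic explicit 1-D scheme with invented parameters, not the authors' algorithm:

    ```python
    import numpy as np

    nx, dx, dt = 200, 10.0, 0.5      # grid points, spacing (m), time step (s)
    u, D = 2.0, 5.0                  # wind speed (m/s), eddy diffusivity (m^2/s)
    assert u * dt / dx <= 1.0 and 2 * D * dt / dx**2 <= 1.0  # stability limits

    c = np.zeros(nx)
    c[90:110] = 100.0                # initial aerosol puff

    for _ in range(500):
        adv = -u * dt / dx * (c - np.roll(c, 1))                  # upwind, u > 0
        dif = D * dt / dx**2 * (np.roll(c, -1) - 2*c + np.roll(c, 1))
        c = c + adv + dif            # periodic boundaries via np.roll

    print(c.max(), c.sum())          # the peak decays; total mass is conserved
    ```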

  6. A Computational Experiment on Single-Walled Carbon Nanotubes

    Science.gov (United States)

    Simpson, Scott; Lonie, David C.; Chen, Jiechen; Zurek, Eva

    2013-01-01

    A computational experiment that investigates single-walled carbon nanotubes (SWNTs) has been developed and employed in an upper-level undergraduate physical chemistry laboratory course. Computations were carried out to determine the electronic structure, radial breathing modes, and the influence of the nanotube's diameter on the…
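
    The diameter dependence such computations probe starts from the standard geometric relation d = a*sqrt(n^2 + nm + m^2)/pi for an (n, m) tube. The sketch below also applies an approximate empirical radial-breathing-mode relation, omega ~ A/d with A ~ 248 cm^-1 nm, which is one commonly quoted fit rather than a value from this paper:

    ```python
    import math

    def swnt_diameter_nm(n: int, m: int, a: float = 0.246) -> float:
        """Diameter of an (n, m) nanotube; a is the graphene lattice constant in nm."""
        return a * math.sqrt(n * n + n * m + m * m) / math.pi

    def rbm_frequency_cm1(d_nm: float, A: float = 248.0) -> float:
        """Approximate empirical radial-breathing-mode frequency, omega ~ A / d."""
        return A / d_nm

    d = swnt_diameter_nm(10, 10)     # the common (10,10) armchair tube
    print(f"d = {d:.3f} nm, RBM ~ {rbm_frequency_cm1(d):.0f} cm^-1")
    ```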

  7. The Information Science Experiment System - The computer for science experiments in space

    Science.gov (United States)

    Foudriat, Edwin C.; Husson, Charles

    1989-01-01

    The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.

  8. Computed tomographic identification of dysplasia and progression of osteoarthritis in dog elbows previously assigned OFA grades 0 and 1.

    Science.gov (United States)

    Kunst, Chelsea M; Pease, Anthony P; Nelson, Nathan C; Habing, Greg; Ballegeer, Elizabeth A

    2014-01-01

    Elbow dysplasia is a heritable disease that is a common cause of lameness and progressive elbow osteoarthritis in young large breed dogs. The Orthopedic Foundation for Animals (OFA) screens elbow radiographs, and assigns grades 0-3 based on presence and severity of bony proliferation on the anconeal process. Grade 1 is assigned when less than 3 mm is present and considered positive for dysplasia. We investigated the incidence of elbow dysplasia and progression of osteoarthritis in 46 elbows assigned grades 0 and 1 at least 1 year previously, using CT as a gold standard and with the addition of CT absorptiometry. The incidence of dysplasia based on CT was 62% in grade 0, and 75% in grade 1 elbows, all of which had medial coronoid disease. Progressive osteoarthritis at recheck was consistent with elbow dysplasia. The sensitivity and specificity of the OFA grade for elbow dysplasia compared to CT findings was 75% and 38%, respectively. Increased bone mineral density of the medial coronoid process as characterized by osteoabsorptiometry warrants further investigation with respect to elbow dysplasia. Proliferation on the anconeal process without CT evidence of dysplasia or osteoarthritis was present in 20% of the elbows, and is theorized to be an anatomic variant or enthesopathy of the olecranon ligament/synovium. Results of our study suggest that the "anconeal bump" used for elbow screening by the OFA is a relatively insensitive characteristic, and support the use of CT for identifying additional characteristics of elbow dysplasia. © 2014 American College of Veterinary Radiology.

  9. Methodological Potential of Computer Experiment in Teaching Mathematics at University

    Science.gov (United States)

    Lin, Kequan; Sokolova, Anna Nikolaevna; Vlasova, Vera K.

    2017-01-01

    The study is relevant due to the opportunity of increasing the efficiency of teaching mathematics at university by integrating computer experiments, conducted with the use of IT, into students' learning. The problem of the research is defined by a contradiction between the great potential opportunities of mathematical experiment for motivating and…

  10. Brookhaven Reactor Experiment Control Facility, a distributed function computer network

    International Nuclear Information System (INIS)

    Dimmler, D.G.; Greenlaw, N.; Kelley, M.A.; Potter, D.W.; Rankowitz, S.; Stubblefield, F.W.

    1975-11-01

    A computer network for real-time data acquisition, monitoring and control of a series of experiments at the Brookhaven High Flux Beam Reactor has been developed and has been set into routine operation. This reactor experiment control facility presently services nine neutron spectrometers and one x-ray diffractometer. Several additional experiment connections are in progress. The architecture of the facility is based on a distributed function network concept. A statement of implementation and results is presented

  11. Ontological and Epistemological Issues Regarding Climate Models and Computer Experiments

    Science.gov (United States)

    Vezer, M. A.

    2010-12-01

    Recent philosophical discussions (Parker 2009; Frigg and Reiss 2009; Winsberg 2009; Morgan 2002, 2003, 2005; Guala 2002) about the ontology of computer simulation experiments and the epistemology of inferences drawn from them are of particular relevance to climate science as computer modeling and analysis are instrumental in understanding climatic systems. How do computer simulation experiments compare with traditional experiments? Is there an ontological difference between these two methods of inquiry? Are there epistemological considerations that result in one type of inference being more reliable than the other? What are the implications of these questions with respect to climate studies that rely on computer simulation analysis? In this paper, I examine these philosophical questions within the context of climate science, instantiating concerns in the philosophical literature with examples found in analysis of global climate change. I concentrate on Wendy Parker’s (2009) account of computer simulation studies, which offers a treatment of these and other questions relevant to investigations of climate change involving such modelling. Two theses at the center of Parker’s account will be the focus of this paper. The first is that computer simulation experiments ought to be regarded as straightforward material experiments, which is to say, there is no significant ontological difference between computer and traditional experimentation. Parker’s second thesis is that some of the emphasis on the epistemological importance of materiality has been misplaced. I examine both of these claims. First, I inquire as to whether viewing computer and traditional experiments as ontologically similar in the way she does implies that there is no proper distinction between abstract experiments (such as ‘thought experiments’ as well as computer experiments) and traditional ‘concrete’ ones. Second, I examine the notion of materiality (i.e., the material commonality between

  12. Computer-Aided Experiment Planning toward Causal Discovery in Neuroscience

    Science.gov (United States)

    Matiasz, Nicholas J.; Wood, Justin; Wang, Wei; Silva, Alcino J.; Hsu, William

    2017-01-01

    Computers help neuroscientists to analyze experimental results by automating the application of statistics; however, computer-aided experiment planning is far less common, due to a lack of similar quantitative formalisms for systematically assessing evidence and uncertainty. While ontologies and other Semantic Web resources help neuroscientists to assimilate required domain knowledge, experiment planning requires not only ontological but also epistemological (e.g., methodological) information regarding how knowledge was obtained. Here, we outline how epistemological principles and graphical representations of causality can be used to formalize experiment planning toward causal discovery. We outline two complementary approaches to experiment planning: one that quantifies evidence per the principles of convergence and consistency, and another that quantifies uncertainty using logical representations of constraints on causal structure. These approaches operationalize experiment planning as the search for an experiment that either maximizes evidence or minimizes uncertainty. Despite work in laboratory automation, humans must still plan experiments and will likely continue to do so for some time. There is thus a great need for experiment-planning frameworks that are not only amenable to machine computation but also useful as aids in human reasoning. PMID:28243197

  13. PALSE: Python Analysis of Large Scale (Computer) Experiments

    OpenAIRE

    Cazals, Frédéric; Dreyfus, Tom; Malod-Dognin, Noël; Lhéritier, Alix

    2012-01-01

    A tenet of Science is the ability to reproduce the results, and a related issue is the possibility to archive and interpret the raw results of (computer) experiments. This paper presents an elementary python framework addressing this latter goal. Consider a computing pipeline consisting of raw data generation, raw data parsing, and data analysis i.e. graphical and statistical analysis. palse addresses these last two steps by leveraging the hierarchical structure of XML documents. More precise...
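
    As a flavor of the approach, the following minimal Python sketch archives raw results hierarchically in XML and computes a summary statistic from them; the tags and schema are invented for illustration and are not PALSE's actual format.

    import statistics
    import xml.etree.ElementTree as ET

    # Raw results of a (computer) experiment archived as hierarchical XML;
    # the tags below are invented, not PALSE's actual schema.
    raw = """
    <experiments>
      <run algo="greedy"><value>0.91</value><value>0.88</value></run>
      <run algo="exact"><value>0.97</value><value>0.95</value></run>
    </experiments>
    """

    root = ET.fromstring(raw)
    for run in root.iter("run"):
        values = [float(v.text) for v in run.iter("value")]
        print(run.get("algo"), "mean =", statistics.mean(values))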

  14. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  15. The Impact of Previous Schooling Experiences on a Quaker High School's Graduating Students' College Entrance Exam Scores, Parents' Expectations, and College Acceptance Outcomes

    Science.gov (United States)

    Galusha, Debbie K.

    2010-01-01

    The purpose of the study is to determine the impact of previous private, public, home, or international schooling experiences on a Quaker high school's graduating students' college entrance composite exam scores, parents' expectations, and college attendance outcomes. The study's results suggest that regardless of previous private, public, home,…

  16. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, at polynomial cost. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth in the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, including the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, beyond what classical computation can offer.
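
    One of the "typical procedures" such reviews cover is Trotterized time evolution. As a purely classical illustration (numpy/scipy, not a quantum implementation), the sketch below shows the first-order Lie-Trotter splitting exp(-i(A+B)t) ~ [exp(-iAt/n) exp(-iBt/n)]^n converging as the number of steps n grows; the two-level Hamiltonian is an arbitrary example.

    import numpy as np
    from scipy.linalg import expm

    # Classical illustration of first-order Trotterization:
    #   exp(-i(A+B)t) ~ [exp(-iAt/n) exp(-iBt/n)]^n,
    # here for an arbitrary two-level Hamiltonian H = 0.7*X + 0.4*Z.
    X = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli X
    Z = np.array([[1, 0], [0, -1]], dtype=complex)   # Pauli Z
    A, B, t = 0.7 * X, 0.4 * Z, 1.0

    exact = expm(-1j * (A + B) * t)
    for n in (1, 4, 16, 64):
        step = expm(-1j * A * t / n) @ expm(-1j * B * t / n)
        error = np.linalg.norm(np.linalg.matrix_power(step, n) - exact, 2)
        print(f"n = {n:3d}   error = {error:.2e}")   # shrinks roughly as 1/n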

  17. Experiences of Computer Science Curriculum Design: A Phenomenological Study

    Science.gov (United States)

    Sloan, Arthur; Bowe, Brian

    2015-01-01

    This paper presents a qualitative study of 12 computer science lecturers' experiences of curriculum design of several degree programmes during a time of transition from year-long to semesterised courses, due to institutional policy change. The background to the study is outlined, as are the reasons for choosing the research methodology. The main…

  18. Ipilimumab in the real world: the UK expanded access programme experience in previously treated advanced melanoma patients.

    Science.gov (United States)

    Ahmad, Saif S; Qian, Wendi; Ellis, Sarah; Mason, Elaine; Khattak, Muhammad A; Gupta, Avinash; Shaw, Heather; Quinton, Amy; Kovarikova, Jarmila; Thillai, Kiruthikah; Rao, Ankit; Board, Ruth; Nobes, Jenny; Dalgleish, Angus; Grumett, Simon; Maraveyas, Anthony; Danson, Sarah; Talbot, Toby; Harries, Mark; Marples, Maria; Plummer, Ruth; Kumar, Satish; Nathan, Paul; Middleton, Mark R; Larkin, James; Lorigan, Paul; Wheater, Matthew; Ottensmeier, Christian H; Corrie, Pippa G

    2015-10-01

    Before licensing, ipilimumab was first made available to previously treated advanced melanoma patients through an expanded access programme (EAP) across Europe. We interrogated data from UK EAP patients to inform future clinical practice. Clinicians registered in the UK EAP provided anonymized patient data using a prespecified variable fields datasheet. Data collected were baseline patient characteristics, treatment delivered, toxicity, response, progression-free survival and overall survival (OS). Data were received for 193 previously treated metastatic melanoma patients, whose primary sites were cutaneous (82%), uveal (8%), mucosal (2%), acral (3%) or unknown (5%). At baseline, 88% of patients had a performance status (PS) of 0-1 and 20% had brain metastases. Of the patients, 53% received all four planned cycles of ipilimumab; the most common reason for stopping early was disease progression, including death from melanoma. Toxicity was recorded for 171 patients, 30% of whom experienced an adverse event of grade 3 or higher, the most common being diarrhoea (13%) and fatigue (9%). At a median follow-up of 23 months, the median progression-free survival and OS were 2.8 and 6.1 months, respectively; the 1-year and 2-year OS rates were 31 and 14.8%, respectively. The 2-year OS was significantly lower for patients with poorer PS (P<0.0001), low albumin concentrations (P<0.0001), the presence of brain metastases (P=0.007) and lactate dehydrogenase levels more than two times the upper limit of normal (P<0.0001) at baseline. These baseline characteristics are negative predictors of benefit from ipilimumab and should be taken into consideration before prescription.

  19. The Impact of Previous Athletic Experience on Current Physical Fitness in Former Collegiate Athletes and Noncollegiate Athletes.

    Science.gov (United States)

    Simon, Janet E; Docherty, Carrie L

    Physical activity performed at moderate intensity is associated with reduced risk of mortality, cardiovascular disease, hypertension, and some types of cancers. However, vigorous physical activity during participation in college athletics may increase the risk of injury, which might limit future physical activity levels. To evaluate differences in current physical fitness levels between former Division I athletes and noncollegiate athletes. Cross-sectional study. Level 3. The sample was recruited from a large midwestern university alumni database and consisted of 2 cohorts: (1) former Division I athletes (n = 100; mean age, 53.1 ± 7.4 years) and (2) nonathletes who were active in college (n = 100; age, 51.4 ± 7.3 years). Individuals answered a demographics questionnaire and completed a physical fitness assessment consisting of 7 measures: percent body fat, 1-mile walk, sit-to-stand test, push-up, half sit-up test, sit and reach test, and back scratch test. Performance was significantly worse for former Division I athletes compared with nonathletes for percent body fat (mean difference, 7.58%; F(1, 198) = 59.91), the sit-to-stand test (mean difference, 4.3 repetitions; F(1, 198) = 6.59; P = 0.01), and the push-up test (mean difference, 8.9 repetitions; F(1, 198) = 7.35; P = 0.01). Former Division I athletes may be limited because of previous injury, inhibiting their ability to stay active later in life. It is imperative that clinicians, coaches, and strength and conditioning specialists understand the possible future repercussions from competing at the Division I level.

  20. Doctors' experience with handheld computers in clinical practice: qualitative study.

    Science.gov (United States)

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-05-15

    To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. 54 doctors who did or did not use handheld computers. Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care.

  1. SOFTWARE TOOLS FOR COMPUTING EXPERIMENT AIMED AT MULTIVARIATE ANALYSIS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    A. V. Tyurin

    2015-09-01

    Full Text Available A concept for organizing and planning a computational experiment aimed at multivariate analysis of complex multifactor models is proposed. It is based on the generation of a calculations tree. The logical and structural schemes of the tree are given, along with software tools for automating work with it: generating calculations, carrying them out, and analyzing the obtained results. Computer modeling systems, and such special-purpose systems as RACS and PRADIS, do not solve the problems connected with carrying out a computational experiment effectively: its organization, planning, execution, and the analysis of the results. For organizing a computational experiment, calculation data storage is proposed in the form of an input and output data tree; each tree node holds a reference to the calculation of the model step performed earlier. The calculations tree is stored in a specially organized directory structure. A software tool is proposed for creating and modifying a design scheme that stores the structure of one branch of the calculations tree, with a view to effective planning of multivariate calculations. A set of special-purpose software tools makes possible the quick generation and modification of the tree and the addition of calculations with step-by-step changes in the model factors. To perform calculations, a software environment in the form of a graphical user interface for creating and modifying calculation scripts has been developed. This environment makes it possible to traverse the calculations tree in a certain order and to initiate computational modules serially or in parallel. To analyze the results, a software tool operating on a tag tree has been developed. This special tree stores the input and output data of the calculations in the form of sets of changes of the appropriate model factors. The tool makes it possible to select the factors and responses of the model at various steps
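
    A minimal Python sketch of the calculations-tree idea described above, with hypothetical names: each node records the model factors changed at its step and a reference to the parent calculation, and traversal can drive serial or parallel runs.

    from dataclasses import dataclass, field

    # Minimal sketch of a calculations tree in the spirit described above:
    # each node stores the model factors for this step and a reference to
    # the parent calculation it was derived from. All names are hypothetical.
    @dataclass
    class CalcNode:
        factors: dict                        # model factor values for this step
        parent: "CalcNode | None" = None
        children: list = field(default_factory=list)

        def spawn(self, **changes):
            """Add a calculation that varies some factors of this one."""
            child = CalcNode({**self.factors, **changes}, parent=self)
            self.children.append(child)
            return child

        def walk(self):
            """Traverse the tree, e.g., to launch runs serially or in parallel."""
            yield self
            for c in self.children:
                yield from c.walk()

    root = CalcNode({"pressure": 1.0, "temp": 300})
    root.spawn(temp=350).spawn(pressure=2.0)     # step-by-step factor changes
    for node in root.walk():
        print(node.factors)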

  2. Computational Physics Undergraduate Research Experience (A case Study)

    Science.gov (United States)

    Sadaghiani, Homeyra; Small, Alex

    2009-03-01

    There is a growing trend toward the inclusion of more research programs in undergraduate education. In spite of that, the assessment of undergraduate research experience in physics is limited. This presentation describes a ten-week undergraduate summer research experience in computational physics, in the field of biophysics, for two upper-division physics students at Cal Poly Pomona. Analysis of pre/post-test data suggests greater gains in research methodologies and skills than in the physical concepts underlying the research project. We also discuss student attitude change measured by survey and interviews.

  3. Framework for emotional mobile computation for creating entertainment experience

    Science.gov (United States)

    Lugmayr, Artur R.

    2007-02-01

    Ambient media are media that manifest in the natural environment of the consumer. The perceivable borders between the media and the context in which they are used are becoming more and more blurred. The consumer is moving through a digital space of services throughout his daily life. As we develop towards an experience society, the central point in the development of services is the creation of a consumer experience. This paper reviews the possibilities and potentials of creating entertainment experiences on mobile phone platforms. It reviews sensor networks capable of acquiring consumer behavior data, interactivity strategies, and psychological models for emotional computation on mobile phones, and it lays the foundations of a nomadic experience society. The paper rounds off with a presentation of several possible service scenarios in the field of entertainment and leisure computation on mobiles. The goal of this paper is to present a framework and an evaluation of the possibilities of applying sensor technology on mobile platforms to create an enhanced consumer entertainment experience.

  4. SAMGrid experiences with the Condor technology in Run II computing

    International Nuclear Information System (INIS)

    Baranovski, A.; Loebel-Carpenter, L.; Garzoglio, G.; Herber, R.; Illingworth, R.; Kennedy, R.; Kreymer, A.; Kumar, A.; Lueking, L.; Lyon, A.; Merritt, W.; Terekhov, I.; Trumbo, J.; Veseli, S.; White, S.; St. Denis, R.; Jain, S.; Nishandar, A.

    2004-01-01

    SAMGrid is a globally distributed system for data handling and job management, developed at Fermilab for the D0 and CDF experiments in Run II. The Condor system is being developed at the University of Wisconsin for management of distributed resources, computational and otherwise. We briefly review the SAMGrid architecture and its interaction with Condor, which was presented earlier. We then present our experiences using the system in production, which have two distinct aspects. At the global level, we deployed Condor-G, the Grid-extended Condor, for the resource brokering and global scheduling of our jobs. At the heart of the system is Condor's Matchmaking Service. In more recent work at the computing element level, we have been benefiting from the large computing cluster at the University of Wisconsin campus. The architecture of the computing facility and the philosophy of Condor's resource management have prompted us to improve the application infrastructure for D0 and CDF, in aspects such as parting with the shared file system and with reliance on dedicated resources. As a result, we have increased productivity and made our applications more portable and Grid-ready. Our fruitful collaboration with the Condor team has been made possible by the Particle Physics Data Grid

  5. Introduction to optical computing applied to high energy physics experiments

    International Nuclear Information System (INIS)

    Metzger, G.

    1978-01-01

    The topic covered is 'Electro-optical means of implementation of a special-purpose processor dedicated to high-energy physics experiments'. Basic principles of optical computing are given and some means to overcome the limitations of the techniques are treated: analysis of real time electro-optical transducers in respect of their matching to high energy detectors, special complex filtering dedicated to pattern analysis. (Auth.)

  6. Multilink manipulator computer control: experience in development and commissioning

    International Nuclear Information System (INIS)

    Holt, J.E.

    1988-11-01

    This report describes development which has been carried out on the multilink manipulator computer control system. The system allows the manipulator to be driven using only two joysticks. The leading link is controlled and the other links follow its path into the reactor, thus avoiding any potential obstacles. The system has been fully commissioned and used with the Sizewell 'A' reactor 2 Multilink TV manipulator. Experience of the use of the system is presented, together with recommendations for future improvements. (author)

  7. Unsteady Thick Airfoil Aerodynamics: Experiments, Computation, and Theory

    Science.gov (United States)

    Strangfeld, C.; Rumsey, C. L.; Mueller-Vahl, H.; Greenblatt, D.; Nayeri, C. N.; Paschereit, C. O.

    2015-01-01

    An experimental, computational and theoretical investigation was carried out to study the aerodynamic loads acting on a relatively thick NACA 0018 airfoil when subjected to pitching and surging, individually and synchronously. Both pre-stall and post-stall angles of attack were considered. Experiments were carried out in a dedicated unsteady wind tunnel, with large surge amplitudes, and airfoil loads were estimated by means of unsteady surface-mounted pressure measurements. Theoretical predictions were based on Theodorsen's and Isaacs' results as well as on the relatively recent generalizations of van der Wall. Both two- and three-dimensional computations were performed on structured grids employing unsteady Reynolds-averaged Navier-Stokes (URANS). For pure surging at pre-stall angles of attack, the correspondence between experiments and theory was satisfactory; this served as a validation of Isaacs' theory. Discrepancies were traced to dynamic trailing-edge separation, even at low angles of attack. Excellent correspondence was found between experiments and theory for airfoil pitching as well as combined pitching and surging; the latter appears to be the first clear validation of van der Wall's theoretical results. Although qualitatively similar to experiment at low angles of attack, two-dimensional URANS computations yielded notable errors in the unsteady load effects of pitching, surging and their synchronous combination. The main reason is believed to be that the URANS equations do not resolve wake vorticity (explicitly modeled in the theory) or the resulting rolled-up unsteady flow structures because high values of eddy viscosity tend to "smear" the wake. At post-stall angles, three-dimensional computations illustrated the importance of modeling the tunnel side walls.
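
    The Theodorsen-based predictions mentioned above rest on the lift-deficiency function C(k) = H1^(2)(k) / (H1^(2)(k) + i H0^(2)(k)), where H_n^(2) are Hankel functions of the second kind and k is the reduced frequency. A short Python/scipy sketch evaluating it; the sample frequencies are arbitrary.

    import numpy as np
    from scipy.special import hankel2

    def theodorsen(k):
        """Theodorsen's lift-deficiency function C(k) for reduced frequency k."""
        h0, h1 = hankel2(0, k), hankel2(1, k)
        return h1 / (h1 + 1j * h0)

    for k in (0.05, 0.2, 1.0):                 # arbitrary sample frequencies
        c = theodorsen(k)
        print(f"k = {k:4.2f}   |C| = {abs(c):.3f}   "
              f"phase = {np.angle(c, deg=True):6.1f} deg")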

  8. Expertik: Experience with Artificial Intelligence and Mobile Computing

    Directory of Open Access Journals (Sweden)

    José Edward Beltrán Lozano

    2013-06-01

    Full Text Available This article presents the experience of developing services based on Artificial Intelligence, Service Oriented Architecture, and mobile computing. It aims to combine the technology offered by mobile computing with artificial intelligence techniques in order to provide, through a service, diagnostic solutions to problems in industrial maintenance. For service creation, the elements of an expert system are identified: the knowledge base, the inference engine, and the interfaces for knowledge acquisition and consultation. The applications were developed in ASP.NET under a three-layer architecture. The data layer was developed in SQL Server together with data-management classes; the business layer in VB.NET; and the presentation layer in ASP.NET with XHTML. Web interfaces for knowledge acquisition and consultation were developed for both Web and Mobile Web. The inference engine was implemented as a web service built around a fuzzy-logic model (initially an exact rule-based logic) to resolve requests from the knowledge-consulting applications. This experience seeks to strengthen a technology-based company offering AI-based services to service companies in Colombia.
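
    To make the expert-system core concrete, here is a minimal forward-chaining sketch. It is written in Python for consistency with the other examples in this collection (the article's system used ASP.NET/VB.NET and a fuzzy-logic model), and the maintenance rules are invented placeholders.

    # Illustrative sketch of an expert-system core: a knowledge base of rules
    # and a simple forward-chaining inference engine. The rules are invented;
    # the article's implementation used a web service with a fuzzy-logic model.
    RULES = [
        ({"vibration_high", "temp_high"}, "check bearing lubrication"),
        ({"vibration_high", "temp_normal"}, "check shaft alignment"),
        ({"pressure_low"}, "inspect pump seals"),
    ]

    def diagnose(symptoms):
        """Fire every rule whose conditions are satisfied by the symptoms."""
        observed = set(symptoms)
        return [advice for conditions, advice in RULES if conditions <= observed]

    print(diagnose(["vibration_high", "temp_high"]))
    # -> ['check bearing lubrication']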

  9. Exploring the experience of clients with tetraplegia utilizing assistive technology for computer access.

    Science.gov (United States)

    Folan, Alyce; Barclay, Linda; Cooper, Cathy; Robinson, Merren

    2015-01-01

    Assistive technology for computer access can be used to facilitate people with a spinal cord injury to utilize mainstream computer applications, thereby enabling participation in a variety of meaningful occupations. The aim of this study was to gain an understanding of the experiences of clients with tetraplegia trialing assistive technologies for computer access during different stages in a public rehabilitation service. In order to explore the experiences of clients with tetraplegia trialing assistive technologies for computer use, qualitative methodology was selected. Data were collected from seven participants using semi-structured interviews, which were audio-taped, transcribed and analyzed thematically. Three main themes were identified. These were: getting back into life, assisting in adjusting to injury, and learning new skills. The findings from this study demonstrated that people with tetraplegia can be assisted to return to previous life roles or engage in new roles, through developing skills in the use of assistive technology for computer access. Being able to use computers for meaningful activities contributed to the participants gaining an enhanced sense of self-efficacy, and thereby quality of life. Implications for Rehabilitation: Findings from this pilot study indicate that people with tetraplegia can be assisted to return to previous life roles, and develop new roles that have meaning to them, through the use of assistive technologies for computer use. Being able to use the internet to socialize, and complete daily tasks, contributed to the participants gaining a sense of control over their lives. Early introduction to assistive technology is important to ensure sufficient time for newly injured people to feel comfortable enough with the assistive technology to use the computers productively by the time of discharge. Further research into this important and expanding area is indicated.

  10. On the computer simulation of the EPR-Bohm experiment

    International Nuclear Information System (INIS)

    McGoveran, D.O.; Noyes, H.P.; Manthey, M.J.

    1988-12-01

    We argue that supraluminal correlation without supraluminal signaling is a necessary consequence of any finite and discrete model for physics. Every day, the commercial and military practice of using encrypted communication based on correlated, pseudo-random signals illustrates this possibility. All that is needed are two levels of computational complexity which preclude using a smaller system to detect departures from "randomness" in the larger system. Hence the experimental realizations of the EPR-Bohm experiment leave open the question of whether the world of experience is "random" or pseudo-random. The latter possibility could be demonstrated experimentally if a complexity parameter related to the arm length and switching time in an Aspect-type realization of the EPR-Bohm experiment is sufficiently small compared to the number of reliable total counts which can be obtained in practice. 6 refs
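
    The shared pseudo-random-signal analogy is easy to demonstrate: two isolated "stations" seeded identically produce perfectly correlated outputs without any signaling, and a weaker observer cannot tell the streams from random ones without knowing the generator. A minimal Python sketch, illustrative only:

    import random

    # Two isolated "stations" sharing only a seed: their outputs are perfectly
    # correlated although no signal passes between them, echoing the
    # encrypted-communication analogy in the abstract.
    def station(seed, n):
        rng = random.Random(seed)
        return [rng.randint(0, 1) for _ in range(n)]

    a = station(42, 20)
    b = station(42, 20)        # same seed, no communication
    print(a == b)              # True: perfect correlation
    print(sum(a) / len(a))     # stream still looks statistically unremarkable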

  11. Experiments and computation of onshore breaking solitary waves

    DEFF Research Database (Denmark)

    Jensen, A.; Mayer, Stefan; Pedersen, G.K.

    2005-01-01

    This is a combined experimental and computational study of solitary waves that break on-shore. Velocities and accelerations are measured by a two-camera PIV technique and compared to theoretical values from an Euler model with a VOF method for the free surface. In particular, the dynamics of a so-called collapsing breaker is scrutinized and the closure between the breaker and the beach is found to be akin to slamming. To the knowledge of the authors, no velocity measurements for this kind of breaker have been previously reported.

  12. Distributing the computation in combinatorial optimization experiments over the cloud

    Directory of Open Access Journals (Sweden)

    Mario Brcic

    2017-12-01

    Full Text Available Combinatorial optimization is an area of great importance, since many real-world problems have discrete parameters that are part of the objective function to be optimized. The development of combinatorial optimization algorithms is guided by empirical study of candidate ideas and their performance over a wide range of settings or scenarios, from which general conclusions are inferred. The number of scenarios can be overwhelming, especially when modeling uncertainty in some of the problem's parameters. Since the process is also iterative and many ideas and hypotheses may be tested, the execution time of each experiment plays an important role in efficiency and success. The structure of such experiments allows significant improvement in execution time by distributing the computation. We focus on cloud computing as a cost-efficient solution in these circumstances. In this paper we present a system for validating and comparing stochastic combinatorial optimization algorithms. The system also deals with selecting the optimal settings for computational nodes, and the number of nodes, in terms of the performance-cost tradeoff. We present applications of the system to a new class of project scheduling problem. We show that the selection over cloud service providers can be optimized as one of the settings and that, according to the model, this resulted in substantial cost savings while meeting the deadline.
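
    A toy Python version of the node-selection subproblem described above: choose the instance type and node count that meet a deadline at minimum cost. All prices and throughputs below are invented placeholders, not the paper's data.

    # Toy node-selection problem: pick the instance type and node count that
    # meet the deadline at minimum cost. Prices and throughputs are invented.
    INSTANCE_TYPES = {          # name: (scenarios/hour per node, $/hour per node)
        "small":  (100, 0.10),
        "medium": (220, 0.20),
        "large":  (480, 0.40),
    }
    SCENARIOS, DEADLINE_H, MAX_NODES = 50_000, 6, 64

    best = None
    for name, (rate, price) in INSTANCE_TYPES.items():
        for nodes in range(1, MAX_NODES + 1):
            hours = SCENARIOS / (rate * nodes)
            if hours <= DEADLINE_H:                 # smallest feasible count
                cost = hours * price * nodes
                if best is None or cost < best[0]:
                    best = (cost, name, nodes, hours)
                break

    cost, name, nodes, hours = best
    print(f"{nodes} x {name}: {hours:.1f} h for ${cost:.2f}")

    In this linear toy the cost per scenario is independent of the node count, so the smallest feasible count suffices; the paper's model additionally optimizes over providers and per-node settings.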

  13. Computational design and analysis of flatback airfoil wind tunnel experiment.

    Energy Technology Data Exchange (ETDEWEB)

    Mayda, Edward A. (University of California, Davis, CA); van Dam, C.P. (University of California, Davis, CA); Chao, David D. (University of California, Davis, CA); Berg, Dale E.

    2008-03-01

    A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid-blockage conditions and to reaffirm the favorable effect of a blunt trailing edge, or flatback, on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of airfoils with a 40% maximum thickness-to-chord ratio at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp-trailing-edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.

  14. Experience building and operating the CMS Tier-1 computing centres

    Science.gov (United States)

    Albert, M.; Bakken, J.; Bonacorsi, D.; Brew, C.; Charlot, C.; Huang, Chih-Hao; Colling, D.; Dumitrescu, C.; Fagan, D.; Fassi, F.; Fisk, I.; Flix, J.; Giacchetti, L.; Gomez-Ceballos, G.; Gowdy, S.; Grandi, C.; Gutsche, O.; Hahn, K.; Holzman, B.; Jackson, J.; Kreuzer, P.; Kuo, C. M.; Mason, D.; Pukhaeva, N.; Qin, G.; Quast, G.; Rossman, P.; Sartirana, A.; Scheurer, A.; Schott, G.; Shih, J.; Tader, P.; Thompson, R.; Tiradani, A.; Trunov, A.

    2010-04-01

    The CMS Collaboration relies on 7 globally distributed Tier-1 computing centres located at large universities and national laboratories for a second custodial copy of the CMS RAW data and primary copy of the simulated data, data serving capacity to Tier-2 centres for analysis, and the bulk of the reprocessing and event selection capacity in the experiment. The Tier-1 sites have a challenging role in CMS because they are expected to ingest and archive data from both CERN and regional Tier-2 centres, while they export data to a global mesh of Tier-2s at rates comparable to the raw export data rate from CERN. The combined capacity of the Tier-1 centres is more than twice the resources located at CERN and efficiently utilizing these large distributed resources represents a challenge. In this article we will discuss the experience building, operating, and utilizing the CMS Tier-1 computing centres. We will summarize the facility challenges at the Tier-1s including the stable operations of CMS services, the ability to scale to large numbers of processing requests and large volumes of data, and the ability to provide custodial storage and high performance data serving. We will also present the operations experience utilizing the distributed Tier-1 centres from a distance: transferring data, submitting data serving requests, and submitting batch processing requests.

  15. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Laurent, Alexis; Miseljic, Mirko

    2012-01-01

    While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research, and specific guidance on how to practically apply these methods are still very much under development. This paper evaluates how research efforts have applied LCA and RA together for NM, particularly reflecting on previous experiences with applying these methods to chemicals. Through a literature review and a separate analysis of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key "lessons learned" from previous experience with chemicals while many key challenges remain for practically applying these methods to NM. We identified two main approaches

  16. Assessment of the Relationship between Recurrent High-risk Pregnancy and Mothers’ Previous Experience of Having an Infant Admitted to a Neonatal Intensive Care Unit

    Directory of Open Access Journals (Sweden)

    Sedigheh Hantoosh Zadeh

    2015-01-01

    Full Text Available Background & aim: High-risk pregnancies increase the risk of Intensive Care Unit (ICU) and Neonatal Intensive Care Unit (NICU) admission in mothers and their newborns. In this study, we aimed to identify the association between the recurrence of high-risk pregnancy and mothers' previous experience of having an infant admitted to the NICU. Methods: We performed a retrospective cohort study to compare subsequent pregnancy outcomes among 232 control subjects and 200 female cases with a previous experience of having a newborn requiring NICU admission due to intrauterine growth retardation, preeclampsia, preterm birth, premature rupture of membranes, and asphyxia. The information about the prevalence of subsequent high-risk pregnancies was gathered via phone calls. Results: As the results indicated, heparin, progesterone, and aspirin were more frequently administered in the case group during subsequent pregnancies, compared to the control group (P

  17. Early experiences of computer-aided assessment and administration when teaching computer programming

    OpenAIRE

    Abdullah Mohd Zin; Neil Gutteridge; Eric Foxley; Edmund Burke; Steve Benford

    1993-01-01

    This paper describes early experiences with the Ceilidh system currently being piloted at over 30 institutions of higher education. Ceilidh is a course-management system for teaching computer programming whose core is an auto-assessment facility. This facility automatically marks students' programs from a range of perspectives, and may be used in an iterative manner, enabling students to work towards a target level of attainment. Ceilidh also includes extensive course-administration and progres...

  18. Monitoring the LHCb Experiment Computing Infrastructure with NAGIOS

    CERN Document Server

    Bonaccorsi, E

    2009-01-01

    LHCb has a large and complex infrastructure consisting of thousands of servers and embedded computers, hundreds of network devices, and a lot of common infrastructure services such as shared storage, login and time services, databases, and many others. All operationally critical aspects are integrated into the standard Experiment Control System (ECS) based on PVSSII, enabling non-expert operators to take first-line actions. At the lower level, and in particular for monitoring the infrastructure, the Control System itself depends on a secondary infrastructure, whose monitoring is based on NAGIOS. We present the design and implementation of the fabric management based on NAGIOS. Care has been taken to complement rather than duplicate functionality available in the Experiment Control System.
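
    NAGIOS integrates custom checks through a simple plugin contract: a check prints one status line and exits with 0 (OK), 1 (WARNING), 2 (CRITICAL) or 3 (UNKNOWN). A minimal Python check in that style; the monitored resource and thresholds are hypothetical, not LHCb's actual checks.

    #!/usr/bin/env python3
    # Minimal custom check following the standard NAGIOS plugin contract:
    # print one status line and exit 0 (OK), 1 (WARNING), 2 (CRITICAL)
    # or 3 (UNKNOWN). The thresholds and checked path are hypothetical.
    import shutil
    import sys

    WARN, CRIT = 0.80, 0.90      # fraction of disk used

    def main(path="/"):
        try:
            usage = shutil.disk_usage(path)
        except OSError as exc:
            print(f"DISK UNKNOWN - {exc}")
            return 3
        used = 1 - usage.free / usage.total
        if used >= CRIT:
            print(f"DISK CRITICAL - {used:.0%} used on {path}")
            return 2
        if used >= WARN:
            print(f"DISK WARNING - {used:.0%} used on {path}")
            return 1
        print(f"DISK OK - {used:.0%} used on {path}")
        return 0

    if __name__ == "__main__":
        sys.exit(main())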

  19. Experience Building and Operating the CMS Tier-1 Computing Centres

    CERN Document Server

    Albert, M; Bonacorsi, D; Brew, C; Charlot, C; Huang, Chih-Hao; Colling, D; Dumitrescu, C; Fagan, D; Fassi, F; Fisk, I; Flix, J; Giacchetti, L; Gomez-Ceballos, G; Gowdy, S; Grandi, C; Gutsche, O; Hahn, K; Holzman, B; Jackson, J; Kreuzer, P; Kuo, C M; Mason, D; Pukhaeva, N; Qin, G; Quast, G; Rossman, P; Sartirana, A; Scheurer, A; Schott, G; Shih, J; Tader, P; Thompson, R; Tiradani, A; Trunov, A

    2010-01-01

    The CMS Collaboration relies on 7 globally distributed Tier-1 computing centres located at large universities and national laboratories for a second custodial copy of the CMS RAW data and primary copy of the simulated data, data serving capacity to Tier-2 centres for analysis, and the bulk of the reprocessing and event selection capacity in the experiment. The Tier-1 sites have a challenging role in CMS because they are expected to ingest and archive data from both CERN and regional Tier-2 centres, while they export data to a global mesh of Tier-2s at rates comparable to the raw export data rate from CERN. The combined capacity of the Tier-1 centres is more than twice the resources located at CERN and efficiently utilizing these large distributed resources represents a challenge. In this article we will discuss the experience building, operating, and utilizing the CMS Tier-1 computing centres. We will summarize the facility challenges at the Tier-1s including the stable operations of CMS services, the ability ...

  20. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    Science.gov (United States)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network-available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  1. Patient's anxiety and fear of anesthesia: effect of gender, age, education, and previous experience of anesthesia. A survey of 400 patients.

    Science.gov (United States)

    Mavridou, Paraskevi; Dimitriou, Varvara; Manataki, Adamantia; Arnaoutoglou, Elena; Papadopoulos, Georgios

    2013-02-01

    Patients express high anxiety preoperatively, because of fears related to anesthesia and its implications. The purpose of this survey was to gain insight into these fears and to study whether they are affected by patients' sex, age, education, or previous experience of anesthesia. Questionnaires with fixed questions were distributed to consenting, consecutive surgical patients before the pre-anesthetic visit. The questionnaires included patients' demographics and questions related to their fears about anesthesia. Four-hundred questionnaires were collected and analyzed. Eighty-one percent of patients experience preoperative anxiety. The main sources of their anxiety were fear of postoperative pain (84 %), of not waking up after surgery (64.8 %), of being nauseous or vomiting (60.2 %), and of drains and needles (59.5 %). Patients are less concerned about being paralyzed because of anesthesia (33.5 %) or of revealing personal issues (18.8 %). Gender seems to affect patients fears, with women being more afraid (85.3 vs. 75.6 % of men, p = 0.014). The effects of patients' age, level of education, and previous experience of anesthesia are minor, except for individual questions. Sixty-three percent of our patients (mostly women 67.4 vs. 57.4 % of men, p = 0.039) talk about these fears with their relatives, although a vast majority of 95.5 % would prefer to talk with the anesthesiologist and be reassured by him. All patients, mostly women, express fears about anesthesia; this fear leads to preoperative anxiety. Slight differences are observed for some individual questions among patients of different sex, education level, and previous experience of anesthesia.

  2. Tactile Radar: experimenting a computer game with visually disabled.

    Science.gov (United States)

    Kastrup, Virgínia; Cassinelli, Alvaro; Quérette, Paulo; Bergstrom, Niklas; Sampaio, Eliana

    2017-09-18

    Visually disabled people increasingly use computers in everyday life, thanks to novel assistive technologies better tailored to their cognitive functioning. Like sighted people, many are interested in computer games - videogames and audio-games. Tactile games are beginning to emerge. The Tactile Radar is a device through which a visually disabled person is able to detect distal obstacles. In this study, it is connected to a computer running a tactile game. The game consists in finding and collecting randomly arranged coins in a virtual room. The study was conducted with nine congenitally blind people of both sexes, aged 20-64 years. Complementary first- and third-person methods were used: the debriefing interview and the quasi-experimental design. The results indicate that the Tactile Radar is suitable for the creation of computer games specifically tailored for visually disabled people. Furthermore, the device seems capable of eliciting a powerful immersive experience. Methodologically speaking, this research contributes to the consolidation and development of complementary first- and third-person methods, particularly useful in the disability research field, including users' evaluation of the Tactile Radar's effectiveness in a virtual reality context. Implications for Rehabilitation: Despite the growing interest in virtual games for visually disabled people, they still face barriers to accessing such games. Through the development of assistive technologies such as the Tactile Radar, applied in virtual games, we can create new opportunities for leisure, socialization and education for visually disabled people. The results of our study indicate that the Tactile Radar is suited to the creation of video games for visually disabled people, providing a playful interaction with the players.

  3. A benchmark on computational simulation of a CT fracture experiment

    International Nuclear Information System (INIS)

    Franco, C.; Brochard, J.; Ignaccolo, S.; Eripret, C.

    1992-01-01

    For a better understanding of the fracture behavior of cracked welds in piping, FRAMATOME, EDF and CEA have launched an important analytical research program. This program is mainly based on the analysis of the effects of the geometrical parameters (the crack size and the welded joint dimensions) and the yield strength ratio on the fracture behavior of several cracked configurations. Two approaches have been selected for the fracture analyses: on one hand, the global approach based on the concept of the crack driving force J, and on the other hand, a local approach to ductile fracture. In this approach, crack initiation and growth are modeled by the nucleation, growth and coalescence of cavities in front of the crack tip. The model selected in this study estimates only the growth of the cavities, using the Rice and Tracey relationship. The present study deals with a benchmark on computational simulation of CT fracture experiments using three computer codes: ALIBABA, developed by EDF; the CEA's code CASTEM 2000; and FRAMATOME's code SYSTUS. The paper is split into three parts. First, the authors present the experimental procedure for high-temperature toughness testing of two CT specimens taken from a welded pipe characteristic of pressurized water reactor primary piping. Second, considerations are outlined about the finite element analysis and the application procedure. A detailed description is given of boundary and loading conditions, of the mesh characteristics, of the numerical scheme involved, and of the void growth computation. Finally, the comparisons between numerical and experimental results are presented up to crack initiation, the tearing process not being taken into account in the present study. The variations of J and of the local variables used to estimate the damage around the crack tip (triaxiality and hydrostatic stresses, plastic deformations, void growth ...) are computed as a function of the increasing load
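
    The Rice and Tracey relationship used above gives the void-growth rate as a function of stress triaxiality, d(ln R)/d(eps_p) = 0.283 exp(1.5 sigma_m/sigma_eq). A small Python sketch integrating it over an assumed plastic-strain history; the strain range and triaxiality value are placeholders, not the benchmark's data.

    import numpy as np

    # Rice-Tracey void growth:
    #   d(ln R)/d(eps_p) = 0.283 * exp(1.5 * sigma_m / sigma_eq),
    # integrated over the equivalent plastic strain history near the crack tip.
    def void_growth(eps_p, triaxiality):
        """Return R/R0 for given plastic-strain and triaxiality histories."""
        rate = 0.283 * np.exp(1.5 * np.asarray(triaxiality))
        return np.exp(np.trapz(rate, eps_p))

    eps_p = np.linspace(0.0, 0.2, 50)       # equivalent plastic strain history
    triax = np.full_like(eps_p, 2.0)        # sigma_m/sigma_eq ahead of a crack
    print(f"R/R0 = {void_growth(eps_p, triax):.2f}")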

  4. An Atomic Abacus: Trapped ion quantum computing experiments at NIST

    Science.gov (United States)

    Demarco, Brian

    2003-03-01

    Trapped atomic ions are an ideal system for exploring quantum information science because deterministic state preparation and efficient state detection are possible and coherent manipulation of atomic systems is relatively advanced. In our experiment, a few singly charged Be ions are confined by static and radio-frequency electric fields in a micro-machined linear Paul trap. The internal and motional states of the ions are coherently manipulated using applied laser light. Our current work focuses on demonstrating the necessary ingredients to produce a scalable quantum computing scheme and on simplifying and improving quantum logic gates. I will speak about a new set of experiments that was made possible by recent improvements in trap technology. A novel trap with multiple trapping regions was used to demonstrate the first steps towards a fully scalable quantum computing scheme. Single ions were "shuttled" between trapping regions without disturbing the ion's motional and internal state, and two ions were separated from a single trapping zone into two different ones. Improvements in the trap manufacturing process have led to a reduction of nearly two orders of magnitude in the ion's motional heating rate, making possible two new improved logic gates. The first gate utilizes the wave-packet nature of the ions to tune the laser-atom interaction and achieve a controlled-NOT gate between a single ion's spin and motional states. The second, a two-ion phase gate, uses phase-space dynamics to produce a state-sensitive geometric phase. I will end with a quick look at experiments using a Mg ion to sympathetically cool a simultaneously trapped Be ion and a glimpse of the next generation of ion traps currently under construction.

  5. Explaining infant feeding: The role of previous personal and vicarious experience on attitudes, subjective norms, self-efficacy, and breastfeeding outcomes.

    Science.gov (United States)

    Bartle, Naomi C; Harvey, Kate

    2017-11-01

    Breastfeeding confers important health benefits to both infants and their mothers, but rates are low in the United Kingdom and other developed countries despite widespread promotion. This study examined the relationships between personal and vicarious experience of infant feeding, self-efficacy, the theory of planned behaviour variables of attitudes and subjective norm, and the likelihood of breastfeeding at 6-8 weeks post-natally. A prospective questionnaire study of both first-time mothers (n = 77) and experienced breastfeeders (n = 72) recruited at an antenatal clinic in South East England. Participants completed a questionnaire at 32 weeks pregnant assessing personal and vicarious experience of infant feeding (breastfeeding, formula-feeding, and maternal grandmother's experience of breastfeeding), perceived control, self-efficacy, intentions, attitudes (to breastfeeding and formula-feeding), and subjective norm. Infant feeding behaviour was recorded at 6-8 weeks post-natally. Multiple linear regression modelled the influence of vicarious experience on attitudes, subjective norm, and self-efficacy (but not perceived control) and modelled the influence of attitude, subjective norm, self-efficacy, and past experience on intentions to breastfeed. Logistic regression modelled the likelihood of breastfeeding at 6-8 weeks. Previous experience (particularly personal experience of breastfeeding) explained a significant amount of variance in attitudes, subjective norm, and self-efficacy. Intentions to breastfeed were predicted by subjective norm and attitude to formula-feeding and, in experienced mothers, self-efficacy. Breastfeeding at 6 weeks was predicted by intentions and vicarious experience of formula-feeding. Vicarious experience, particularly of formula-feeding, has been shown to influence the behaviour of first-time and experienced mothers both directly and indirectly via attitudes and subjective norm. Interventions that reduce exposure to formula

  6. Interdisciplinary Team-Teaching Experience for a Computer and Nuclear Energy Course for Electrical and Computer Engineering Students

    Science.gov (United States)

    Kim, Charles; Jackson, Deborah; Keiller, Peter

    2016-01-01

    A new, interdisciplinary, team-taught course has been designed to educate students in Electrical and Computer Engineering (ECE) so that they can respond to global and urgent issues concerning computer control systems in nuclear power plants. This paper discusses our experience and assessment of the interdisciplinary computer and nuclear energy…

  7. A Rural South African Experience of an ESL Computer Program

    Directory of Open Access Journals (Sweden)

    Marius Dieperink

    2008-12-01

    Full Text Available This article reports on a case study that explored the effect of an English-as-a-Second-Language (ESL) computer program at Tshwane University of Technology (TUT), South Africa. The case study explored participants' perceptions, attitudes and beliefs regarding the ESL reading enhancement program, Reading Excellence™. The study found that participants experienced the program in a positive light. They experienced improved ESL reading as well as listening and writing proficiency. In addition, they experienced improved affective well-being in the sense that they generally felt more comfortable using ESL. This included feeling more self-confident in their experience of their academic environment. Interviews as well as document review resulted in dissonance, however: data pointed towards poor class attendance as well as a perturbing lack of progress in terms of reading comprehension and speed.

  8. Test experience on an ultrareliable computer communication network

    Science.gov (United States)

    Abbott, L. W.

    1984-01-01

    The dispersed sensor processing mesh (DSPM) is an experimental, ultra-reliable, fault-tolerant computer communications network that exhibits an organic-like ability to regenerate itself after suffering damage. The regeneration is accomplished by two routines - grow and repair. This paper discusses the DSPM concept for achieving fault tolerance and provides a brief description of the mechanization of both the experiment and the six-node experimental network. The main topic of this paper is the system performance of the growth algorithm contained in the grow routine. The characteristics imparted to DSPM by the growth algorithm are also discussed. Data from an experimental DSPM network and software simulation of larger DSPM-type networks are used to examine the inherent limitation on growth time imposed by the growth algorithm and the relationship of growth time to network size and topology.

  9. Experiences using DAKOTA stochastic expansion methods in computational simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan; Ruthruff, Joseph R.

    2012-01-01

    Uncertainty quantification (UQ) methods bring rigorous statistical connections to the analysis of computational and experimental data, and provide a basis for probabilistically assessing margins associated with safety and reliability. The DAKOTA toolkit developed at Sandia National Laboratories implements a number of UQ methods, which are being increasingly adopted by modeling and simulation teams to facilitate these analyses. This report disseminates results regarding the performance of DAKOTA's stochastic expansion methods for UQ on a representative application. Our results provide a number of insights that may be of interest to future users of these methods, including the behavior of the methods in estimating responses at varying probability levels, and the expansion levels for the methodologies that may be needed to achieve convergence.
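
    As a glimpse of the machinery underlying stochastic expansion methods, the sketch below estimates the mean and variance of a response with a standard normal input using Gauss-Hermite quadrature in numpy. It illustrates the general technique only; it is not DAKOTA's API, and the response function is invented.

    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss

    # Quadrature machinery behind stochastic expansions: estimate the mean and
    # variance of a response f(X), X ~ N(0, 1), with Gauss-Hermite(e) rules.
    def response(x):
        return np.exp(0.3 * x) + 0.1 * x**2     # hypothetical model response

    nodes, weights = hermegauss(12)          # weight function exp(-x^2/2)
    weights = weights / np.sqrt(2 * np.pi)   # normalize to the N(0, 1) density

    mean = np.sum(weights * response(nodes))
    variance = np.sum(weights * response(nodes) ** 2) - mean**2
    print(f"mean = {mean:.4f}   variance = {variance:.4f}")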

  10. Alkali Rydberg states in electromagnetic fields: computational physics meets experiment

    International Nuclear Information System (INIS)

    Krug, A.

    2001-11-01

    We study highly excited hydrogen and alkali atoms ('Rydberg states') under the influence of a strong microwave field. As the external frequency is comparable to the highly excited electron's classical Kepler frequency, the external field induces a strong coupling of many different quantum mechanical energy levels and finally leads to the ionization of the outer electron. While periodically driven atomic hydrogen can be seen as a paradigm of quantum chaotic motion in an open (decaying) quantum system, the presence of the non-hydrogenic atomic core - which unavoidably has to be treated quantum mechanically - entails some complications. Indeed, laboratory experiments show clear differences in the ionization dynamics of microwave driven hydrogen and non-hydrogenic Rydberg states. In the first part of this thesis, a machinery is developed that allows for numerical experiments on alkali and hydrogen atoms under precisely identical laboratory conditions. Due to the high density of states in the parameter regime typically explored in laboratory experiments, such simulations are only possible with the most advanced parallel computing facilities, in combination with an efficient parallel implementation of the numerical approach. The second part of the thesis is devoted to the results of the numerical experiment. We identify and describe significant differences and surprising similarities in the ionization dynamics of atomic hydrogen as compared to alkali atoms, and give account of the relevant frequency scales that distinguish hydrogenic from non-hydrogenic ionization behavior. Our results necessitate a reinterpretation of the experimental results so far available, and solve the puzzle of a distinct ionization behavior of periodically driven hydrogen and non-hydrogenic Rydberg atoms - an unresolved question for about one decade. Finally, microwave-driven Rydberg states will be considered as prototypes of open, complex quantum systems that exhibit a complicated temporal decay

  11. Software and experience with managing workflows for the computing operation of the CMS experiment

    Science.gov (United States)

    Vlimant, Jean-Roch; CMS Collaboration

    2017-10-01

    We present a system deployed in the summer of 2015 for the automatic assignment of production and reprocessing workflows for simulation and detector data in the frame of the Computing Operation of the CMS experiment at the CERN LHC. Processing requests involves a number of steps in the daily operation, including transferring input datasets where relevant and monitoring them, assigning work to computing resources available on the CMS grid, and delivering the output to the Physics groups. Automation is critical above a certain number of requests, especially with a view to using computing resources more efficiently and reducing latency. An effort to automate the necessary steps for production and reprocessing recently started and a new system to handle workflows has been developed. The state-machine system described consists of a set of modules whose key feature is the automatic placement of input datasets, balancing the load across multiple sites. By reducing the operational overhead, these agents enable the utilization of more than double the amount of resources with a robust storage system. Additional functionality was added after months of successful operation to further balance the load on the computing system using remote reads and additional resources. This system contributed to reducing the delivery time of datasets, a crucial aspect of the analysis of CMS data. We report on lessons learned from operation towards increased efficiency in using a largely heterogeneous distributed system of computing, storage and network elements.
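
    A schematic Python sketch of such a state-machine workflow handler, with invented states, site loads, and placement logic (least-loaded site), purely to illustrate the shape of the automation loop described above:

    from enum import Enum, auto

    # Schematic state-machine workflow handler; the states, site loads, and
    # placement rule are illustrative only, not the CMS system's actual logic.
    class State(Enum):
        NEW = auto()
        STAGED = auto()
        ASSIGNED = auto()
        RUNNING = auto()
        DONE = auto()

    def advance(workflow, sites):
        """One pass of the automation loop: move the workflow one state forward."""
        if workflow["state"] is State.NEW:
            workflow["site"] = min(sites, key=sites.get)   # least-loaded site
            workflow["state"] = State.STAGED               # input dataset placed
        elif workflow["state"] is State.STAGED:
            sites[workflow["site"]] += workflow["jobs"]    # work assigned
            workflow["state"] = State.ASSIGNED
        elif workflow["state"] is State.ASSIGNED:
            workflow["state"] = State.RUNNING
        elif workflow["state"] is State.RUNNING:
            sites[workflow["site"]] -= workflow["jobs"]    # output delivered
            workflow["state"] = State.DONE
        return workflow["state"]

    sites = {"T1_US": 120, "T1_DE": 80, "T1_FR": 95}       # current load (jobs)
    wf = {"state": State.NEW, "jobs": 50}
    while advance(wf, sites) is not State.DONE:
        print(wf["state"].name, sites)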

  12. Early experiences of computer-aided assessment and administration when teaching computer programming

    Directory of Open Access Journals (Sweden)

    Abdullah Mohd Zin

    1993-12-01

    Full Text Available This paper describes early experiences with the Ceilidh system currently being piloted at over 30 institutions of higher education. Ceilidh is a course-management system for teaching computer programming whose core is an auto-assessment facility. This facility automatically marks students' programs from a range of perspectives, and may be used in an iterative manner, enabling students to work towards a target level of attainment. Ceilidh also includes extensive course-administration and progress-monitoring facilities, as well as support for other forms of assessment including short-answer marking and the collation of essays for later hand-marking. The paper discusses the motivation for developing Ceilidh, outlines its major facilities, then summarizes experiences of developing and actually using it at the coal-face over three years of teaching.

  13. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    International Nuclear Information System (INIS)

    Grieger, Khara D.; Laurent, Alexis; Miseljic, Mirko; Christensen, Frans; Baun, Anders; Olsen, Stig I.

    2012-01-01

While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research and specific guidance on how to practically apply these methods are still very much under development. This paper evaluates how research efforts have applied LCA and RA together for NM, particularly reflecting on previous experiences with applying these methods to chemicals. Through a literature review and a separate analysis of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key “lessons learned” from previous experience with chemicals, while many key challenges remain for practically applying these methods to NM. We identified two main approaches for using these methods together for NM: “LC-based RA” (traditional RA applied in a life-cycle perspective) and “RA-complemented LCA” (conventional LCA supplemented by RA in specific life-cycle steps). The latter is the only identified approach to date which genuinely combines LC- and RA-based methods for NM-risk research, as the former is rather a continuation of standard RA according to established assessment procedures (e.g., REACH). Both these approaches, along with recommendations for using LCA and RA together for NM, are similar to those made previously for chemicals, and thus there does not appear to be much NM-specific progress. We have identified one issue in particular that may be specific for NM when applying LCA and RA at this time: the need to establish proper dose metrics within both methods.

  14. Gravitational Acceleration Effects on Macrosegregation: Experiment and Computational Modeling

    Science.gov (United States)

    Leon-Torres, J.; Curreri, P. A.; Stefanescu, D. M.; Sen, S.

    1999-01-01

Experiments were performed under terrestrial gravity (1g) and during parabolic flights (10-2 g) to study the solidification and macrosegregation patterns of Al-Cu alloys. Alloys having 2% and 5% Cu were solidified against a chill at two different cooling rates. Microscopic and electron microprobe characterization was used to produce microstructural and macrosegregation maps. In all cases positive segregation occurred next to the chill because of shrinkage flow, as expected. This positive segregation was higher in the low-g samples, apparently because of the higher heat transfer coefficient. A 2-D computational model was used to explain the experimental results. The continuum formulation was employed to describe the macroscopic transport of mass, energy, and momentum associated with the solidification phenomena for a two-phase system. The model considers that liquid flow is driven by thermal and solutal buoyancy, and by solidification shrinkage. The solidification event was divided into two stages. In the first one, the liquid containing freely moving equiaxed grains was described through the relative viscosity concept. In the second stage, when a fixed dendritic network was formed after dendritic coherency, the mushy zone was treated as a porous medium. The macrosegregation maps and the cooling curves obtained during experiments were used for validation of the solidification and segregation model. The model can explain the solidification and macrosegregation patterns and the differences between low- and high-gravity results.
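For reference, a representative momentum balance from the standard continuum-mixture literature (a sketch of the kind of equation such models solve, not necessarily the exact formulation used in this work) reads

\[
\frac{\partial(\rho\mathbf{u})}{\partial t} + \nabla\cdot(\rho\mathbf{u}\mathbf{u})
= -\nabla p + \nabla\cdot(\mu\nabla\mathbf{u})
- \frac{\mu}{K}\,(\mathbf{u}-\mathbf{u}_s)
+ \rho\,\mathbf{g}\left[\beta_T\,(T-T_{\mathrm{ref}}) + \beta_c\,(c_\ell-c_{\mathrm{ref}})\right],
\]

where the Darcy term \(-(\mu/K)(\mathbf{u}-\mathbf{u}_s)\) becomes active once a coherent dendritic network makes the mushy zone behave as a porous medium of permeability \(K\), and the buoyancy term combines thermal and solutal contributions through the expansion coefficients \(\beta_T\) and \(\beta_c\).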

  15. Computationally mediated experiments: the next frontier in microscopy

    International Nuclear Information System (INIS)

    Zaluzec, N.J.

    2002-01-01

Full text: It's reasonably safe to say that most of the simple experimental techniques that can be employed in microscopy have been well documented and exploited over the last 20 years. Thus, if we are interested in extending the range and diversity of problems that we will be dealing with in the next decade, then we will have to take up challenges which heretofore were considered beyond the realm of routine work. Given the ever-growing tendency to add computational resources to our instruments, it is clear that the next breakthrough will be directly tied to how well we can effectively tie these two realms together. In the past we have used computers simply to speed up our experiments, but in the upcoming decade the key will be to realize that once an effective interface of instrumentation and computational tools is developed, we must change the way in which we design our experiments. This means re-examining how we do experiments so that measurements are done not just quickly but precisely, and so that the information measured is maximized and the data therein can be 'mined' for content which might have been missed in the past. As an example of this, consider the experimental technique of Position Resolved Diffraction, which is currently being developed for the study of nanoscale magnetic structures using ANL's Advanced Analytical Electron Microscope. Here a focused electron probe is sequentially scanned across a two-dimensional field of view of a thin specimen, and at each point on the specimen a two-dimensional electron diffraction pattern is acquired and stored. Analysis of the spatial variation in the electron diffraction pattern allows a researcher to study the subtle changes resulting from microstructural differences such as ferro- and electro-magnetic domain formation and motion. There is, however, a severe limitation in this technique, namely its need to store and dynamically process large data sets, preferably in near real time. A minimal scoping measurement would involve
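As a rough illustration of the data-processing burden (a toy numpy sketch on synthetic stand-in data, not ANL's acquisition software), a position-resolved diffraction run is a 4D array that must be reduced to 2D maps:

```python
import numpy as np

# Toy sketch: a position-resolved diffraction run as a 4D array
# (scan_y, scan_x, det_y, det_x), reduced to 2D maps of total intensity
# and pattern center of mass per probe position.
rng = np.random.default_rng(0)
data = rng.poisson(5.0, size=(32, 32, 64, 64)).astype(np.float32)  # synthetic stand-in

total = data.sum(axis=(2, 3))                    # total counts per probe position
cy, cx = np.indices(data.shape[2:])
w = data / data.sum(axis=(2, 3), keepdims=True)  # normalized diffraction patterns
com_y = (w * cy).sum(axis=(2, 3))                # center-of-mass maps: pattern shifts
com_x = (w * cx).sum(axis=(2, 3))                # can reveal domain-like contrast

print(total.shape, com_y.mean(), com_x.mean())
```

Even this toy dataset holds several hundred megabytes at realistic detector sizes, which is exactly the storage and near-real-time processing pressure the abstract describes.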

  16. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  17. Multi-fidelity Gaussian process regression for computer experiments

    International Nuclear Information System (INIS)

    Le-Gratiet, Loic

    2013-01-01

This work is on Gaussian-process based approximation of a code which can be run at different levels of accuracy. The goal is to improve the predictions of a surrogate model of a complex computer code using fast approximations of it. A new formulation of a co-kriging based method has been proposed. In particular this formulation allows for fast implementation and for closed-form expressions for the predictive mean and variance for universal co-kriging in the multi-fidelity framework, which is a breakthrough as it allows for the practical application of such methods in real cases. Furthermore, fast cross validation, sequential experimental design and sensitivity analysis methods have been extended to the multi-fidelity co-kriging framework. This thesis also deals with a conjecture about the dependence of the learning curve (i.e. the decay rate of the mean square error) on the smoothness of the underlying function. A proof in a fairly general situation (which includes the classical models of Gaussian-process based meta-models with stationary covariance functions) has been obtained, while the previous proofs hold only for degenerate kernels (i.e. when the process is in fact finite-dimensional). This result allows for addressing rigorously practical questions such as the optimal allocation of the budget between different levels of codes in the multi-fidelity framework. (author) [fr]
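A minimal two-level sketch of the co-kriging idea (illustrative Python with scikit-learn; the test functions, kernels and the least-squares estimate of the scale factor are invented simplifications, not the thesis' closed-form universal co-kriging):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Two-level multi-fidelity surrogate in the AR(1) spirit:
# y_hi(x) ~ rho * y_lo(x) + delta(x).
def f_lo(x):  # cheap, biased approximation of the expensive code
    return 0.5 * np.sin(8 * x) + 0.3 * x

def f_hi(x):  # expensive, accurate code
    return np.sin(8 * x) + x

X_lo = np.linspace(0, 1, 25)[:, None]   # many cheap runs
X_hi = np.linspace(0, 1, 6)[:, None]    # few expensive runs

gp_lo = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X_lo, f_lo(X_lo.ravel()))
y_lo_at_hi = gp_lo.predict(X_hi)

# Estimate the scale factor rho by least squares, then model the residual delta(x).
rho = np.linalg.lstsq(y_lo_at_hi[:, None], f_hi(X_hi.ravel()), rcond=None)[0][0]
gp_delta = GaussianProcessRegressor(kernel=RBF(0.3)).fit(
    X_hi, f_hi(X_hi.ravel()) - rho * y_lo_at_hi)

X_test = np.linspace(0, 1, 5)[:, None]
pred = rho * gp_lo.predict(X_test) + gp_delta.predict(X_test)
print(np.c_[pred, f_hi(X_test.ravel())])   # multi-fidelity prediction vs truth
```

The point of the construction is that the expensive code is only run six times; the cheap code carries the shape of the response, and the GP on the residual corrects the bias.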

  18. Designing with an underdeveloped computational composite for materials experience

    NARCIS (Netherlands)

    Barati, B.; Karana, E.; Hekkert, P.P.M.; Jönsthövel, I.

    2015-01-01

    In response to the urge for multidisciplinary development of computational composites, designers and material scientists are increasingly involved in collaborative projects to valorize these technology-push materials in the early stages of their development. To further develop the computational

  19. Interactive Quantum Mechanics Quantum Experiments on the Computer

    CERN Document Server

    Brandt, S; Dahmen, H.D

    2011-01-01

Extra materials available on extras.springer.com. INTERACTIVE QUANTUM MECHANICS allows students to perform their own quantum-physics experiments on their computer, in vivid 3D color graphics. Topics covered include: harmonic waves and wave packets; free particles as well as bound states and scattering in various potentials in one and three dimensions (both stationary and time dependent); two-particle systems and coupled harmonic oscillators; distinguishable and indistinguishable particles; coherent and squeezed states in time-dependent motion; quantized angular momentum; spin and magnetic resonance; hybridization. For the present edition the physics scope has been widened appreciably. Moreover, INTERQUANTA can now produce user-defined movies of quantum-mechanical situations. Movies can be viewed directly and also be saved to be shown later in any browser. Sections on spec...

  20. Parallel Fully-Implicit Computation of Magnetohydrodynamics Acceleration Experiments

    Science.gov (United States)

    Wan, Tian; Candler, Graham

    2010-05-01

A three-dimensional MHD solver is described in the paper. The solver simulates reacting flows with nonequilibrium between translational-rotational, vibrational and electron translational modes. The conservation equations are discretized with implicit time marching and the second-order modified Steger-Warming scheme, and the resulting linear system is solved iteratively with the Newton-Krylov-Schwarz method implemented in the PETSc package. The results of convergence tests are plotted, showing good scalability and convergence roughly twice as fast as the DPLR method. Five test runs are then conducted, simulating the experiments done at the NASA Ames MHD channel, and the calculated pressures, temperatures, electrical conductivity, back EMF, load factors and flow accelerations are shown to agree with the experimental data. Our computation shows that the electrical conductivity distribution is not uniform in the powered section of the MHD channel, and that it is important to include Joule heating in order to calculate the correct conductivity and the MHD acceleration.
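As a toy illustration of the implicit Newton-Krylov approach (SciPy's `newton_krylov` on a small 1D nonlinear boundary-value problem; the actual solver couples PETSc's Newton-Krylov-Schwarz machinery to the full 3D MHD system):

```python
import numpy as np
from scipy.optimize import newton_krylov

# Solve the discretized nonlinear boundary-value problem
#   u'' = 5 * exp(u),  u(0) = 0,  u(1) = 1,
# by driving the residual F(u) to zero with a Jacobian-free Newton-Krylov method.
def residual(u):
    h = 1.0 / (len(u) + 1)
    up = np.r_[0.0, u, 1.0]               # append Dirichlet boundary values
    return (up[:-2] - 2 * up[1:-1] + up[2:]) / h**2 - 5.0 * np.exp(u)

u0 = np.linspace(0.0, 1.0, 50)            # initial guess on the interior grid
sol = newton_krylov(residual, u0, method="lgmres")
print("max |residual|:", np.abs(residual(sol)).max())
```

The appeal of this family of methods, at any scale, is that only residual evaluations are needed: the Jacobian is never formed explicitly, and the inner Krylov solve (here LGMRES) does the linear work at each Newton step.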

  1. Quantum computation and quantum communication theory and experiments

    CERN Document Server

    Pavicic, Mladen

    2005-01-01

The field of quantum computing has experienced rapid development and many different experimental and theoretical groups have emerged worldwide. This book presents the key elements of quantum computation and communication theories and their implementation in an easy-to-read manner for readers coming from physics, mathematics and computer science backgrounds. Integrating both theoretical aspects and experimental verifications of developing quantum computers, the author explains why particular mathematical methods, physical models and realistic implementations might provide critical steps towards achieving the final goal - constructing quantum computers and quantum networks. The book serves as an excellent introduction for new researchers and also provides a useful review for specialists in the field.

  2. Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments.

    OpenAIRE

    Dewiyanti, Silvia; Brand-Gruwel, Saskia; Jochems, Wim; Broers, Nick

    2008-01-01

    Dewiyanti, S., Brand-Gruwel, S., Jochems, W., & Broers, N. (2007). Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments. Computers in Human Behavior, 23, 496-514.

  3. On-Line Digital Computer Applications in Gas Chromatography, An Undergraduate Analytical Experiment

    Science.gov (United States)

    Perone, S. P.; Eagleston, J. F.

    1971-01-01

    Presented are some descriptive background materials and the directions for an experiment which provides an introduction to on-line computer instrumentation. Assumes students are familiar with the Purdue Real-Time Basic (PRTB) laboratory computer system. (PR)

  4. Developments of the data reduction system for the nuclear experiments with the micro computer

    International Nuclear Information System (INIS)

    Okihana, Akira; Hata, Takahiro; Irie, Hiromu; Umeda, Toshiya.

    1984-01-01

A data reduction system based on a microcomputer was studied and applied to nuclear experiments. After data are taken with the multichannel analyzer, they are transmitted to the microcomputer and analyzed with it. The method of data transmission from the microcomputer to the MELCOM-700 computer is also reported. This program system is a powerful tool for nuclear experiments, and it appears applicable to other spectroscopic studies as well. (author)

  5. An Experiment Support Computer for Externally-Based ISS Payloads

    Science.gov (United States)

    Sell, S. W.; Chen, S. E.

    2002-01-01

    The Experiment Support Facility - External (ESF-X) is a computer designed for general experiment use aboard the International Space Station (ISS) Truss Site locations. The ESF-X design is highly modular and uses commercial off-the-shelf (COTS) components wherever possible to allow for maximum reconfigurability to meet the needs of almost any payload. The ESF-X design has been developed with the EXPRESS Pallet as the target location and the University of Colorado's Micron Accuracy Deployment Experiment (MADE) as the anticipated first payload and capability driver. Thus the design presented here is configured for structural dynamics and control as well as optics experiments. The ESF-X is a small (58.4 x 48.3 x 17.8") steel and copper enclosure which houses a 14 slot VME card chassis and power supply. All power and data connections are made through a single panel on the enclosure so that only one side of the enclosure must be accessed for nominal operation and servicing activities. This feature also allows convenient access during integration and checkout activities. Because it utilizes a standard VME backplane, ESF-X can make use of the many commercial boards already in production for this standard. Since the VME standard is also heavily used in industrial and military applications, many ruggedized components are readily available. The baseline design includes commercial processors, Ethernet, MIL-STD-1553, and mass storage devices. The main processor board contains four TI 6701 DSPs with a PowerPC based controller. Other standard functions, such as analog-to-digital, digital-to-analog, motor driver, temperature readings, etc., are handled on industry-standard IP modules. Carrier cards, which hold 4 IP modules each, are placed in slots in the VME backplane. A unique, custom IP carrier board with radiation event detectors allows non RAD-hard components to be used in an extended exposure environment. Thermal control is maintained by conductive cooling through the copper

  6. Experiences with Efficient Methodologies for Teaching Computer Programming to Geoscientists

    Science.gov (United States)

    Jacobs, Christian T.; Gorman, Gerard J.; Rees, Huw E.; Craig, Lorraine E.

    2016-01-01

    Computer programming was once thought of as a skill required only by professional software developers. But today, given the ubiquitous nature of computation and data science it is quickly becoming necessary for all scientists and engineers to have at least a basic knowledge of how to program. Teaching how to program, particularly to those students…

  7. Using Real-Life Experiences to Teach Computer Concepts

    Science.gov (United States)

    Read, Alexis

    2012-01-01

    Teaching computer concepts to individuals with visual impairments (that is, those who are blind or visually impaired) presents some unique challenges. Students often have difficulty remembering to perform certain steps or have difficulty remembering specific keystrokes when using computers. Many cannot visualize the way in which complex computing…

  8. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapeutic treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on the use of Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), has been used to perform a complete MC simulation to compute dose distributions on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC-based codes, we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (percentage depth dose) and in transversal sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.
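The PDD comparison itself reduces to normalizing each depth-dose curve to its maximum and checking pointwise agreement; a minimal sketch with invented stand-in curves (the analytic forms and the grid are assumptions, not the paper's data):

```python
import numpy as np

# Normalize measured and Monte Carlo depth-dose curves to percentage depth dose
# (PDD) and check pointwise agreement within +/-2% over a stated depth range.
depth = np.arange(0.0, 200.0, 5.0)                             # mm, hypothetical grid
d_meas = np.exp(-depth / 150.0) * (1 - np.exp(-depth / 8.0))   # stand-in measured curve
d_mc = d_meas * (1 + 0.01 * np.sin(depth / 20.0))              # stand-in MC curve

pdd_meas = 100 * d_meas / d_meas.max()
pdd_mc = 100 * d_mc / d_mc.max()

mask = (depth >= 5) & (depth <= 130)                           # range quoted above
diff = pdd_mc[mask] - pdd_meas[mask]
print("agree within +/-2%:", bool(np.all(np.abs(diff) <= 2.0)))
```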

  9. Measures of agreement between computation and experiment:validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
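As a sketch of the confidence-interval idea (hypothetical data; not the authors' exact metric, which additionally handles interpolation or regression of the experimental curve), one can report the estimated model error at each control-variable setting together with an interval reflecting experimental uncertainty:

```python
import numpy as np
from scipy import stats

# Confidence-interval-flavored validation metric: estimate the model-experiment
# difference per setting and attach a t-interval from repeated measurements.
y_model = np.array([10.1, 11.9, 14.2, 16.0, 18.3])        # hypothetical predictions
y_exp = np.array([[9.8, 12.3, 13.6, 16.4, 18.9],          # repeated measurements,
                  [10.4, 12.0, 14.0, 15.8, 19.2],         # one row per replicate
                  [10.0, 12.5, 13.9, 16.1, 18.7]])

d = y_exp.mean(axis=0) - y_model                          # mean difference per setting
n = y_exp.shape[0]
se = y_exp.std(axis=0, ddof=1) / np.sqrt(n)               # std. error of the exp. mean
t = stats.t.ppf(0.95, df=n - 1)                           # 90% two-sided interval

for i, (di, si) in enumerate(zip(d, se)):
    print(f"setting {i}: model error estimate {di:+.2f} +/- {t * si:.2f}")
```

The output is quantitative in exactly the sense the abstract argues for: a signed error estimate whose interval width makes the impact of measurement uncertainty on the accuracy assessment explicit.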

  10. Computers and Schools: Ideas and Experiences Concerning the Use of Computers For Instructional Purposes

    Science.gov (United States)

    Zielinski, Johannes

    1969-01-01

    Stresses the importance of the computer for meeting the need for increased productivity in education, caused by the population and information explosions. Explains three trial uses of the computer in West Germany. (DE)

  11. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
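A minimal sketch of this comparison methodology, transposed from the paper's R/RRegrs setting into Python for illustration (the model list, split scheme and paired test are assumptions, not the package's exact procedure):

```python
import numpy as np
from scipy import stats
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import RepeatedKFold, cross_val_score
from sklearn.svm import SVR

# Evaluate several regression models on identical repeated-CV splits, then test
# whether the observed performance differences are statistically significant.
X, y = make_regression(n_samples=120, n_features=8, noise=10.0, random_state=1)
cv = RepeatedKFold(n_splits=5, n_repeats=4, random_state=1)  # same splits per model

models = {"lm": LinearRegression(),
          "rf": RandomForestRegressor(random_state=1),
          "svm": SVR()}
scores = {name: cross_val_score(m, X, y, cv=cv,
                                scoring="neg_root_mean_squared_error")
          for name, m in models.items()}

# Paired Wilcoxon test between the two best models on matched folds.
ranked = sorted(scores, key=lambda k: scores[k].mean(), reverse=True)
stat, p = stats.wilcoxon(scores[ranked[0]], scores[ranked[1]])
print("best:", ranked[0], "| p-value vs runner-up:", round(p, 4))
```

Fixing the CV object's random seed is what makes the comparison paired: every model sees the same folds, so fold-to-fold variation cancels out of the test.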

  12. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  13. Assessing Pre-Service Teachers' Computer Phobia Levels in Terms of Gender and Experience, Turkish Sample

    Science.gov (United States)

    Ursavas, Omer Faruk; Karal, Hasan

    2009-01-01

This study aims to determine the level of pre-service teachers' computer phobia. Whether computer phobia varies significantly according to gender and computer experience has been tested statistically. The study was performed on 430 pre-service teachers at the Education Faculty in Rize/Turkey. Data in the study were…

  14. Computer Experience of Menominee Indian Students: Gender Differences in Coursework and Use of Software.

    Science.gov (United States)

    Grignon, Jerilyn R.

    1993-01-01

    Among 71 eighth and twelfth graders surveyed in Menominee Indian School District (Wisconsin), eighth-grade males and females had similar school experiences with computers, but twelfth-grade females were significantly less likely than males to enroll in computer-oriented classes or to use computer games and graphics. (SV)

  15. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

According to the requirements of real-time performance, reliability and safety for aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads; for I/O-intensive workloads, traditional physical machines are recommended.

  16. User expectations and experiences of a speech and thought controlled computer game

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Hakvoort, Gido; Poel, Mannes; Nijholt, Antinus; Romão, T.; Correia, N.; Inami, M.; Kato, H.; Prada, R.; Terada, T.; Dias, E.; Chambel, T.

    2011-01-01

Brain-computer interfaces (BCIs) are often evaluated in terms of performance and seldom for usability. However, in some application domains, such as entertainment computing, user experience evaluation is vital. This holds for user experience evaluation in BCI systems, especially in entertainment applications such as games.

  17. Computation of Chemical Shifts for Paramagnetic Molecules: A Laboratory Experiment for the Undergraduate Curriculum

    Science.gov (United States)

    Pritchard, Benjamin P.; Simpson, Scott; Zurek, Eva; Autschbach, Jochen

    2014-01-01

A computational experiment investigating the ¹H and ¹³C nuclear magnetic resonance (NMR) chemical shifts of molecules with unpaired electrons has been developed and implemented. This experiment is appropriate for an upper-level undergraduate laboratory course in computational, physical, or inorganic chemistry. The…

  18. A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)

    Science.gov (United States)

    Rhew, Ray D.; Parker, Peter A.

    2007-01-01

Design of Experiments (DOE) was applied to the LAS geometric parameter study to efficiently identify and rank the primary contributors to integrated drag over the vehicle's ascent trajectory, in an order of magnitude fewer CFD configurations, thereby reducing computational resources and solution time. Subject matter experts (SMEs) were able to gain a better understanding of the underlying flow physics of different geometric parameter configurations through the identification of interaction effects. An interaction effect, which describes how the effect of one factor changes with respect to the levels of other factors, is often the key to product optimization; a worked example is sketched below. A DOE approach emphasizes a sequential approach to learning through successive experimentation to continuously build on previous knowledge. These studies represent a starting point for expanded experimental activities that will eventually cover the entire design space of the vehicle and flight trajectory.
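A toy two-level full factorial (hypothetical Python with an invented drag response, not the study's actual factors) shows how main effects and an interaction are estimated from only eight runs:

```python
import numpy as np
from itertools import product

# Two-level full factorial in three geometric factors (A, B, C), with an
# invented drag response standing in for a CFD run at each configuration.
runs = np.array(list(product([-1, 1], repeat=3)))    # 8 runs, columns A, B, C

def drag(a, b, c):
    # Hypothetical response: strong A effect, an A*B interaction, weak C effect.
    return 10 + 2.0 * a - 1.5 * b + 0.5 * c + 1.2 * a * b

y = np.array([drag(*r) for r in runs])
A, B, C = runs.T

def effect(col):
    # Classical effect estimate: mean response at +1 minus mean at -1
    # (equal to twice the regression coefficient under +/-1 coding).
    return y[col == 1].mean() - y[col == -1].mean()

print("main effects A, B, C:", effect(A), effect(B), effect(C))
print("AB interaction:", effect(A * B))   # how A's effect shifts with B's level
```

The nonzero AB estimate is exactly the situation described above: the drag penalty of factor A depends on the level of factor B, which a one-factor-at-a-time study would miss.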

  19. Investigation of the computer experiences and attitudes of pre-service mathematics teachers: new evidence from Turkey.

    Science.gov (United States)

    Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat

    2010-10-01

This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally at intermediate and upper levels. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.

  20. Perspectives on distributed computing : thirty people, four user types, and the distributed computing user experience.

    Energy Technology Data Exchange (ETDEWEB)

    Childers, L.; Liming, L.; Foster, I.; Mathematics and Computer Science; Univ. of Chicago

    2008-10-15

This report summarizes the methodology and results of a user perspectives study conducted by the Community Driven Improvement of Globus Software (CDIGS) project. The purpose of the study was to document the work-related goals and challenges facing today's scientific technology users, to record their perspectives on Globus software and the distributed-computing ecosystem, and to provide recommendations to the Globus community based on the observations. Globus is a set of open source software components intended to provide a framework for collaborative computational science activities. Rather than attempting to characterize all users or potential users of Globus software, our strategy has been to speak in detail with a small group of individuals in the scientific community whose work appears to be the kind that could benefit from Globus software, learn as much as possible about their work goals and the challenges they face, and describe what we found. The result is a set of statements about specific individuals' experiences. We do not claim that these are representative of a potential user community, but we do claim to have found commonalities and differences among the interviewees that may be reflected in the user community as a whole. We present these as a series of hypotheses that can be tested by subsequent studies, and we offer recommendations to Globus developers based on the assumption that these hypotheses are representative. Specifically, we conducted interviews with thirty technology users in the scientific community. We included both people who have used Globus software and those who have not. We made a point of including individuals who represent a variety of roles in scientific projects, for example, scientists, software developers, engineers, and infrastructure providers. The following material is included in this report: (1) A summary of the reported work-related goals, significant issues, and points of satisfaction with the use of Globus software

  1. Status of the Grid Computing for the ALICE Experiment in the Czech Republic

    International Nuclear Information System (INIS)

    Adamova, D; Hampl, J; Chudoba, J; Kouba, T; Svec, J; Mendez, Lorenzo P; Saiz, P

    2010-01-01

The Czech Republic (CR) has been participating in the LHC Computing Grid project (LCG) since 2003, and gradually a middle-sized Tier-2 center has been built in Prague, delivering computing services for national HEP experiment groups, including the ALICE project at the LHC. We present a brief overview of the computing activities and services being performed in the CR for the ALICE experiment.

  2. Computer control and monitoring of neutral beam injectors on the 2XIIB CTR experiment at LLL

    International Nuclear Information System (INIS)

    Pollock, G.G.

    1975-01-01

The original manual control system for the 12 neutral beam injectors on the 2XIIB machine is being integrated with a computer control system. This, in turn, is part of a multiple-computer network comprising the three computers involved in the operation and instrumentation of the 2XIIB experiment. The computer control system simplifies neutral beam operation and centralizes it at a single operating position. A special-purpose console utilizes computer-generated graphics and interactive function-entry buttons to optimize the human/machine interface. Through the facilities of the computer network, a high-level control function will be implemented for the use of the experimenter in a remotely located experiment diagnostics area. In addition to controlling the injectors in normal operation, the computer system provides automatic conditioning of the injectors, bringing rebuilt units back to full energy output with minimum loss of useful life. The computer system also provides detailed archival data recording.

  3. Progress in hypersonic combustion technology with computation and experiment

    Science.gov (United States)

    Anderson, Griffin Y.; Kumar, Ajay; Erdos, John I.

    1990-01-01

    Design of successful airbreathing engines for operation at near-orbital speeds presents significant challenges in all the disciplines involved, including propulsion. This paper presents a discussion of the important physics of hypersonic combustion and an assessment of the state of the art of ground simulations with pulse facilities and with computational techniques. Recent examples of experimental and computational simulations are presented and discussed. The need for continued application of these tools to establish the credibility and fidelity of engineering design methods for practical hypersonic combustors is emphasized along with the critical need for improved diagnostic methods for hypervelocity reacting flows.

  4. Computing Activities for the PANDA Experiment at FAIR

    NARCIS (Netherlands)

    Messchendorp, Johan; Gruntorad, J; Lokajicek, M

    2010-01-01

The PANDA experiment at the future facility FAIR will provide valuable data for our present understanding of the strong interaction. In preparation for the experiments, large-scale simulations for design and feasibility studies are performed exploiting a new software framework, PandaROOT, which is based on ROOT and the FairRoot framework.

  5. "It's Boring": Female Students' Experience of Studying ICT and Computing

    Science.gov (United States)

    Pau, Reena; Hall, Wendy; Grace, Marcus

    2011-01-01

    The declining number of women in computing is a cause for concern for those in education and the IT industry. A diverse workforce is necessary for there to be a creative balance in the IT industry. The reasons for this decline are varied and can be attributed to factors such as schooling, parental influences and the media. This article focuses on…

  6. Trainee Teachers' e-Learning Experiences of Computer Play

    Science.gov (United States)

    Wright, Pam

    2009-01-01

    Pam Wright highlights the role of technology in providing situated learning opportunities for preservice teachers to explore the role commercial computer games may have in primary education. In a study designed to assess the effectiveness of an online unit on gaming incorporated into a course on learning technologies, Wright found that thoughtful…

  7. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

Describes the style of design of computer simulations developed by the Computer Assisted Teaching Unit at Queen Mary College, with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and the need for hard copy. Procedures and problems relating to academic involvement are…

  8. Experience of public procurement of Open Compute servers

    Science.gov (United States)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

The Open Compute Project, OCP (http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large-scale installation. One objective is to evaluate whether the OCP market is sufficiently mature and broad to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved a Request for Information (RFI) to qualify bidders and a Request for Tender (RFT).

  9. Beyond Technology in Computer Assisted Language Learning: Learners' Experiences

    Science.gov (United States)

    Plana, Mar Gutiérrez-Colon; Ballester, Elisabet Pladevall

    2009-01-01

The present study is based on a previous pilot study (Gutiérrez-Colon, 2008). It aimed at widening the scope of the pilot study by increasing the sample size in terms of number of participants, degree courses and number of universities. This time, four Spanish universities were involved, and the number of participants was 197, who were…

  10. Blast Load Simulator Experiments for Computational Model Validation Report 3

    Science.gov (United States)

    2017-07-01

…establish confidence in the simulation results specific to their intended use. One method for providing experimental data for computational model … walls, to higher blast pressures required to evaluate the performance of protective construction methods. Figure 1: ERDC Blast Load Simulator (BLS) … Instrumentation included 3 pressure gauges mounted on the steel calibration plate, 2 pressure gauges mounted in the wall of the BLS, and 25 pressure gauges

  11. Analysis of Computer Experiments with Multiple Noise Sources

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae

    2010-01-01

In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed effects models and generalized additive models. Copyright (C) 2009 John Wiley & Sons, Ltd.
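A minimal sketch of the linear mixed-effects part of such an analysis (hypothetical data and variable names; statsmodels' MixedLM is one possible implementation, not necessarily the authors' tooling): the controllable factor enters as a fixed effect and the uncontrollable replication-level variation as a random effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated example: waiting time responds to a controllable staffing factor,
# with extra uncontrollable variation shared within each simulation replication.
rng = np.random.default_rng(7)
n_groups, n_per = 20, 10
g = np.repeat(np.arange(n_groups), n_per)
x = rng.choice([0.0, 1.0], size=n_groups * n_per)   # controllable factor level
u = rng.normal(0.0, 2.0, n_groups)[g]               # uncontrollable group effect
y = 30 + 4.0 * x + u + rng.normal(0.0, 1.0, n_groups * n_per)

df = pd.DataFrame({"waiting_time": y, "staffing": x, "rep": g})
fit = smf.mixedlm("waiting_time ~ staffing", df, groups=df["rep"]).fit()
print(fit.summary())   # fixed effect of staffing plus the group variance component
```

The fitted group variance quantifies the uncontrollable noise source separately from the residual, which is the structure of variation the abstract refers to.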

  12. Simulation in computer forensics teaching: the student experience

    OpenAIRE

    Crellin, Jonathan; Adda, Mo; Duke-Williams, Emma; Chandler, Jane

    2011-01-01

The use of simulation in teaching computing is well established, with digital forensic investigation being a subject area where the range of simulation required is both wide and varied, demanding a corresponding breadth of fidelity. Each type of simulation can be complex and expensive to set up, resulting in students having only limited opportunities to participate in and learn from the simulation. For example, students' participation in mock trials in the University mock courtroom or in simulation...

  13. Assessing computer skills in Tanzanian medical students: an elective experience

    Directory of Open Access Journals (Sweden)

    Melvin Rob

    2004-08-01

Full Text Available Abstract. Background: One estimate suggests that by 2010 more than 30% of a physician's time will be spent using information technology tools. The aim of this study is to assess the information and communication technologies (ICT) skills of medical students in Tanzania. We also report a pilot intervention of peer mentoring training in ICT by medical students from the UK tutoring students in Tanzania. Methods: Design: cross-sectional study and pilot intervention study. Participants: fourth-year medical students (n = 92) attending Muhimbili University College of Health Sciences, Dar es Salaam, Tanzania. Main outcome measures: self-reported assessment of competence on ICT-related topics and ability to perform specific ICT tasks; further information related to frequency of computer use (hours per week), years of computer use, reasons for use and access to computers. Skills at specific tasks were reassessed for 12 students following 4 to 6 hours of peer mentoring training. Results: The highest levels of competence in generic ICT areas were for email, Internet and file management. For other skills, such as word processing, most respondents reported low levels of competence. The abilities to perform specific ICT skills were low - less than 60% of the participants were able to perform the core specific skills assessed. A period of approximately 5 hours of peer mentoring training produced an approximate doubling of competence scores for these skills. Conclusion: Our study has found a low level of ability to use ICT facilities among medical students in a leading university in sub-Saharan Africa. A pilot scheme utilising UK elective students to tutor basic skills showed potential. Attention is required to develop interventions that can improve ICT skills, as well as computer access, in order to bridge the digital divide.

  14. Assessing computer skills in Tanzanian medical students: an elective experience.

    Science.gov (United States)

    Samuel, Miriam; Coombes, John C; Miranda, J Jaime; Melvin, Rob; Young, Eoin J W; Azarmina, Pejman

    2004-08-12

    One estimate suggests that by 2010 more than 30% of a physician's time will be spent using information technology tools. The aim of this study is to assess the information and communication technologies (ICT) skills of medical students in Tanzania. We also report a pilot intervention of peer mentoring training in ICT by medical students from the UK tutoring students in Tanzania. Cross sectional study and pilot intervention study. Fourth year medical students (n = 92) attending Muhimbili University College of Health Sciences, Dar es Salaam, Tanzania. Self-reported assessment of competence on ICT-related topics and ability to perform specific ICT tasks. Further information related to frequency of computer use (hours per week), years of computer use, reasons for use and access to computers. Skills at specific tasks were reassessed for 12 students following 4 to 6 hours of peer mentoring training. The highest levels of competence in generic ICT areas were for email, Internet and file management. For other skills such as word processing most respondents reported low levels of competence. The abilities to perform specific ICT skills were low - less than 60% of the participants were able to perform the core specific skills assessed. A period of approximately 5 hours of peer mentoring training produced an approximate doubling of competence scores for these skills. Our study has found a low level of ability to use ICT facilities among medical students in a leading university in sub-Saharan Africa. A pilot scheme utilising UK elective students to tutor basic skills showed potential. Attention is required to develop interventions that can improve ICT skills, as well as computer access, in order to bridge the digital divide.

  15. TRANSFORMING RURAL SECONDARY SCHOOLS IN ZIMBABWE THROUGH TECHNOLOGY: LIVED EXPERIENCES OF STUDENT COMPUTER USERS

    Directory of Open Access Journals (Sweden)

    Gomba Clifford

    2016-04-01

Full Text Available A technological divide exists in Zimbabwe between urban and rural schools that puts rural-based students at a disadvantage. In Zimbabwe, the government, through the president, donated computers to most rural schools in a bid to bridge the digital divide between rural and urban schools. The purpose of this phenomenological study was to understand the experiences of Advanced Level students using computers at two rural boarding Catholic high schools in Zimbabwe. The study was guided by two research questions: (1) How do Advanced Level students in the rural areas use computers at their school? and (2) What is the experience of using computers for Advanced Level students in the rural areas of Zimbabwe? By performing this study, it was possible to understand from the students' experiences whether computer usage was for educational learning or not. The results of the phenomenological study showed that students' experiences can be broadly classified into five themes, namely: worthwhile (interesting) experience, accessibility issues, teachers' monopoly, research and social use, and Internet availability. The participants proposed that teachers use computers, but not monopolize computer usage. The computer shortage may be solved by having donors and the government help in the acquisition of more computers.

  16. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. For problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent microstructural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  17. Parallel Computational Fluid Dynamics 2007 : Implementations and Experiences on Large Scale and Grid Computing

    CERN Document Server

    2009-01-01

At the 19th Annual Conference on Parallel Computational Fluid Dynamics, held in Antalya, Turkey, in May 2007, the most recent developments and implementations of large-scale and grid computing were presented. This book, comprising the invited and selected papers of that conference, details those advances, which are of particular interest to CFD and CFD-related communities. It also offers results related to applications of various scientific and engineering problems involving flows and flow-related topics. Intended for CFD researchers and graduate students, this book is a state-of-the-art presentation of the relevant methodology and implementation techniques of large-scale computing.

  18. Use of Intracervical Foley Catheter for Induction of Labour in Cases of Previous Caesarean Section: Experience of a single tertiary centre in Oman.

    Science.gov (United States)

    Gonsalves, Hazel; Al-Riyami, Nihal; Al-Dughaishi, Tamima; Gowri, Vaidayanathan; Al-Azri, Mohammed; Salahuddin, Ayesha

    2016-11-01

    This study aimed to evaluate rates of success and perinatal complications of labour induction using an intracervical Foley catheter among women with a previous Caesarean delivery at a tertiary centre in Oman. This retrospective cohort study included 68 pregnant women with a history of a previous Caesarean section who were admitted for induction via Foley catheter between January 2011 and December 2013 to the Sultan Qaboos University Hospital, Muscat, Oman. Patient data were collected from electronic and delivery ward records. Most women were 25-35 years old (76.5%) and 20 women had had one previous vaginal delivery (29.4%). The most common indication for induction of labour was intrauterine growth restriction with oligohydramnios (27.9%). Most women delivered after 40 gestational weeks (48.5%) and there were no neonatal admissions or complications. The majority experienced no complications during the induction period (85.3%), although a few had vaginal bleeding (5.9%), intrapartum fever (4.4%), rupture of the membranes (2.9%) and cord prolapse shortly after insertion of the Foley catheter (1.5%). However, no cases of uterine rupture or scar dehiscence were noted. Overall, the success rate of vaginal birth after a previous Caesarean delivery was 69.1%, with the remaining patients undergoing an emergency Caesarean section (30.9%). The use of a Foley catheter in the induction of labour in women with a previous Caesarean delivery appears a safe option with a good success rate and few maternal and fetal complications.

  19. Quantum Computation and Information From Theory to Experiment

    CERN Document Server

    Imai, Hiroshi

    2006-01-01

Recently, the field of quantum computation and information has been developing through a fusion of results from various research fields in theoretical and practical areas. This book consists of reviews of selected topics characterized by great progress, and covers the field from theoretical areas to experimental ones. It contains fundamental areas: quantum query complexity, quantum statistical inference, quantum cloning, quantum entanglement, and additivity. It treats three types of quantum security systems: quantum public key cryptography, quantum key distribution, and quantum steganography. A photonic system is highlighted for the realization of quantum information processing.

  20. COMPUTER EXPERIMENTS WITH FINITE ELEMENTS OF HIGHER ORDER

    Directory of Open Access Journals (Sweden)

    Khomchenko A.

    2017-12-01

Full Text Available The paper deals with the problem of constructing the basis functions of a quadrilateral finite element of the fifth order by means of the computer algebra system Maple. The Lagrangian approximation of such a finite element contains 36 nodes: 20 perimeter nodes and 16 internal nodes. Alternative models with a reduced number of internal nodes are considered. Graphs of basis functions and cognitive portraits of zero-level lines are presented. The work is aimed at studying the possibilities of using modern information technologies in the teaching of individual mathematical disciplines.
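The tensor-product construction behind the 36-node element is easy to reproduce; here is a short sketch in Python/SymPy rather than the paper's Maple (the equispaced node placement on [-1, 1] is an assumption):

```python
import sympy as sp

# Fifth-order 1D Lagrange polynomials on 6 equispaced nodes, combined as a
# tensor product into the 36 shape functions of the quadrilateral element
# (20 perimeter nodes + 16 internal nodes).
xi, eta = sp.symbols("xi eta")
nodes = [sp.Rational(-1) + sp.Rational(2 * i, 5) for i in range(6)]  # -1 .. 1

def lagrange(i):
    # Product formula: 1 at node i, 0 at every other node.
    L = sp.Integer(1)
    for j, xj in enumerate(nodes):
        if j != i:
            L *= (xi - xj) / (nodes[i] - xj)
    return sp.expand(L)

L1d = [lagrange(i) for i in range(6)]
basis = [L1d[i] * L1d[j].subs(xi, eta) for i in range(6) for j in range(6)]

print(len(basis))                                                  # 36 shape functions
print(sp.simplify(basis[0].subs({xi: nodes[0], eta: nodes[0]})))   # 1 at its own node
```

The reduced-internal-node alternatives mentioned above would replace some of these tensor-product functions with serendipity-style combinations; the sketch only covers the full Lagrangian family.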

  1. Computer-Assisted Experiments with a Laser Diode

    Science.gov (United States)

    Kraftmakher, Yaakov

    2011-01-01

    A laser diode from an inexpensive laser pen (laser pointer) is used in simple experiments. The radiant output power and efficiency of the laser are measured, and polarization of the light beam is shown. The "h/e" ratio is available from the threshold of spontaneous emission. The lasing threshold is found using several methods. With a…

  2. Introduction to Classical Density Functional Theory by a Computational Experiment

    Science.gov (United States)

    Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel

    2014-01-01

    We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…

  3. Student and Faculty Perceptions of Undergraduate Research Experiences in Computing

    Science.gov (United States)

    Barker, L.

    2009-01-01

    Undergraduate research experiences are promoted and funded for their potential in increasing students' likelihood of pursuing graduate degrees, increasing their confidence, and expanding their awareness of their discipline and career opportunities. These outcomes, however, depend on the social, organizational, and intellectual conditions under…

  4. Experience with computed transmission tomography of the heart in vivo

    International Nuclear Information System (INIS)

    Carlsson, E.; Lipton, M.J.; Skioeldebrand, C.G.; Berninger, W.H.; Redington, R.W.

    1980-01-01

Cardiac computed tomography in its present form provides useful information about the heart for clinical use in patients with heart disease and for investigative work in such patients and in living animals. Its great reconstructive power and unmatched density resolution are particularly advantageous in the study of ischemic heart disease. Because of its non-invasive character, cardiac computed tomography has the potential of becoming an effective screening tool for large numbers of patients with suspected or known coronary heart disease. Other cardiac conditions such as valve disease and congenital lesions can also be examined with high diagnostic yield. However, presently available scanners suffer from a low repetition rate, long scan times, and the fact that only one transverse cardiac level at a time can be obtained. The development which must be accomplished in order to eliminate these weaknesses is technically feasible. The availability of a dynamic cardiac scanner would greatly benefit the treatment of patients with heart disease and facilitate the inquiry into the pathophysiology of such diseases. (orig.) [de]

  5. Assessing the impact of previous experience, and attitudes towards technology, on levels of engagement in a virtual reality based occupational therapy intervention for spinal cord injury rehabilitation

    LENUS (Irish Health Repository)

    McCaughey, Manus Dr.

    2007-01-01

The aim of the current research project was to determine if there were significant differences between patients with higher or lower levels of experience with technology in terms of their level of engagement with virtual reality (VR) in occupational therapy, their future uptake of VR technology in therapy, and their attitudes towards technology. Patients' experience of technology was also examined in relation to demographic characteristics such as age and education level.

  6. Dropping Out of Computer Science: A Phenomenological Study of Student Lived Experiences in Community College Computer Science

    Science.gov (United States)

    Gilbert-Valencia, Daniel H.

California community colleges produce alarmingly few computer science degree or certificate earners. While the literature shows clear K-12 impediments to CS matriculation in higher education, very little is known about the experiences of those who overcome initial impediments to CS yet do not persist through to program completion. This phenomenological study explores that specific experience by interviewing underrepresented, low-income, first-generation college students who began community college intending to transfer to 4-year institutions majoring in CS but switched to another field and remain enrolled or graduated. This study explores the lived experiences of students facing barriers, their avenues for developing interest in CS, and the persistence support systems they encountered, specifically looking at how students constructed their academic choice from these experiences. The growing diversity within California's population necessitates that experiences specific to underrepresented students be considered as part of this exploration. Ten semi-structured interviews and observations were conducted, transcribed and coded. Artifacts supporting student experiences were also collected. Data was analyzed through a social-constructivist lens to provide insight into experiences and how they can be navigated to create actionable strategies for community college computer science departments wishing to increase student success. Three major themes emerged from this research: (1) students shared pre-college characteristics; (2) faced similar challenges in college CS courses; and (3) shared similar reactions to the "work" of computer science. Results of the study included (1) CS interest development hinged on computer ownership in the home; (2) participants shared characteristics that were ideal for college success but not CS success; and (3) encounters in CS departments produced unique challenges for participants. Though CS interest was and remains

  7. Computing in high-energy physics: facing a new generation of experiments

    International Nuclear Information System (INIS)

    Zanella, P.

    1983-01-01

    Computing pervades nearly every aspect of activities in contemporary high-energy physics. The paper discusses the range of tasks requiring computing, and reviews the principal ways in which these are handled. Examples are given of typical computing applications with particular reference to activities at CERN, and some attempt is made to identify the main trends. The new generation of experiments, typified by colliding beam facilities, creates new requirements for computing and distributed processing. These are discussed in the light of the new and developing computer technology, which is seen as being essential to satisfy these requirements. (Auth.)

  8. Integration of genetic algorithm, computer simulation and design of experiments for forecasting electrical energy consumption

    International Nuclear Information System (INIS)

    Azadeh, A.; Tarverdian, S.

    2007-01-01

    This study presents an integrated algorithm for forecasting monthly electrical energy consumption based on a genetic algorithm (GA), computer simulation and design of experiments using stochastic procedures. First, a time-series model is developed as a benchmark for the GA and simulation. Computer simulation is developed to generate random variables for monthly electricity consumption, in order to foresee the effects of probabilistic distributions on monthly electricity consumption. The GA and simulation-based GA models are then developed using the selected time-series model. There are therefore four treatments to be considered in the analysis of variance (ANOVA): actual data, time series, GA and simulation-based GA. ANOVA is used to test the null hypothesis that the above four alternatives are equal. If the null hypothesis is accepted, then the lowest mean absolute percentage error (MAPE) value is used to select the best model; otherwise the Duncan Multiple Range Test (DMRT) method of paired comparison is used to select the optimum model, which could be time series, GA or simulation-based GA. In case of ties the lowest MAPE value is taken as the benchmark. The integrated algorithm has several unique features. First, it is flexible and identifies the best model based on the results of ANOVA and MAPE, whereas previous studies consider the best-fit GA model based on MAPE or relative error results. Second, the proposed algorithm may identify the conventional time series as the best model for future electricity consumption forecasting because of its dynamic structure, whereas previous studies assume that GA always provides the best solutions and estimates. To show the applicability and superiority of the proposed algorithm, the monthly electricity consumption in Iran from March 1994 to February 2005 (131 months) is used and applied to the proposed algorithm
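
    The selection logic in this record is compact enough to sketch in code. The fragment below is an illustrative Python rendering, not the authors' implementation: it runs a one-way ANOVA across the actual series and the candidate forecasts and then picks the lowest-MAPE model. SciPy has no built-in Duncan Multiple Range Test, so the paired-comparison fallback is approximated here by MAPE ranking, and all series are simulated.

    import numpy as np
    from scipy import stats

    def mape(actual, forecast):
        """Mean absolute percentage error, in percent."""
        actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
        return 100.0 * np.mean(np.abs((actual - forecast) / actual))

    def select_model(actual, candidates):
        """candidates maps model name -> forecast array. One-way ANOVA tests
        whether the 'treatments' (actual plus forecasts) differ; the lowest-MAPE
        candidate is then reported either way (DMRT in the paper, approximated
        here by MAPE ranking when the hypothesis is rejected)."""
        _, p_value = stats.f_oneway(actual, *candidates.values())
        errors = {name: mape(actual, fc) for name, fc in candidates.items()}
        return min(errors, key=errors.get), errors, p_value

    rng = np.random.default_rng(0)
    actual = 100 + 10 * np.sin(np.arange(24)) + rng.normal(0, 2, 24)
    models = {
        "time_series": actual + rng.normal(0, 3, 24),
        "GA": actual + rng.normal(0, 2, 24),
        "simulation_GA": actual + rng.normal(0, 1, 24),
    }
    print(select_model(actual, models))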

  9. Study of some physical aspects previous to design of an exponential experiment; Estudio de algunos aspectos fisicos previos al diseno de una experiencia exponencial

    Energy Technology Data Exchange (ETDEWEB)

    Caro, R.; Francisco, J. L. de

    1961-07-01

    This report presents the theoretical study of some physical aspects previous to the design of an exponential facility. These are: fast and slow flux distribution in the multiplicative medium and in the thermal column; slowing down in the thermal column; geometrical distribution and minimum required intensity of the sources; access channels; and perturbations produced by possible variations in source position and intensity. (Author) 4 refs.

  10. Use of Intracervical Foley Catheter for Induction of Labour in Cases of Previous Caesarean Section; Experience of a single tertiary centre in Oman

    Directory of Open Access Journals (Sweden)

    Hazel Gonsalves

    2016-11-01

    Full Text Available Objectives: This study aimed to evaluate rates of success and perinatal complications of labour induction using an intracervical Foley catheter among women with a previous Caesarean delivery at a tertiary centre in Oman. Methods: This retrospective cohort study included 68 pregnant women with a history of a previous Caesarean section who were admitted for induction via Foley catheter between January 2011 and December 2013 to the Sultan Qaboos University Hospital, Muscat, Oman. Patient data were collected from electronic and delivery ward records. Results: Most women were 25–35 years old (76.5%) and 20 women had had one previous vaginal delivery (29.4%). The most common indication for induction of labour was intrauterine growth restriction with oligohydramnios (27.9%). Most women delivered after 40 gestational weeks (48.5%) and there were no neonatal admissions or complications. The majority experienced no complications during the induction period (85.3%), although a few had vaginal bleeding (5.9%), intrapartum fever (4.4%), rupture of the membranes (2.9%) and cord prolapse shortly after insertion of the Foley catheter (1.5%). However, no cases of uterine rupture or scar dehiscence were noted. Overall, the success rate of vaginal birth after a previous Caesarean delivery was 69.1%, with the remaining patients undergoing an emergency Caesarean section (30.9%). Conclusion: The use of a Foley catheter in the induction of labour in women with a previous Caesarean delivery appears a safe option with a good success rate and few maternal and fetal complications.

  11. Design of Electronic Experiments Using Computer Generated Virtual Instruments

    Science.gov (United States)

    1994-03-01

    is displayed on the front panel DC Voltage meter. C. Laboratory 4 Design: The original Laboratory 4, Transistor (BJT) Characteristics, experiment...voltage relations of an NPN transistor in a common-emitter circuit configuration used in both static and dynamic operation. 5. Transistor curve...of a BJT common-emitter amplifier to stated specifications, test it for proper biasing, signal amplification characteristics and operational stability.

  12. Design concepts and experience in the application of distributed computing to the control of large CEGB power plant

    International Nuclear Information System (INIS)

    Wallace, J.N.

    1980-01-01

    With the ever increasing price of fossil fuels it became obvious during the 1970s that Pembroke Power Station (4 x 500MW oil fired) and Didcot Power Station (4 x 500MW coal fired) were going to operate flexibly with many units two-shifting frequently. The region was also expecting to refurbish nuclear plant in the 1980s. Based on previous experience with mini-computers, the region initiated a research/development programme aimed at refitting Pembroke and Didcot using distributed computer techniques that were also broadly applicable to nuclear plant. Major schemes have now been implemented at Pembroke and Didcot for plant condition monitoring, control and display. All computers on two units at each station are now functional with a third unit currently being set to work. This paper aims to outline the generic technical aspects of these schemes, describe the implementation strategy adopted and develop some thoughts on nuclear power plant applications. (auth)

  13. Computer experiments with a coarse-grid hydrodynamic climate model

    International Nuclear Information System (INIS)

    Stenchikov, G.L.

    1990-01-01

    A climate model is developed on the basis of the two-level Mintz-Arakawa general circulation model of the atmosphere and a bulk model of the upper layer of the ocean. A detailed model of the spectral transport of shortwave and longwave radiation is used to investigate the radiative effects of greenhouse gases. The radiative fluxes are calculated at the boundaries of five layers, each with a pressure thickness of about 200 mb. The results of the climate sensitivity calculations for mean-annual and perpetual seasonal regimes are discussed. The CCAS (Computer Center of the Academy of Sciences) climate model is used to investigate the climatic effects of anthropogenic changes of the optical properties of the atmosphere due to increasing CO2 content and aerosol pollution, and to calculate the sensitivity to changes of land surface albedo and humidity

  14. A model ecosystem experiment and its computational simulation studies

    International Nuclear Information System (INIS)

    Doi, M.

    2002-01-01

    A simplified microbial model ecosystem and its computer simulation model are introduced as eco-toxicity tests for the assessment of environmental responses to the effects of environmental impacts. To take the effects of interactions between species and environment into account, one option is to select a keystone species on the basis of ecological knowledge and to put it in the single-species toxicity test. Another option proposed is to frame the eco-toxicity tests as an experimental micro-ecosystem study together with a theoretical model-ecosystem analysis. With these tests, the stressors which are more harmful to ecosystems should be replaced with less harmful ones on the basis of unified measures. Management of radioactive materials, chemicals, hyper-eutrophication and other artificial disturbances of ecosystems should be discussed consistently from the unified viewpoint of environmental protection. (N.C.)

  15. Evolution of the Distributed Computing Model of the CMS experiment at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Grandi, C. [Bologna U.; Bockelman, B. [Nebraska U.; Bonacorsi, D. [Bologna U.; Donvito, G. [INFN, Bari; Dykstra, D. [Fermilab; Fisk, I. [Fermilab; Hernandez, J. [Bristol U.; Metson, S. [Bristol U.; Sfiligoi, I. [UC, San Diego; Wakefield, S. [Imperial Coll., London

    2012-01-01

    The Computing Model of the CMS experiment was prepared in 2005 and described in detail in the CMS Computing Technical Design Report. With the experience of the first years of LHC data taking and with the evolution of the available technologies, the CMS Collaboration identified areas where improvements were desirable. In this work we describe the most important modifications that have been, or are being, implemented in the Distributed Computing Model of CMS. The Worldwide LHC Computing Grid (WLCG) project acknowledged that the whole distributed computing infrastructure is impacted by the kind of changes that are happening in most LHC experiments, and decided to create several Technical Evolution Groups (TEG) aiming at assessing the situation and developing a strategy for the future. In this work we also describe the CMS view on the TEG activities.

  16. A Supersonic Argon/Air Coaxial Jet Experiment for Computational Fluid Dynamics Code Validation

    Science.gov (United States)

    Clifton, Chandler W.; Cutler, Andrew D.

    2007-01-01

    A non-reacting experiment is described in which data has been acquired for the validation of CFD codes used to design high-speed air-breathing engines. A coaxial jet-nozzle has been designed to produce pressure-matched exit flows of Mach 1.8 at 1 atm in both a center jet of argon and a coflow jet of air, creating a supersonic, incompressible mixing layer. The flowfield was surveyed using total temperature, gas composition, and Pitot probes. The data set was compared to CFD code predictions made using Vulcan, a structured grid Navier-Stokes code, as well as to data from a previous experiment in which a He-O2 mixture was used instead of argon in the center jet of the same coaxial jet assembly. Comparison of experimental data from the argon flowfield and its computational prediction shows that the CFD produces an accurate solution for most of the measured flowfield. However, the CFD prediction deviates from the experimental data in the region downstream of x/D = 4, underpredicting the mixing-layer growth rate.

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  18. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights: • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010). • We assimilated data from sodium flow experiments. • We used computational fluid dynamics simulations of sodium experiments. • The predictive modeling method greatly reduced uncertainties in predicted results. - Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal–hydraulics experiments in order to reduce systematically the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD-results for the best-estimate model parameters and results describing sodium-flow velocities and temperature distributions are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties
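
    The headline result — predicted uncertainties smaller than both the computed and the experimental ones — already appears in the simplest scalar data-assimilation step. The sketch below is not the Cacuci–Ionescu-Bujor formalism, which propagates full covariance matrices through the model responses; it is a one-variable inverse-variance combination with assumed numbers, shown only to make the variance-reduction mechanism concrete.

    # Scalar inverse-variance combination of a computation and a measurement.
    # Values are illustrative assumptions, not taken from the paper.
    def combine(x_comp, var_comp, x_meas, var_meas):
        w = var_meas / (var_comp + var_meas)        # weight on the computation
        x_best = w * x_comp + (1.0 - w) * x_meas    # best-estimate value
        var_best = var_comp * var_meas / (var_comp + var_meas)
        return x_best, var_best                     # var_best < min(var_comp, var_meas)

    # Assumed sodium outlet temperature: CFD 540 +/- 8 K, experiment 548 +/- 5 K.
    x, v = combine(540.0, 8.0**2, 548.0, 5.0**2)
    print(f"best estimate: {x:.1f} K, std dev: {v**0.5:.2f} K")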

  19. COMPUTING

    CERN Document Server

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  20. Considerations for Explosively Driven Conical Shock Tube Design: Computations and Experiments

    Science.gov (United States)

    2017-02-16

    interest have been listed elsewhere (e.g., Courtney et al. 2012), but a limited list of advantages and disadvantages for explosively driven shock... (report ARL-TR-7953, US Army Research Laboratory, by Joel B Stewart, Weapons and Materials Research Directorate)

  1. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  2. Using Educational Computer Games in the Classroom: Science Teachers' Experiences, Attitudes, Perceptions, Concerns, and Support Needs

    Science.gov (United States)

    An, Yun-Jo; Haynes, Linda; D'Alba, Adriana; Chumney, Frances

    2016-01-01

    Science teachers' experiences, attitudes, perceptions, concerns, and support needs related to the use of educational computer games were investigated in this study. Data were collected from an online survey, which was completed by 111 science teachers. The results showed that 73% of participants had used computer games in teaching. Participants…

  3. Task-Relevant Sound and User Experience in Computer-Mediated Firefighter Training

    Science.gov (United States)

    Houtkamp, Joske M.; Toet, Alexander; Bos, Frank A.

    2012-01-01

    The authors added task-relevant sounds to a computer-mediated instructor in-the-loop virtual training for firefighter commanders in an attempt to raise the engagement and arousal of the users. Computer-mediated training for crew commanders should provide a sensory experience that is sufficiently intense to make the training viable and effective.…

  4. Triggering and data analysis for the VIRGO experiment on the APEmille parallel computer

    Science.gov (United States)

    Beccaria, M.; Cella, G.; Ciampa, A.; Cuoco, E.; Curci, G.; Vicerè, A.

    1997-03-01

    We give a brief summary of some possible strategies for the real-time data analysis in the framework of the VIRGO experiment. We discuss in particular the utility and the feasibility of their implementation on parallel computers, focusing on the APEmille SIMD machine. We evaluate the computational power required in two cases: the monitoring of known pulsars and the detection of binary coalescences.
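
    The coalescence search mentioned above is conventionally done by matched filtering the strain data against waveform templates, exactly the kind of dense correlation kernel that maps well onto a SIMD machine such as APEmille. The abstract gives no implementation details, so the following is only a generic NumPy sketch of a frequency-domain matched filter under a white-noise assumption, with a toy chirp standing in for a coalescence template.

    import numpy as np

    def matched_filter_snr(data, template):
        """Circular correlation of data with template via FFT; white noise
        is assumed, so no power-spectral-density weighting is applied."""
        n = len(data)
        corr = np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(template)), n)
        norm = np.sqrt(np.sum(template ** 2))     # template normalisation
        return corr / (norm * np.std(data))       # SNR at every time lag

    fs, n = 4096, 16384
    t = np.arange(n) / fs
    chirp = np.sin(2 * np.pi * (50 * t + 20 * t ** 2))   # toy rising-frequency signal
    template = np.concatenate([chirp[: n // 8], np.zeros(n - n // 8)])
    rng = np.random.default_rng(1)
    data = rng.normal(0.0, 1.0, n)
    data[n // 2 : n // 2 + n // 8] += 0.5 * chirp[: n // 8]  # buried signal
    snr = matched_filter_snr(data, template)
    print("peak SNR:", snr.max().round(1), "at sample", int(snr.argmax()))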

  5. Computer-assisted training experiment used in the field of thermal energy production (EDF)

    International Nuclear Information System (INIS)

    Felgines, R.

    1982-01-01

    In 1981, the EDF carried out an experiment with computer-assisted training (EAO). This new approach, which continued until June 1982, involved about 700 employees, all of whom operated nuclear power stations. The different stages of this experiment and the lessons which can be drawn from it are given. The lessons were of a positive nature and make it possible to envisage complete coverage of all nuclear power stations by computer-assisted training within a very short space of time

  6. Multislice computed tomographic coronary angiography: experience in a UK centre

    International Nuclear Information System (INIS)

    Morgan-Hughes, G.J.; Marshall, A.J.; Roobottom, C.A.

    2003-01-01

    AIM: To evaluate the technique of coronary angiography with retrospectively electrocardiogram (ECG)-gated four-slice helical computed tomography (CT). MATERIALS AND METHODS: Within 1 month of undergoing routine day-case diagnostic coronary angiography, 30 consecutive patients also underwent retrospectively ECG-gated multislice CT coronary angiography. This enabled direct comparison of seven segments of proximal and mid-coronary artery for each patient by two blinded assessors. Each segment of coronary artery from the multislice CT image was evaluated initially for 'assessability' and those segments deemed assessable were subsequently investigated for the presence or absence of a significant (≥70%) stenotic lesion. RESULTS: Overall 68% of proximal and mid-coronary artery segments were assessable. The sensitivity and specificity of four-slice CT coronary angiography in assessable segments for detecting the presence or absence of significant (≥70%) stenoses were 72 and 86%, respectively. These results correspond to a positive predictive value of 53% and a 93% negative predictive value. If the 32% of non-assessable segments are added into the calculation then the sensitivity and specificity fall to 49 and 66%, respectively. CONCLUSION: Although multislice CT coronary angiography is a promising technique, the overall assessability and diagnostic accuracy of four-slice CT acquisition is not sufficient to justify routine clinical use. Further evaluation should investigate the benefit of the improved temporal and spatial resolution offered by 16- and 32-slice acquisition
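
    All of the accuracy figures quoted above derive from a two-by-two table over the assessable segments, and the relationship between sensitivity, specificity and the predictive values is easy to mis-read. The counts in the sketch below are invented — chosen only so that the resulting rates land near those reported — and are not the study's data.

    # Diagnostic accuracy from a 2x2 table; tp, fp, fn, tn are assumed counts.
    def accuracy_metrics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),  # stenosed segments correctly flagged
            "specificity": tn / (tn + fp),  # healthy segments correctly cleared
            "ppv": tp / (tp + fp),          # flagged segments truly stenosed
            "npv": tn / (tn + fn),          # cleared segments truly healthy
        }

    print(accuracy_metrics(tp=13, fp=12, fn=5, tn=75))
    # Low PPV alongside high NPV, as in the study, is typical when the
    # prevalence of significant stenoses among assessable segments is low.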

  7. Experiences of Using Automated Assessment in Computer Science Courses

    Directory of Open Access Journals (Sweden)

    John English

    2015-10-01

    Full Text Available In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students type in their own answers). Students were allowed to correct errors based on feedback provided by the system and resubmit their answers. A total of 141 students were surveyed to assess their opinions of this approach, and we analysed their responses. Analysis of the questionnaire showed a low correlation between questions, indicating the statistical independence of the individual questions. As a whole, student feedback on using Checkpoint was very positive, emphasizing the benefits of multiple attempts, impartial marking, and a quick turnaround time for submissions. Many students said that Checkpoint gave them confidence in learning and motivation to practise. Students also said that the detailed feedback that Checkpoint generated when their programs failed helped them understand their mistakes and how to correct them.

  8. Monitoring of computing resource utilization of the ATLAS experiment

    CERN Document Server

    Rousseau, D; The ATLAS collaboration; Vukotic, I; Aidel, O; Schaffer, RD; Albrand, S

    2012-01-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.

  9. Experiences using SciPy for computer vision research

    Energy Technology Data Exchange (ETDEWEB)

    Eads, Damian R [Los Alamos National Laboratory; Rosten, Edward J [Los Alamos National Laboratory

    2008-01-01

    SciPy is an effective tool suite for prototyping new algorithms. We share some of our experiences using it for the first time to support our research in object detection. SciPy makes it easy to integrate C code, which is essential when algorithms operating on large data sets cannot be vectorized. The universality of Python, the language in which SciPy was written, gives the researcher access to a broader set of non-numerical libraries to support GUI development, interface with databases, manipulate graph structures, render 3D graphics, unpack binary files, etc. Python's extensive support for operator overloading makes SciPy's syntax as succinct as its competitors, MATLAB, Octave, and R. More profoundly, we found it easy to rework research code written with SciPy into a production application, deployable on numerous platforms.
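
    The succinctness the authors attribute to operator overloading is easy to illustrate. The snippet below is a generic, object-detection-flavoured example written for this summary (local-mean smoothing, thresholding, connected-component labelling); it is not code from the Los Alamos project.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(42)
    image = rng.random((256, 256))                      # stand-in for a camera frame

    smoothed = ndimage.uniform_filter(image, size=5)    # local mean
    mask = smoothed > smoothed.mean() + smoothed.std()  # bright-region detection
    labels, n_objects = ndimage.label(mask)             # connected components
    print(f"{n_objects} candidate objects detected")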

  10. The TESS [Tandem Experiment Simulation Studies] computer code user's manual

    International Nuclear Information System (INIS)

    Procassini, R.J.

    1990-01-01

    TESS (Tandem Experiment Simulation Studies) is a one-dimensional, bounded particle-in-cell (PIC) simulation code designed to investigate the confinement and transport of plasma in a magnetic mirror device, including tandem mirror configurations. Mirror plasmas may be modeled in a system which includes an applied magnetic field and/or a self-consistent or applied electrostatic potential. The PIC code TESS is similar to the PIC code DIPSI (Direct Implicit Plasma Surface Interactions) which is designed to study plasma transport to and interaction with a solid surface. The codes TESS and DIPSI are direct descendants of the PIC code ES1 that was created by A. B. Langdon. This document provides the user with a brief description of the methods used in the code and a tutorial on the use of the code. 10 refs., 2 tabs
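
    For readers unfamiliar with the particle-in-cell method that TESS implements, the basic cycle is: deposit particle charge on a grid, solve for the field, then push the particles in that field. The sketch below is a minimal periodic 1D electrostatic PIC loop in Python with assumed parameters; TESS itself is a bounded code with applied potentials and mirror magnetic geometry, which this toy does not attempt.

    import numpy as np

    ng, n_p = 64, 10000              # grid cells and particles (assumed)
    L, dt, steps = 2 * np.pi, 0.1, 100
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, L, n_p)
    v = np.where(rng.random(n_p) < 0.5, 1.0, -1.0)   # two cold counter-streams
    dx = L / ng
    k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)        # wavenumbers for Poisson solve
    k[0] = 1.0                                       # dummy; the k=0 mode is zeroed

    for _ in range(steps):
        idx = (x / dx).astype(int) % ng
        n_grid = np.bincount(idx, minlength=ng) / (n_p / ng)  # density, mean 1
        rho = 1.0 - n_grid                     # electrons + neutralizing background
        phi_k = np.fft.rfft(rho) / k ** 2      # solve d2phi/dx2 = -rho
        phi_k[0] = 0.0
        E = np.fft.irfft(-1j * k * phi_k, ng)  # E = -dphi/dx
        v += -1.0 * E[idx] * dt                # charge/mass = -1 for electrons
        x = (x + v * dt) % L

    print("final mean kinetic energy:", 0.5 * np.mean(v ** 2))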

  11. Automatization of physical experiments on-line with the MINSK-32 computer

    International Nuclear Information System (INIS)

    Fefilov, B.V.; Mikhushkin, A.V.; Morozov, V.M.; Sukhov, A.M.; Chelnokov, L.P.

    1978-01-01

    The system for data acquisition and processing in complex multi-dimensional experiments is described. The system includes autonomous modules in the CAMAC standard, the NAIRI-4 small computer and the MINSK-32 base computer. The NAIRI-4 computer effects preliminary storage, data processing and experiment control. Its software includes the microprogram software of the NAIRI-4 computer, the software of the NAIRI-2 computer, the software of the PDP-11 computer, and the technological software on the ES computers. A crate controller and a display driver are connected to the main channel for the operation of the NAIRI-4 computer on line with experimental devices. An input-output channel commutator, which transforms the MINSK-32 computer levels to TTL levels and vice versa, was developed to extend the possibilities for connecting measurement modules to the MINSK-32 computer. The graphic display, based on the HP-1300A monitor with a light pen, is used for highly effective spectrum processing

  12. A cerebellar neuroprosthetic system: computational architecture and in vivo experiments

    Directory of Open Access Journals (Sweden)

    Ivan Herreros Alonso

    2014-05-01

    Full Text Available Emulating the input-output functions performed by a brain structure opens the possibility for developing neuro-prosthetic systems that replace damaged neuronal circuits. Here, we demonstrate the feasibility of this approach by replacing the cerebellar circuit responsible for the acquisition and extinction of motor memories. Specifically, we show that a rat can undergo acquisition, retention and extinction of the eye-blink reflex even though the biological circuit responsible for this task has been chemically inactivated via anesthesia. This is achieved by first developing a computational model of the cerebellar microcircuit involved in the acquisition of conditioned reflexes and training it with synthetic data generated based on physiological recordings. Secondly, the cerebellar model is interfaced with the brain of an anesthetized rat, connecting the model's inputs and outputs to afferent and efferent cerebellar structures. As a result, we show that the anesthetized rat, equipped with our neuro-prosthetic system, can be classically conditioned to the acquisition of an eye-blink response. However, non-stationarities in the recorded biological signals limit the performance of the cerebellar model. Thus, we introduce an updated cerebellar model and validate it with physiological recordings showing that learning becomes stable and reliable. The resulting system represents an important step towards replacing lost functions of the central nervous system via neuro-prosthetics, obtained by integrating a synthetic circuit with the afferent and efferent pathways of a damaged brain region. These results also embody an early example of science-based medicine, where on the one hand the neuro-prosthetic system directly validates a theory of cerebellar learning that informed the design of the system, and on the other it takes a step towards the development of neuro-prostheses that could recover lost learning functions in animals and, in the longer term

  13. Computer network that assists in the planning, execution and evaluation of in-reactor experiments

    International Nuclear Information System (INIS)

    Bauer, T.H.; Froehle, P.H.; August, C.; Baldwin, R.D.; Johanson, E.W.; Kraimer, M.R.; Simms, R.; Klickman, A.E.

    1985-01-01

    For over 20 years, complex in-reactor experiments have been performed at Argonne National Laboratory (ANL) to investigate the performance of nuclear reactor fuel and to support the development of large computer codes that address questions of reactor safety in full-scale plants. Not only are computer codes an important end-product of the research, but computer analysis is also involved intimately at most stages of experiment planning, data reduction, and evaluation. For instance, many experiments are of sufficiently long duration or, if they are of brief duration, occur in such a purposeful sequence that the need for speedy availability of on-line data is paramount. This is made possible most efficiently by computer-assisted displays and evaluation. A purposeful linking of main-frame, mini, and micro computers has been effected over the past eight years which greatly enhances the speed with which experimental data are reduced to useful forms and applied to the relevant technological issues. This greater efficiency in data management also led to improvements in the planning and execution of subsequent experiments. Raw data from experiments performed at INEL are stored directly on disk and tape with the aid of minicomputers. Either during or shortly after an experiment, data may be transferred, via a direct link, to the Illinois offices of ANL where the data base is stored on a minicomputer system. This Idaho-to-Illinois link has both enhanced experiment performance and allowed rapid dissemination of results

  14. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  15. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  16. Inquiry Based-Computational Experiment, Acquisition of Threshold Concepts and Argumentation in Science and Mathematics Education

    Science.gov (United States)

    Psycharis, Sarantos

    2016-01-01

    The computational experiment approach considers models as the fundamental instructional units of Inquiry Based Science and Mathematics Education (IBSE) and STEM Education, where the model takes the place of the "classical" experimental set-up and simulation replaces the experiment. Argumentation in IBSE and STEM education is related to the…

  17. Professors' and students' perceptions and experiences of computational simulations as learning tools

    Science.gov (United States)

    Magana de Leon, Alejandra De Jesus

    Computational simulations are becoming a critical component of scientific and engineering research, and now are becoming an important component for learning. This dissertation provides findings from a multifaceted research study exploring the ways computational simulations have been perceived and experienced as learning tools by instructors and students. Three studies were designed with an increasing focus on the aspects of learning and instructing with computational simulation tools. Study One used a student survey with undergraduate and graduate students whose instructors enhanced their teaching using online computational tools. Results of this survey were used to identify students' perceptions and experiences with these simulations as learning tools. The results provided both an evaluation of the instructional design and an indicator of which instructors were selected in Study Two. Study Two used a phenomenographic research design resulting in a two dimensional outcome space with six qualitatively different ways instructors perceived their learning outcomes associated with using simulation tools as part of students' learning experiences. Results from this work provide a framework for identifying major learning objectives to promote learning with computational simulation tools. Study Three used a grounded theory methodology to expand on instructors' learning objectives to include their perceptions of formative assessment and pedagogy. These perceptions were compared and contrasted with students' perceptions associated with learning with computational tools. The study is organized around three phases and analyzed as a collection of case studies focused on the instructors and their students' perceptions and experiences of computational simulations as learning tools. This third study resulted in a model for using computational simulations as learning tools. This model indicates the potential of integrating the computational simulation tools into formal learning

  18. Prescription of oral anticoagulation for patients with atrial fibrillation and previous hospitalization in a cardiology department. Experience in actual practice in a tertiary hospital.

    Science.gov (United States)

    Fabregat-Andrés, Ó; Cubillos-Arango, A; Chacón-Hernández, N; Montagud, V; Morell, S; Fácila, L

    2015-01-01

    Atrial fibrillation is the main reason for oral anticoagulation in our community. New oral anticoagulants (NOACs) overcome the disadvantages of vitamin K antagonists (VKAs), although there are scarce data on their use in our community. The aim of our study was to assess the use of NOACs and anticoagulation control using VKA, as measured by the time within the therapeutic range (TTR), in an actual clinical scenario. A retrospective cohort analysis was conducted of 816 patients admitted to cardiology over a period of 3 years, with a diagnosis of atrial fibrillation and anticoagulant treatment at discharge. We assessed the percentage of patients prescribed NOACs and the TTR with VKA. We compared safety and efficacy events during the 15-month follow-up among the patients prescribed NOAC, those prescribed VKA with a good TTR and those with a poor TTR. The percentage of patients prescribed NOAC was 7.6%. Serial INR measurements found that 71.3% of patients had a poor TTR. Although the groups were not comparable, a higher incidence of the combined event was observed in those treated with VKA and a poor TTR compared with those prescribed NOAC (p=.01). For patients with a previous hospitalization in cardiology in a tertiary hospital and a diagnosis of atrial fibrillation, the rate of NOAC prescription was low, and the TTR with VKA was poor. Copyright © 2015 Elsevier España, S.L.U. y Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  19. PREVIOUS SECOND TRIMESTER ABORTION

    African Journals Online (AJOL)

    PNLC

    PREVIOUS SECOND TRIMESTER ABORTION: A risk factor for third trimester uterine rupture in three ... for accurate diagnosis of uterine rupture. KEY WORDS: Induced second trimester abortion - Previous uterine surgery - Uterine rupture. ... scarred uterus during second trimester misoprostol-induced labour for a missed ...

  20. Explaining Research Utilization Among 4-H Faculty, Staff, and Volunteers: The Role of Self-Efficacy, Learning Goal Orientation, Training, and Previous Experience

    Directory of Open Access Journals (Sweden)

    Julianne Tillman

    2014-06-01

    Full Text Available An investigation of factors that facilitate the utilization of research evidence among faculty, staff, and volunteers in the 4-H Youth Development Program is presented in this paper. Participants (N = 368; 86 4-H faculty, 153 staff, and 129 volunteers) represented 35 states; structural equation modeling was utilized in the analyses. Results of the path analysis explained 56% of variance in research utilization and 28% in research utilization self-efficacy. Among the factors impacting research utilization, self-efficacy played the most important role. In turn, self-efficacy for research utilization was positively influenced by participants’ learning goal orientation, frequency of 4-H training during the last 12 months, education in research-related areas, and investigative career interests. In addition, 4-H staff who were exposed to research at higher levels reported higher research utilization self-efficacy. The findings reinforce the importance of fostering research utilization self-efficacy among 4-H faculty, staff, and volunteers. Among the suggestions presented are regular 4-H training opportunities and on-going exposure to program evaluation and program improvement experiences.

  1. Influence of previous experience on the preference, food utilization and performance of Ascia monuste orseis wild larvae (Godart) (Lepidoptera: Pieridae) for three different hosts.

    Science.gov (United States)

    Santana, A F K; Zucoloto, F S

    2011-01-01

    The exhaustion of food resources which occurs during the ontogenetic growth of Ascia monuste orseis (Godart) results in the dispersion of older larvae to nearby plants in order to complete their development, which might expose these animals to the nutritional variation of the hosts found. This study aimed to verify whether the food ingested at the beginning of development influences larval host preference and whether the shift to a new host can affect the digestion and performance of A. monuste orseis, using two natural hosts: kale (Brassica oleracea var. acephala) and rocket (Eruca sativa), or kale and cabbage (B. oleracea var. capitata). Larvae were reared throughout their larval development on a single host or on two different hosts. When a host change was tested, larvae were reared for four instars on a host, and offered the other host plant in the fifth instar. Development time, percentage of pupation and emergence, pupal weight, fecundity and digestive indices were evaluated. The change in feeding preference for kale and for rocket in the fourth instar, when those were the original hosts, respectively, shows that prior experience plays a major role in food preference of immature A. monuste orseis. The shift can be beneficial for larval development, depending on the order of the hosts; in general, larvae fed on kale at the end of development showed better performance. Our results presented strong evidence of considerable phenotypic plasticity in A. monuste orseis for host preferences.

  2. Une Experience d'enseignement du francais par ordinateur (An Experiment in Teaching French by Computer).

    Science.gov (United States)

    Bougaieff, Andre; Lefebvre, France

    1986-01-01

    An experimental program for university summer students of French as a second language that provided a computer resource center and a variety of courseware, authoring aids, and other software for student use is described and the problems and advantages are discussed. (MSE)

  3. Evaluating user experience with respect to user expectations in brain-computer interface games

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Hakvoort, G.; Poel, Mannes; Müller-Putz, G.R.; Scherer, R.; Billinger, M.; Kreilinger, A.; Kaiser, V.; Neuper, C.

    Evaluating user experience (UX) with respect to previous experiences can provide insight into whether a product can positively affect a user's opinion about a technology. If it can, then we can say that the product provides a positive UX. In this paper we propose a method to assess the UX in BCI

  4. A simple computer-based method for performing and analyzing intracranial self-stimulation experiments in rats.

    Science.gov (United States)

    Kling-Petersen, T; Svensson, K

    1993-05-01

    Intracranial self-stimulation (ICSS) in the rat is a useful tool for studying the importance of various brain monoamines in positive reinforcement. The effects of compounds interacting with dopaminergic neurotransmission are measurable by studying changes in reward thresholds. Computerisation of the analysis of these thresholds greatly enhances standardisation and reproducibility. The use of an object-oriented programming language simplifies the programming of a specific application and provides scientists without formal training in computer programming with the means to create their own software. A system for the acquisition, execution, analysis and storage of ICSS experiments is described. The hardware is based on Apple Macintosh computers, interfaced to the test chambers and physiological stimulators using a plug-in card supporting A/D, D/A, digital I/O and timer functions. The software, written in G (LabVIEW), provides the user with a graphically based 'Virtual Instrument' performing all aspects of the ICSS experiment. The software performs threshold analysis immediately after completion of the ICSS experiment, thereby greatly reducing the total time previously needed to evaluate these experiments. The graphical approach used in LabVIEW allows the programmer to make fast and simple alterations to suit different experimental problems.
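
    The abstract does not spell out the threshold algorithm, but the usual ICSS analysis fits a sigmoid to the rate-frequency curve and reports its half-maximal point as the reward threshold. Under that assumption — and in Python rather than the authors' LabVIEW G — a minimal version of the curve fit looks like this, with invented data:

    import numpy as np
    from scipy.optimize import curve_fit

    def sigmoid(x, rate_max, x50, slope):
        """Logistic rate-frequency curve; x50 is the reward threshold (M50)."""
        return rate_max / (1.0 + np.exp(-(x - x50) / slope))

    # Assumed rate-frequency data: stimulation frequency (Hz) vs response rate.
    freq = np.array([25, 35, 50, 70, 100, 140, 200], float)
    rate = np.array([2, 5, 14, 38, 55, 60, 61], float)

    params, _ = curve_fit(sigmoid, freq, rate, p0=[60.0, 70.0, 10.0])
    print(f"estimated reward threshold (M50): {params[1]:.1f} Hz")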

  5. Computer-related assistive technology: satisfaction and experiences among users with disabilities.

    Science.gov (United States)

    Burton, Mary; Nieuwenhuijsen, Els R; Epstein, Marcy J

    2008-01-01

    Many people with disabilities use assistive technology devices (ATDs) for computer access. The specific focus of this exploratory study was (a) to assess the experiences, opinions, and satisfaction levels of 24 individuals with disabilities using computer-related ATDs; (b) to investigate their awareness of health risk factors related to computer usage; and (c) to examine the psychosocial impact of computer-related ATDs on users. Data were collected via telephone interviews with 24 individuals with physical disabilities who had experience using one or more ATDs. The Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST) instrument was used to evaluate users' satisfaction with ATDs in a number of dimensions, including their physical attributes. The Psychosocial Impact of Assistive Devices Scale measured the psychosocial impact (i.e., independence, competence, and adequacy) of an ATD on users. Additional questions were posed to gather information about users' opinions and experiences. Training appeared to be an important component for ATD users, many of whom preferred a setting in which to try out devices rather than group or individual training. Respondents with visual impairments revealed a higher level of adaptability versus those without visual impairments (p = .001). Additional research is needed to develop specific survey items focused on users of computer-related ATDs and the evaluation of the psychosocial impact of ATDs on computer users.

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  7. Computer-controlled back scattering and sputtering-experiment using a heavy-ion-accelerator

    International Nuclear Information System (INIS)

    Becker, H.; Birnbaum, M.; Degenhardt, K.H.; Mertens, P.; Tschammer, V.

    1978-12-01

    Control and data acquisition by a PDP 11/40 computer and CAMAC instrumentation are reported for an experiment developed to measure sputtering yields and energy losses for heavy 100-300 keV ions in thin metal foils. Besides a quadrupole mass filter or a bending magnet, a multichannel analyser is coupled to the computer, so that pulse height analysis can also be performed under computer control. CAMAC instrumentation and measuring programs are built in modular form to enable easy application to other experimental problems. (orig.)

  8. Computer assisted treatments for image pattern data of laser plasma experiments

    International Nuclear Information System (INIS)

    Yaoita, Akira; Matsushima, Isao

    1987-01-01

    An image data processing system for laser-plasma experiments has been constructed. These image data are two-dimensional images taken by X-ray, UV, infrared and visible light television cameras and also by streak cameras. They are digitized by frame memories. The digitized image data are stored in disk memories with the aid of a microcomputer. The data are processed by a host computer and stored in the files of the host computer and on magnetic tapes. In this paper, an overview of the image data processing system and some software for data handling in the host computer are reported. (author)

  9. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  10. Comparison of Computational Results with a Low-g, Nitrogen Slosh and Boiling Experiment

    Science.gov (United States)

    Stewart, Mark; Moder, Jeff

    2015-01-01

    The proposed paper will compare a fluid/thermal simulation, in FLUENT, with a low-g, nitrogen slosh experiment. The French Space Agency, CNES, performed cryogenic nitrogen experiments in several zero gravity aircraft campaigns. The computational results have been compared with high-speed photographic data, pressure data, and temperature data from sensors on the axis of the cylindrically shaped tank. The comparison between these experimental and computational results is generally favorable: the initial temperature stratification is in good agreement, and the two-phase fluid motion is qualitatively captured.

  11. Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Herner, K. [Fermilab; Alba Hernandex, A. F. [Fermilab; Bhat, S. [Fermilab; Box, D. [Fermilab; Boyd, J. [Fermilab; Di Benedetto, V. [Fermilab; Ding, P. [Fermilab; Dykstra, D. [Fermilab; Fattoruso, M. [Fermilab; Garzoglio, G. [Fermilab; Kirby, M. [Fermilab; Kreymer, A. [Fermilab; Levshina, T. [Fermilab; Mazzacane, A. [Fermilab; Mengel, M. [Fermilab; Mhashilkar, P. [Fermilab; Podstavkov, V. [Fermilab; Retzke, K. [Fermilab; Sharma, N. [Fermilab; Teheran, J. [Fermilab

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, in helping experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed

  12. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    Science.gov (United States)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.

    2017-10-01

    The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, in helping experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called

  13. The Effect of Prior Experience with Computers, Statistical Self-Efficacy, and Computer Anxiety on Students' Achievement in an Introductory Statistics Course: A Partial Least Squares Path Analysis

    Science.gov (United States)

    Abd-El-Fattah, Sabry M.

    2005-01-01

    A Partial Least Squares Path Analysis technique was used to test the effect of students' prior experience with computers, statistical self-efficacy, and computer anxiety on their achievement in an introductory statistics course. Computer Anxiety Rating Scale and Current Statistics Self-Efficacy Scale were administered to a sample of 64 first-year…
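
    To make the path-analytic logic concrete, the sketch below decomposes a toy "experience → anxiety → achievement" model into direct and indirect effects using ordinary least squares on standardized variables. It is only an illustration: the study used a Partial Least Squares estimator and a richer variable set, and the simulated coefficients here are arbitrary.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 64
    experience = rng.normal(size=n)
    anxiety = -0.5 * experience + rng.normal(scale=0.8, size=n)
    achievement = 0.3 * experience - 0.4 * anxiety + rng.normal(scale=0.7, size=n)

    def z(a):
        return (a - a.mean()) / a.std()   # standardize to mean 0, sd 1

    X = np.column_stack([z(experience), z(anxiety)])
    beta, *_ = np.linalg.lstsq(X, z(achievement), rcond=None)  # direct paths
    a_path = np.polyfit(z(experience), z(anxiety), 1)[0]       # experience -> anxiety
    indirect = a_path * beta[1]                                # mediated effect via anxiety
    print(f"direct={beta[0]:.2f}, indirect={indirect:.2f}, "
          f"total={beta[0] + indirect:.2f}")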

  14. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    Science.gov (United States)

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
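
    A common formulation of the regression the abstract describes — and the one assumed in this sketch, since the exact vFitness equations are not reproduced here — models the log ratio of mutant to wild-type counts as linear in time: the slope then estimates the net fitness difference, and dilution factors applied to both variants cancel in the ratio.

    import numpy as np

    # Assumed competition-assay counts at each sampling day.
    days = np.array([0, 2, 4, 6, 8], float)
    mutant = np.array([1.0e3, 3.2e3, 1.1e4, 3.6e4, 1.2e5])
    wildtype = np.array([1.0e3, 4.5e3, 2.1e4, 9.8e4, 4.4e5])

    # log(mutant/wildtype) = intercept + slope * t; the slope is the relative
    # fitness difference per day, fit over all time points rather than just
    # the first and last (the old two-point method).
    slope, intercept = np.polyfit(days, np.log(mutant / wildtype), 1)
    print(f"relative fitness difference: {slope:+.3f} per day")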

  15. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Directory of Open Access Journals (Sweden)

    Demeter Lisa

    2010-05-01

    Full Text Available Abstract Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.

  16. Previous Experience: A Model of Practice at UNAE

    OpenAIRE

    Ruiz, Ormary Barberi; Pesántez Palacios, María Dolores

    2017-01-01

    The statements presented in this article represent a preliminary version of the proposed model of pre-professional practices (PPP) of the National University of Education (UNAE) of Ecuador. An urgent institutional necessity is revealed by the descriptive analyses conducted on technical-administrative support (reports, interviews, testimonials), the pedagogical foundations of UNAE (curricular directionality, transverse axes in practice, career plan, approach and diagnostic examination as subj...

  17. Previous experiences shape adaptive mate preferences

    NARCIS (Netherlands)

    Fawcett, Tim W.; Bleay, Colin

    2009-01-01

    Existing models of mate choice assume that individuals have perfect knowledge of their own ability to attract a mate and can adjust their preferences accordingly. However, real animals will typically be uncertain of their own attractiveness. A potentially useful source of information on this is the

  18. FPGA Compute Acceleration for High-Throughput Data Processing in High-Energy Physics Experiments

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The upgrades of the four large experiments of the LHC at CERN in the coming years will result in a huge increase of data bandwidth for each experiment, which needs to be processed very efficiently. For example, the LHCb experiment will upgrade its detector in 2019/2020 to a 'triggerless' readout scheme, where all of the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the event filter farm to 40 Tbit/s, which must be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm which can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered.    In the high performance computing sector more and more FPGA compute accelerators are being used to improve the compute performance and reduce the...
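
    A quick sanity check of the quoted figures (an illustration only; it assumes the full 40 MHz rate is streamed): dividing the aggregate bandwidth by the event rate gives the implied average event size,

        \frac{40\ \mathrm{Tbit/s}}{40 \times 10^{6}\ \mathrm{events/s}} = 10^{6}\ \mathrm{bit/event} \approx 125\ \mathrm{kB\ per\ event}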

  19. Use of Tablet Computers to Promote Physical Therapy Students' Engagement in Knowledge Translation During Clinical Experiences

    Science.gov (United States)

    Loeb, Kathryn; Barbosa, Sabrina; Jiang, Fei; Lee, Karin T.

    2016-01-01

    Background and Purpose: Physical therapists strive to integrate research into daily practice. The tablet computer is a potentially transformational tool for accessing information within the clinical practice environment. The purpose of this study was to measure and describe patterns of tablet computer use among physical therapy students during clinical rotation experiences. Methods: Doctor of physical therapy students (n = 13 users) tracked their use of tablet computers (iPad), loaded with commercially available apps, during 16 clinical experiences (6-16 weeks in duration). Results: The tablets were used on 70% of 691 clinic days, averaging 1.3 uses per day. Information seeking represented 48% of uses; 33% of those were foreground searches for research articles and syntheses and 66% were for background medical information. Other common uses included patient education (19%), medical record documentation (13%), and professional communication (9%). The most frequently used app was Safari, the preloaded web browser (representing 281 [36.5%] incidents of use). Users accessed 56 total apps to support clinical practice. Discussion and Conclusions: Physical therapy students successfully integrated use of a tablet computer into their clinical experiences including regular activities of information seeking. Our findings suggest that the tablet computer represents a potentially transformational tool for promoting knowledge translation in the clinical practice environment. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A127). PMID:26945431

  20. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veress needle in the umbilical region is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who previously underwent one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During those operations, and while entering the abdominal cavity, we experienced no complications, while in 7 patients we performed conversion to laparotomy following the diagnostic laparoscopy. In all patients the Veress needle and trocar were inserted in the umbilical region, i.e. the closed laparoscopy technique. In no patient were adhesions found in the umbilical region, and no abdominal organs were injured.

  1. Evaluating the Relationship of Computer Literacy Training Competence and Nursing Experience to CPIS Resistance

    Science.gov (United States)

    Reese, Dorothy J.

    2012-01-01

    The purpose of this quantitative, descriptive/correlational project was to examine the relationship between the level of computer literacy, informatics training, nursing experience, and perceived competence in using computerized patient information systems (CPIS) and nursing resistance to using CPIS. The Nurse Computerized Patient Information…

  2. Computational Modeling of the Optical Rotation of Amino Acids: An "in Silico" Experiment for Physical Chemistry

    Science.gov (United States)

    Simpson, Scott; Autschbach, Jochen; Zurek, Eva

    2013-01-01

    A computational experiment that investigates the optical activity of the amino acid valine has been developed for an upper-level undergraduate physical chemistry laboratory course. Hybrid density functional theory calculations were carried out for valine to confirm the rule that adding a strong acid to a solution of an amino acid in the l…

  3. Solution of the Schrodinger Equation for a Diatomic Oscillator Using Linear Algebra: An Undergraduate Computational Experiment

    Science.gov (United States)

    Gasyna, Zbigniew L.

    2008-01-01

    A computational experiment is proposed in which a linear algebra method is applied to the solution of the Schrodinger equation for a diatomic oscillator. Calculations of the vibration-rotation spectrum of the HCl molecule are presented, and the results show excellent agreement with experimental data. (Contains 1 table and 1 figure.)
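
    The linear algebra approach named above can be sketched in a few lines (a hedged illustration: a finite-difference discretization of a harmonic oscillator in dimensionless units, not the basis-set treatment or HCl parameters of the paper):

        # Discretize the Schrodinger equation on a grid and diagonalize the
        # Hamiltonian matrix; eigenvalues approximate the vibrational levels.
        import numpy as np

        n, L = 1000, 10.0                 # grid points, box size (dimensionless)
        x = np.linspace(-L/2, L/2, n)
        h = x[1] - x[0]
        V = 0.5 * x**2                    # harmonic potential, hbar = m = omega = 1

        # Kinetic energy via the standard three-point finite-difference Laplacian
        T = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n-1), 1)
             - np.diag(np.ones(n-1), -1)) / (2 * h**2)
        H = T + np.diag(V)

        E = np.linalg.eigvalsh(H)[:4]
        print(E)                          # approximately 0.5, 1.5, 2.5, 3.5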

  4. Preliminary analysis of the MER magnetic properties experiment using a computational fluid dynamics model

    DEFF Research Database (Denmark)

    Kinch, K.M.; Merrison, J.P.; Gunnlaugsson, H.P.

    2006-01-01

    Motivated by questions raised by the magnetic properties experiments on the NASA Mars Pathfinder and Mars Exploration Rover (MER) missions, we have studied in detail the capture of airborne magnetic dust by permanent magnets using a computational fluid dynamics (CFD) model supported by laboratory...

  5. Airflow in a World Exposition Pavilion Studied by Scale-Model Experiments and Computational Fluid Dynamics

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    The ventilation design concept, model experiment results, two-dimensional computational fluid dynamics simulation, and on-site measurements are presented for the Danish Pavilion project at the 1992 World Exhibition in Seville. The paper gives a short project history for the building...

  6. Technology Readiness, Internet Self-Efficacy and Computing Experience of Professional Accounting Students

    Science.gov (United States)

    Lai, Ming-Ling

    2008-01-01

    Purpose: This study aims to assess the state of technology readiness of professional accounting students in Malaysia, to examine their level of internet self-efficacy, to assess their prior computing experience, and to explore if they are satisfied with the professional course that they are pursuing in improving their technology skills.…

  7. Using a Computer Microphone Port to Study Circular Motion: Proposal of a Secondary School Experiment

    Science.gov (United States)

    Soares, A. A.; Borcsik, F. S.

    2016-01-01

    In this work we present an inexpensive experiment proposal to study the kinematics of uniform circular motion in a secondary school. We used a PC sound card to connect a homemade simple sensor to a computer and used the free sound analysis software "Audacity" to record experimental data. We obtained quite good results even in comparison…
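
    One way the recorded signal could be analyzed (an illustrative sketch with synthetic data; the actual sensor and the Audacity workflow are the ones described in the paper): detect one pulse per revolution and take the spacing between pulses as the rotation period:

        # Estimate the rotation period from a sound-card recording in which
        # the sensor produces one pulse per revolution. Synthetic data stand
        # in for a real recording here.
        import numpy as np
        from scipy.signal import find_peaks

        fs = 44100                                   # sampling rate (Hz)
        t = np.arange(0, 2.0, 1/fs)
        true_period = 0.25                           # one revolution per 0.25 s
        signal = np.zeros_like(t)
        signal[(t % true_period) < 0.001] = 1.0      # short pulse each revolution
        signal += 0.05 * np.random.randn(t.size)     # measurement noise

        peaks, _ = find_peaks(signal, height=0.5, distance=int(0.1 * fs))
        periods = np.diff(peaks) / fs
        print(f"period: {periods.mean():.4f} s, "
              f"angular speed: {2*np.pi/periods.mean():.2f} rad/s")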

  8. Evaluating User Experience in a Selection Based Brain-Computer Interface Game: A Comparative Study

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Hakvoort, Gido; Poel, Mannes; Coutinho Anacleto, Junia; Fels, Sidney; Graham, Nicholas; Kapralos, Bill; Saif El-Nasr, Magy; Stanley, Kevin

    2011-01-01

    In human-computer interaction, it is important to offer the users correct modalities for particular tasks and situations. Unless the user has the suitable modality for a task, neither task performance nor user experience can be optimised. The aim of this study is to assess the appropriateness of

  9. My Program Is Ok--Am I? Computing Freshmen's Experiences of Doing Programming Assignments

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This article provides insight into how computing majors experience the process of doing programming assignments in their first programming course. This grounded theory study sheds light on the various processes and contexts through which students constantly assess their self-efficacy as a programmer. The data consists of a series of four…

  10. Integrating IS Curriculum Knowledge through a Cluster-Computing Project--A Successful Experiment

    Science.gov (United States)

    Kitchens, Fred L.; Sharma, Sushil K.; Harris, Thomas

    2004-01-01

    MIS curricula in business schools are challenged to provide MIS courses that give students a strong practical understanding of the basic technologies, while also providing enough hands-on experience to solve real life problems. As an experimental capstone MIS course, the authors developed a cluster-computing project to expose business students to…

  11. Children's Experiences of Completing a Computer-Based Violence Survey: Ethical Implications

    Science.gov (United States)

    Ellonen, Noora; Poso, Tarja

    2011-01-01

    This article aims to contribute to the discussion about the ethics of research on children when studying sensitive issues such as violence. The empirical analysis is based on the accounts given by children (11 377) who completed a computer-based questionnaire about their experiences of violence ("The Finnish Child Victim Survey 2008")…

  12. Methods of physical experiment and installation automation on the base of computers

    International Nuclear Information System (INIS)

    Stupin, Yu.V.

    1983-01-01

    Peculiarities of using computers for the automation of physical experiments and installations are considered. Systems for data acquisition and processing based on microprocessors, micro- and mini-computers, CAMAC equipment and real-time operating systems are described, as well as systems intended for the automation of physical experiments on accelerators, laser thermonuclear fusion installations, and installations for plasma investigation. The problems of multimachine complexes and multi-user systems, the development of automated systems for collective use, the arrangement of intermachine data exchange, and the control of experimental databases are discussed. Data on software systems used for complex experimental data processing are presented. It is concluded that the application of new computers, combined with the new possibilities that universal operating systems provide to users, substantially increases the efficiency of a scientist's work.

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  15. Computer navigation experience in hip resurfacing improves femoral component alignment using a conventional jig.

    Science.gov (United States)

    Morison, Zachary; Mehra, Akshay; Olsen, Michael; Donnelly, Michael; Schemitsch, Emil

    2013-11-01

    The use of computer navigation has been shown to improve the accuracy of femoral component placement compared to conventional instrumentation in hip resurfacing. Whether exposure to computer navigation improves accuracy when the procedure is subsequently performed with conventional instrumentation without navigation has not been explored. We examined whether femoral component alignment utilizing a conventional jig improves following experience with the use of imageless computer navigation for hip resurfacing. Between December 2004 and December 2008, 213 consecutive hip resurfacings were performed by a single surgeon. The first 17 (Cohort 1) and the last 9 (Cohort 2) hip resurfacings were performed using a conventional guidewire alignment jig. In 187 cases, the femoral component was implanted using the imageless computer navigation. Cohorts 1 and 2 were compared for femoral component alignment accuracy. All components in Cohort 2 achieved the position determined by the preoperative plan. The mean deviation of the stem-shaft angle (SSA) from the preoperatively planned target position was 2.2° in Cohort 2 and 5.6° in Cohort 1 (P = 0.01). Four implants in Cohort 1 were positioned at least 10° varus compared to the target SSA position and another four were retroverted. Femoral component placement utilizing conventional instrumentation may be more accurate following experience using imageless computer navigation.

  16. Computer navigation experience in hip resurfacing improves femoral component alignment using a conventional jig

    Directory of Open Access Journals (Sweden)

    Zachary Morison

    2013-01-01

    Full Text Available Background: The use of computer navigation has been shown to improve the accuracy of femoral component placement compared to conventional instrumentation in hip resurfacing. Whether exposure to computer navigation improves accuracy when the procedure is subsequently performed with conventional instrumentation without navigation has not been explored. We examined whether femoral component alignment utilizing a conventional jig improves following experience with the use of imageless computer navigation for hip resurfacing. Materials and Methods: Between December 2004 and December 2008, 213 consecutive hip resurfacings were performed by a single surgeon. The first 17 (Cohort 1) and the last 9 (Cohort 2) hip resurfacings were performed using a conventional guidewire alignment jig. In 187 cases, the femoral component was implanted using the imageless computer navigation. Cohorts 1 and 2 were compared for femoral component alignment accuracy. Results: All components in Cohort 2 achieved the position determined by the preoperative plan. The mean deviation of the stem-shaft angle (SSA) from the preoperatively planned target position was 2.2° in Cohort 2 and 5.6° in Cohort 1 (P = 0.01). Four implants in Cohort 1 were positioned at least 10° varus compared to the target SSA position and another four were retroverted. Conclusions: Femoral component placement utilizing conventional instrumentation may be more accurate following experience using imageless computer navigation.

  17. Cross-cultural human-computer interaction and user experience design a semiotic perspective

    CERN Document Server

    Brejcha, Jan

    2015-01-01

    This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on the theoretical and practical areas of the interaction and communication design for research experts and industry practitioners and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

  18. Application of a personal computer in a high energy physics experiment

    International Nuclear Information System (INIS)

    Petta, P.

    1987-04-01

    UA1 is a detector at the CERN Super Proton Synchrotron collider. MacVEE (Microcomputer applied to the Control of VME Electronic Equipment) is a software development system for the data readout system and for the implementation of the user interface of the experiment control; a commercial personal computer is used. Examples of applications are the Data Acquisition Console, the Scanner Desc equipment and the AMERICA RAM-disk codes. Further topics are the MacUA1 development system for M68K-VME codes and an outline of the future MacVEE System Supervisor. 23 refs., 10 figs., 3 tabs. (qui)

  19. My program is ok - am I? Computing freshmen's experiences of doing programming assignments

    Science.gov (United States)

    Kinnunen, Päivi; Simon, Beth

    2012-03-01

    This article provides insight into how computing majors experience the process of doing programming assignments in their first programming course. This grounded theory study sheds light on the various processes and contexts through which students constantly assess their self-efficacy as a programmer. The data consists of a series of four interviews conducted with a purposeful sample of nine computer science majors in a research intensive state university in the United States. Use of the constant comparative method elicited two forms of results. First, we identified six stages of doing a programming assignment. Analysis captures the dimensional variation in students' experiences with programming assignments on a detailed level. We identified a core category resulting from students' reflected emotions in conjunction with self-efficacy assessment. We provide a descriptive model of how computer science majors build their self-efficacy perceptions, reported via four narratives. Our key findings are that some students reflect negative views of their efficacy, even after having a positive programming experience and that in other situations, students having negative programming experiences still have a positive outlook on their efficacy. We consider these findings in light of possible languages and support structures for introductory programming courses.

  20. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

    Full Text Available For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as efficient design for a possible range of values for the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.
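
    The trade-off POBE optimizes can be illustrated with a deliberately simplified sketch (the cost and variance models below are assumptions for illustration, not POBE's actual formulas):

        # Toy version of the design question: pick the number of subjects N
        # and per-subject scanning time T (minutes) that minimize the variance
        # of the group-level effect estimate under a fixed budget, using a
        # simple two-level variance model.
        best = None
        budget, c_subject, c_minute = 30000.0, 200.0, 10.0
        var_between, var_within = 1.0, 8.0     # assumed variance components

        for n in range(2, 200):
            t = (budget - c_subject * n) / (c_minute * n)  # affordable scan time
            if t <= 0:
                break
            var = (var_between + var_within / t) / n       # two-level model
            if best is None or var < best[0]:
                best = (var, n, t)

        var, n, t = best
        print(f"optimal design: {n} subjects x {t:.1f} min, variance {var:.4f}")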

  1. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-02-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing) that students apply to study how lakes around the globe are experiencing the effects of climate change. In the module, students develop hypotheses about the effects of different climate scenarios on lakes and then test their hypotheses using hundreds of model simulations. We taught the module in a 4-hour workshop and found that participation in the module significantly increased both undergraduate and graduate students' understanding about climate change effects on lakes. Moreover, participation in the module also significantly increased students' perceived experience level in using different software, technologies, and modeling tools. By embedding modeling in an environmental science context, non-computer science students were able to successfully use and master technologies that they had previously never been exposed to. Overall, our findings suggest that modeling is a powerful tool for catalyzing student learning on the effects of climate change.

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common Computing Readiness Challenge 08 (CCRC08). CCRC08 is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  3. The high-rate data challenge: computing for the CBM experiment

    Science.gov (United States)

    Friese, V.; CBM Collaboration

    2017-10-01

    The Compressed Baryonic Matter experiment (CBM) is a next-generation heavy-ion experiment to be operated at the FAIR facility, currently under construction in Darmstadt, Germany. A key feature of CBM is its very high interaction rate, exceeding that of contemporary nuclear collision experiments by several orders of magnitude. Such interaction rates forbid a conventional, hardware-triggered readout; instead, experiment data will be freely streaming from self-triggered front-end electronics. In order to reduce the huge raw data volume to a recordable rate, data will be selected exclusively on CPU, which necessitates partial event reconstruction in real time. Consequently, the traditional segregation of online and offline software vanishes; an integrated online and offline data processing concept is called for. In this paper, we will report on concepts and developments in computing for CBM as well as on the status of preparations for its first physics run.

  4. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University; Rogers, James H [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  5. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  6. Results of computer network experiment via the Japanese communication satellite CS - Performance evaluation of communication protocols

    Science.gov (United States)

    Ito, A.; Kakinuma, Y.; Uchida, K.; Matsumoto, K.; Takahashi, H.

    1984-03-01

    Computer network experiments have been performed using the Japanese communication satellite CS. The network is of a centralized (star) type, consisting of one center station and many user stations. The protocols are designed taking into consideration the long round-trip delay of a satellite channel. This paper treats the communication protocol aspects of the experiments. The performance of the burst-level and link protocols (which correspond roughly to the data link layer of the OSI seven-layer model) is evaluated. System performance (throughput, delay, and link-level overhead) is measured using statistically generated traffic.
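
    The influence of the long round-trip delay mentioned above can be illustrated with a back-of-the-envelope calculation (link parameters are assumptions, not those of the CS experiment):

        # Why a geostationary round trip dominates protocol design: a
        # stop-and-wait link sits idle most of the time, and the window size
        # needed to keep the channel busy follows directly.
        rtt = 0.54                # round trip via geostationary satellite, s
        rate = 64_000             # channel bit rate, bit/s
        frame = 1024 * 8          # frame size, bits

        t_frame = frame / rate
        utilization = t_frame / (t_frame + rtt)
        window = 1 + rtt / t_frame          # frames in flight to avoid idling
        print(f"stop-and-wait utilization: {utilization:.1%}")
        print(f"window for full utilization: {window:.1f} frames")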

  7. Data processing with PC-9801 micro-computer for HCN laser scattering experiments

    International Nuclear Information System (INIS)

    Iwasaki, T.; Okajima, S.; Kawahata, K.; Tetsuka, T.; Fujita, J.

    1986-09-01

    In order to process the data of HCN laser scattering experiments, micro-computer software has been developed and applied to the measurements of density fluctuations in the JIPP T-IIU tokamak plasma. The data processing system consists of a spectrum analyzer, an SM-2100A Signal Analyzer (IWATSU ELECTRIC CO., LTD.), a PC-9801m3 micro-computer, a CRT display and a dot printer. The output signals from the spectrum analyzer are A/D converted and stored on a mini-floppy-disk attached to the signal analyzer. The software for processing the data is composed of system programs and several user programs. Real-time data processing is carried out for every plasma shot, at 4-minute intervals, by the micro-computer connected to the signal analyzer through a GP-IB interface. The time evolutions of the frequency spectrum of the density fluctuations are displayed on the CRT attached to the micro-computer and printed out on a printer sheet. For data processing after the experiments, the data stored on the floppy disk of the signal analyzer are read out using a floppy-disk unit attached to the micro-computer. After computation with the user programs, the results, such as the monitored signal, frequency spectra, wave-number spectra and the time evolutions of the spectrum, are displayed and printed out. In this technical report, the system, the software and the directions for use are described. (author)

  8. The experience of agency in human-computer interactions: a review.

    Science.gov (United States)

    Limerick, Hannah; Coyle, David; Moore, James W

    2014-01-01

    The sense of agency is the experience of controlling both one's body and the external environment. Although the sense of agency has been studied extensively, there is a paucity of studies in applied "real-life" situations. One applied domain that seems highly relevant is human-computer interaction (HCI), as an increasing number of our everyday agentive interactions involve technology. Indeed, HCI has long recognized the feeling of control as a key factor in how people experience interactions with technology. The aim of this review is to summarize and examine the possible links between sense of agency and understanding control in HCI. We explore the overlap between HCI and sense of agency for computer input modalities and system feedback, computer assistance, and joint actions between humans and computers. An overarching consideration is how agency research can inform HCI and vice versa. Finally, we discuss the potential ethical implications of personal responsibility in an ever-increasing society of technology users and intelligent machine interfaces.

  9. Computer-intensive simulation of solid-state NMR experiments using SIMPSON.

    Science.gov (United States)

    Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas

    2014-09-01

    Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation including internal matrix manipulations, propagator setups and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.
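
    To illustrate only the optimizer named above (not SIMPSON's optimal control code), here is SciPy's L-BFGS-B minimizing a stand-in objective with an analytic gradient, the same usage pattern as gradient-based pulse optimization:

        # L-BFGS with an analytic gradient on a stand-in cost landscape
        # (the Rosenbrock function); SIMPSON applies the same algorithm to
        # pulse-sequence fidelity with gradients from optimal control theory.
        import numpy as np
        from scipy.optimize import minimize

        def objective(x):
            f = np.sum(100.0*(x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)
            g = np.zeros_like(x)
            g[:-1] = -400.0*x[:-1]*(x[1:] - x[:-1]**2) - 2.0*(1 - x[:-1])
            g[1:] += 200.0*(x[1:] - x[:-1]**2)
            return f, g                      # cost and its gradient

        res = minimize(objective, np.zeros(20), jac=True, method="L-BFGS-B")
        print(res.x[:3], res.fun)            # converges to the all-ones minimum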

  10. Early experiences of computer‐aided assessment and administration when teaching computer programming

    OpenAIRE

    Benford, Steve; Burke, Edmund; Foxley, Eric; Gutteridge, Neil; Zin, Abdullah

    1993-01-01

    This paper describes early experiences with the Ceilidh system currently being piloted at over 30 institutions of higher education. Ceilidh is a course-management system for teaching computer programming whose core is an auto-assessment facility. This facility automatically marks students' programs from a range of perspectives, and may be used in an iterative manner, enabling students to work towards a target level of attainment. Ceilidh also includes extensive course-administration and progre...

  11. Computer Simulation and Field Experiment for Downlink Multiuser MIMO in Mobile WiMAX System.

    Science.gov (United States)

    Yamaguchi, Kazuhiro; Nagahashi, Takaharu; Akiyama, Takuya; Matsue, Hideaki; Uekado, Kunio; Namera, Takakazu; Fukui, Hiroshi; Nanamatsu, Satoshi

    2015-01-01

    The transmission performance of a downlink mobile WiMAX system with multiuser multiple-input multiple-output (MU-MIMO) is described for both computer simulation and a field experiment. In the computer simulation, a MU-MIMO transmission system can be realized using the block diagonalization (BD) algorithm, so that each user receives signals without interference from other users. The bit error rate (BER) performance and channel capacity were simulated for various modulation schemes and numbers of streams in a spatially correlated multipath fading environment. Furthermore, we propose a method for evaluating the transmission performance of this downlink mobile WiMAX system in such an environment by computer simulation. In the field experiment, the received power and downlink throughput in the UDP layer were measured on an experimental mobile WiMAX system developed in Azumino City, Japan. Comparing the simulated and experimental results, the measured maximum downlink throughput was almost the same as the simulated throughput. It was confirmed that the experimental mobile WiMAX system for MU-MIMO transmission successfully increased the total channel capacity of the system.
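
    The block diagonalization step can be sketched as follows (an illustrative toy with assumed antenna dimensions, not the experimental system): each user's precoder is confined to the null space of the other users' channels, which removes inter-user interference:

        # Block diagonalization (BD): transmit to each user only along
        # directions invisible to the other users' channels.
        import numpy as np

        rng = np.random.default_rng(0)
        n_users, n_rx, n_tx = 2, 2, 4       # 2 users x 2 rx antennas, 4 tx antennas
        H = [rng.standard_normal((n_rx, n_tx))
             + 1j*rng.standard_normal((n_rx, n_tx)) for _ in range(n_users)]

        precoders = []
        for k in range(n_users):
            H_others = np.vstack([H[j] for j in range(n_users) if j != k])
            _, s, Vh = np.linalg.svd(H_others)
            null = Vh[len(s):].conj().T     # null-space basis of H_others
            precoders.append(null[:, :n_rx])

        # Interference check: user 0's channel times user 1's precoder is ~0
        print(np.linalg.norm(H[0] @ precoders[1]))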

  12. An Analysis of Creative Process Learning in Computer Game Activities Through Player Experiences

    Directory of Open Access Journals (Sweden)

    Wilawan Inchamnan

    2016-09-01

    Full Text Available This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning outcomes is described. Creative components were measured by examining task motivation and domain-relevant and creativity-relevant skill factors. The research approach applied heuristic checklists in the field of gameplay to analyze the stages of player activities involved in the performance of the task, and examined player experiences with the Player Experience of Need Satisfaction (PENS) survey. Player experiences were influenced by competency, autonomy, intuitive controls, relatedness and presence. This study examines the impact of these activities on the player experience for evaluating learning outcomes through school records. The study is designed to better understand the creative potential of people who are engaged in learning knowledge and skills during the course while playing video games. The findings show the creative potential that gameplay activities yield in levels of creative performance that support learning. The anticipated outcome is knowledge of how video games foster creative thinking, presented as an overview of the Creative Potential of Learning Model (CPLN). CPLN clearly describes the interrelationships between principles of learning and creative potential; careful interpretation of the results is indispensable.

  13. Virtual machines & volunteer computing: Experience from LHC@Home: Test4Theory project

    CERN Document Server

    Lombraña González, Daniel; Blomer, Jakob; Buncic, Predrag; Harutyunyan, Artem; Marquina, Miguel; Segal, Ben; Skands, Peter; Karneyeu, Anton

    2012-01-01

    Volunteer desktop grids are nowadays becoming more and more powerful thanks to improved high-end components: multi-core CPUs, larger RAM memories and hard disks, better network connectivity and bandwidth, etc. As a result, desktop grid systems can run more complex experiments or simulations, but some problems remain: the heterogeneity of hardware architectures and software (library dependencies, code length, big repositories, etc.) makes it very difficult for researchers and developers to deploy and maintain a software stack for all the available platforms. In this paper, the employment of virtualization is shown to be the key to solving these problems. It provides a homogeneous layer allowing researchers to focus their efforts on running their experiments. Inside virtual custom execution environments, researchers can control and deploy very complex experiments or simulations running on heterogeneous grids of high-end computers. The following work presents the latest results from CERN’s LHC@home Test4Theory p...

  14. THE IT GENDER GAP: Experience, Motivation and Differences in Undergraduate Studies of Computer Science

    Directory of Open Access Journals (Sweden)

    Ivanović MIRJANA

    2011-04-01

    Full Text Available This paper reports on the progress and conclusions of a two-year study of gender issues in studying computer science at the Department of Mathematics and Informatics, Faculty of Science, University of Novi Sad. Using statistics on data gathered by a survey, the work presented here focused on identifying, understanding, and correlating female and male students’ performance, professional motivation, ambition, confidence level, and attitudes towards differences, learning quality and the nature of the field. Results show agreement with theoretical predictions and significant improvement over previous efforts, providing profound implications for future studies of gender-sensitive IT education in the region and beyond.

  15. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Science.gov (United States)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2017-11-01

    Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other "post-selection" is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell's theorem which states that this is impossible. The failure of Bell's theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.
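
    A toy event-by-event model in the spirit of this approach might look as follows (a heavily hedged sketch: the outcome and delay functions follow the general form used in this line of work, but the parameters are illustrative and this is not the authors' code):

        # Each photon is measured strictly locally: an outcome and a random
        # time tag are computed at its own station; only pairs whose time
        # tags differ by less than a window W count as coincidences.
        import numpy as np

        rng = np.random.default_rng(1)
        n, T0, W = 500_000, 1.0, 0.001       # pairs, delay scale, window
        a, b = 0.0, np.pi / 8                # local analyzer settings

        theta = rng.uniform(0, 2*np.pi, n)   # shared polarization of each pair
        r1, r2 = rng.uniform(0, 1, (2, n))

        x = np.sign(np.cos(2*(theta - a)))
        y = -np.sign(np.cos(2*(theta - b)))  # partner is orthogonally polarized
        t1 = T0 * r1 * np.sin(2*(theta - a))**4
        t2 = T0 * r2 * np.sin(2*(theta - b))**4

        keep = np.abs(t1 - t2) < W           # local time-tag coincidence selection
        E = np.mean(x[keep] * y[keep])
        # As W shrinks, the selected correlation approaches the singlet value.
        print(f"E(a,b) = {E:+.3f}, singlet value = {-np.cos(2*(a-b)):+.3f}")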

  16. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    International Nuclear Information System (INIS)

    The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise such a large system, identify errors and troubleshoot. Although the monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, it is only recently that the software package has been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and has already proven very effective in optimizing the running systems and detecting misbehaving processes or nodes.
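
    A minimal sketch of the kind of query such a WMI client performs (using the third-party Python "wmi" package rather than the tool's own implementation; the host name and threshold are illustrative):

        # Poll a remote Windows node for CPU load and free memory via WMI.
        # Requires Windows and appropriate credentials on the target node.
        import wmi

        conn = wmi.WMI("control-node-01")        # hypothetical node name
        for cpu in conn.Win32_Processor():
            if cpu.LoadPercentage and cpu.LoadPercentage > 90:
                print(f"{cpu.DeviceID}: load {cpu.LoadPercentage}% - investigate")
        for os_info in conn.Win32_OperatingSystem():
            free_mb = int(os_info.FreePhysicalMemory) / 1024   # property is in kB
            print(f"free physical memory: {free_mb:.0f} MB")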

  17. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation?

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers. (orig.)

  18. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers.

  19. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    Energy Technology Data Exchange (ETDEWEB)

    Sommer, C. M.; Fritz, S.; Vollherbst, D.; Zelzer, S.; Wachter, M. F.; Bellemann, N.; Gockner, T.; Mokry, T.; Schmitz, A.; Aulmann, S.; Stampfl, U.; Pereira, P.; Kauczor, H. U.; Werner, J.; Radeleff, B. A. (University Hospital Heidelberg; German Cancer Research Center (dkfz); SLK Kliniken Heilbronn GmbH)

    2015-02-15

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  20. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    International Nuclear Information System (INIS)

    Sommer, C. M.; Fritz, S.; Vollherbst, D.; Zelzer, S.; Wachter, M. F.; Bellemann, N.; Gockner, T.; Mokry, T.; Schmitz, A.; Aulmann, S.; Stampfl, U.; Pereira, P.; Kauczor, H. U.; Werner, J.; Radeleff, B. A.

    2015-01-01

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  1. The P4 Parallel Programming System, the Linda Environment, and Some Experiences with Parallel Computation

    Directory of Open Access Journals (Sweden)

    Allan R. Larrabee

    1993-01-01

    Full Text Available The first digital computers consisted of a single processor acting on a single stream of data. In this so-called "von Neumann" architecture, computation speed is limited mainly by the time required to transfer data between the processor and memory. This limiting factor has been referred to as the "von Neumann bottleneck". The concern that the miniaturization of silicon-based integrated circuits will soon reach theoretical limits of size and gate times has led to increased interest in parallel architectures and also spurred research into alternatives to silicon-based implementations of processors. Meanwhile, sequential processors continue to be produced with increased clock rates, more memory locally available to a processor, and higher rates at which data can be transferred to and from memories, networks, and remote storage. The efficiency of compilers and operating systems is also improving over time. Although such characteristics limit maximum performance, a large improvement in the speed of scientific computations can often be achieved by utilizing more efficient algorithms, particularly those that support parallel computation. This work discusses experiences with two tools for large-grain (or "macro task") parallelism.
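
    The limit on such parallel speedups is commonly quantified with Amdahl's law; a small sketch with illustrative numbers:

        # Amdahl's law: if a fraction p of a program parallelizes perfectly
        # over n processors, speedup = 1 / ((1 - p) + p / n). A small serial
        # fraction caps the achievable gain no matter how many processors.
        def amdahl(p: float, n: int) -> float:
            return 1.0 / ((1.0 - p) + p / n)

        for n in (4, 16, 64, 1024):
            print(n, round(amdahl(0.95, n), 1))   # 95% parallel: caps near 20x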

  2. Enabling the ATLAS Experiment at the LHC for High Performance Computing

    CERN Document Server

    AUTHOR|(CDS)2091107; Ereditato, Antonio

    In this thesis, I studied the feasibility of running computer data analysis programs from the Worldwide LHC Computing Grid, in particular large-scale simulations of the ATLAS experiment at the CERN LHC, on current general purpose High Performance Computing (HPC) systems. An approach for integrating HPC systems into the Grid is proposed, which has been implemented and tested on the „Todi” HPC machine at the Swiss National Supercomputing Centre (CSCS). Over the course of the test, more than 500000 CPU-hours of processing time have been provided to ATLAS, which is roughly equivalent to the combined computing power of the two ATLAS clusters at the University of Bern. This showed that current HPC systems can be used to efficiently run large-scale simulations of the ATLAS detector and of the detected physics processes. As a first conclusion of my work, one can argue that, in perspective, running large-scale tasks on a few large machines might be more cost-effective than running on relatively small dedicated com...

  3. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  4. FOREIGN AND DOMESTIC EXPERIENCE OF INTEGRATING CLOUD COMPUTING INTO PEDAGOGICAL PROCESS OF HIGHER EDUCATIONAL ESTABLISHMENTS

    Directory of Open Access Journals (Sweden)

    Nataliia A. Khmil

    2016-01-01

    Full Text Available In the present article, foreign and domestic experience of integrating cloud computing into the pedagogical process of higher educational establishments (H.E.E.) has been generalized. It has been stated that nowadays a lot of educational services are hosted in the cloud, e.g. infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). The peculiarities of implementing cloud technologies at H.E.E. in Ukraine and abroad have been singled out; the products developed by the leading IT companies for using cloud computing in higher education, such as Microsoft for Education, Google Apps for Education and Amazon AWS Educate, have been reviewed. Examples of concrete types, methods and forms of learning and research work based on cloud services have been provided.

  5. In the land of the dinosaurs, how to survive experience with building of midrange computing cluster

    International Nuclear Information System (INIS)

    Chevel, A.E.; Lauret, J.

    2001-01-01

    The authors discuss how to put into operation a midrange computing cluster for the Nuclear Chemistry Group (NCG) of the State University of New York at Stony Brook (SUNY-SB). The NCG is one of the collaborators within the RHIC/PHENIX experiment located at Brookhaven National Laboratory (BNL). The PHENIX detector system produces about half a PB (or 500 TB) of data a year, and our goal was to provide this remote collaborating facility with the means to be part of the analysis process. The computing installation was put into operation at the beginning of the year 2000. The cluster consists of 32 peripheral machines running under Linux and a central Alpha 4100 server under Digital Unix 4.0f (formally Tru64 UNIX). The realization process is discussed.

  6. [Computer-based organization and documentation in orthopedics. A 5-year experience].

    Science.gov (United States)

    Hess, T; Deimel, D; Fischer, R; Duchow, J

    1999-03-01

    In the orthopedic department of the University Hospital Homburg/Saar, we have used a computer-based system for clinic organization and documentation of operations since 1993. The hardware consists of DOS/Windows PCs in a Novell network. Our software is a combination of a database system for managing patient data and a special coding program for ICD and IKPM codes. Our experience shows that computer-assisted clinic management is an effective tool to help the surgeon in planning and documentation. To date, we have used the system for 31,500 patients and 8,500 operations. Flexible software can meet the requirements of both the surgeons and the administration. Moreover, in the University Hospital Homburg/Saar, the different departments are linked by an intranet with connections to other scientific networks and the Internet.

  7. Computer-aided analysis for the Mechanics of Granular Materials (MGM) experiment, part 2

    Science.gov (United States)

    Parker, Joey K.

    1987-01-01

    Computer vision based analysis for the MGM experiment is continued and expanded into new areas. Volumetric strains of granular material triaxial test specimens have been measured from digitized images. A computer-assisted procedure is used to identify the edges of the specimen, and the edges are used in a 3-D model to estimate specimen volume. The results of this technique compare favorably to conventional measurements. A simplified model of the magnification caused by refraction of light within the water of the test apparatus was also developed. This model yields good results when the distance between the camera and the test specimen is large compared to the specimen height. An algorithm for a more accurate 3-D magnification correction is also presented. The use of composite and RGB (red-green-blue) color cameras is discussed, and potentially significant benefits from using an RGB camera are presented.

  8. Cloud Computing Technologies in Writing Class: Factors Influencing Students’ Learning Experience

    Directory of Open Access Journals (Sweden)

    Jenny WANG

    2017-07-01

    The proposed interactive online group within cloud computing technologies, the main contribution of this paper, provides easy and simple access to a cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and teacher in after-class group writing assignment activities. The study therefore addresses the implementation of one of the most commonly used cloud applications, Google Docs, in a higher education course. The learning environment in which students use Google Docs to develop and deploy writing assignments between classes was subjected to a learning experience assessment. Based on a questionnaire administered to the participants (n=28), the system provided an effective learning environment between classes for the students and the instructor to stay connected. Factors influencing students' learning experience with cloud applications include the frequency of online interaction and students' technology experience. Suggestions for coping with challenges regarding their use in higher education, including technical issues, are also presented. Educators are therefore encouraged to embrace cloud computing technologies as they design course curricula in the hope of effectively enriching students' learning.

  9. Structure and Thermodynamics of Carbon Dioxide Sorption in Silica Pores from Experiments and Computer Models

    Science.gov (United States)

    Vlcek, L.; Rother, G.; Chialvo, A.; Cole, D. R.

    2011-12-01

    Injection of CO2 into geologic formations has been proposed as a key element to reduce the impact of greenhouse gas emissions. Quantitative understanding of CO2 adsorption in porous mineral environments at thermodynamic conditions relevant to proposed sequestration sites is thus a prerequisite for the assessment of their viability. In this study we use a combination of neutron scattering, adsorption experiments, and computer modeling to investigate the thermodynamics of near-critical carbon dioxide in the pores of SiO2 aerogel, which serves as a model of a high-porosity reservoir rock. Small angle neutron scattering (SANS) experiments provide input for the optimization of the computer model of the aerogel matrix, and also serve as a sensitive probe of local density changes of confined CO2 as a function of external pressure. Additional details of the aerogel basic building blocks and SiO2 surface are derived from TEM images. An independent source of global adsorption data is obtained from gravimetric experiments. The structural and thermodynamic aspects of CO2 sorption are linked using computer simulations, which include the application of the optimized diffusion-limited cluster-cluster aggregation (DLCA) algorithm, classical density functional theory (DFT) modeling of large-scale CO2 density profiles, and molecular dynamics simulations of the details of interactions between CO2 molecules and the amorphous silica surfaces. This integrated approach allows us to span scales ranging from 1Å to 1μm, as well as to infer the detailed structure of silica threads forming the framework of the silica matrix.

  10. Emergent Power-Law Phase in the 2D Heisenberg Windmill Antiferromagnet: A Computational Experiment

    Science.gov (United States)

    Jeevanesan, Bhilahari; Chandra, Premala; Coleman, Piers; Orth, Peter P.

    2015-10-01

    In an extensive computational experiment, we test Polyakov's conjecture that under certain circumstances an isotropic Heisenberg model can develop algebraic spin correlations. We demonstrate the emergence of a multispin U(1) order parameter in a Heisenberg antiferromagnet on interpenetrating honeycomb and triangular lattices. The correlations of this relative phase angle are observed to decay algebraically at intermediate temperatures in an extended critical phase. Using finite-size scaling we show that both phase transitions are of the Berezinskii-Kosterlitz-Thouless type, and at lower temperatures we find long-range Z6 order.

  11. Realization of a computing experiment in the distance learning course "Basis of Algorithmization and Programming".

    Directory of Open Access Journals (Sweden)

    A. Spivakovsky

    2010-06-01

    The article briefly describes the features of the IDE for the course "Basis of Algorithmization and Programming" developed by the laboratory of integrated learning environments of RI IT at Kherson State University. As part of the topic "The computational experiment", examples are given that analyze the efficiency of sorting algorithms on arrays of different lengths; the algorithms' operation is visualized by means of the "Demonstration Environment" module, and several ways of determining execution-time complexity are also demonstrated.
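    The kind of computational experiment the course describes can be reconstructed in a few lines. The sketch below is illustrative only, not the course's "Demonstration Environment" module: it times insertion sort on arrays of different lengths, making the quadratic growth in execution time observable.

```python
import random
import time

def insertion_sort(a):
    """Textbook O(n^2) insertion sort, in place."""
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

for n in (500, 1000, 2000, 4000):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    insertion_sort(data)
    print(f"n={n:5d}  insertion sort took {time.perf_counter() - t0:.3f} s")
# doubling n roughly quadruples the time: the empirical signature of O(n^2)
```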

  12. The model of localized business community economic development under limited financial resources: computer model and experiment

    Directory of Open Access Journals (Sweden)

    Berg Dmitry

    2016-01-01

    Globalization processes now affect and are affected by most organizations, many types of resources, and the natural environment. One of the main restrictions initiated by these processes is financial: money turnover in global markets leads to its concentration in certain financial centers, while local business communities suffer from a lack of money. This work discusses the advantages of introducing a complementary currency into a local economy. Computer simulation with the developed program model, together with a real economic experiment, demonstrated that the complementary currency does not compete with the traditional currency; rather, it acts in compliance with it, providing conditions for sustainable business community development.
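    A toy version of the money-scarcity mechanism can be simulated directly. The sketch below uses entirely hypothetical parameters, with a simple mutual-credit rule standing in for the complementary currency; it is not the paper's program model.

```python
import random

def run_trades(n_agents=50, n_trades=2000, cash=5, credit_limit=0, seed=7):
    """Count trades that complete when buyers may go down to -credit_limit."""
    rng = random.Random(seed)
    balance = [cash] * n_agents
    completed = 0
    for _ in range(n_trades):
        buyer, seller = rng.sample(range(n_agents), 2)
        price = rng.randint(1, 10)
        if balance[buyer] - price >= -credit_limit:   # enough money or credit?
            balance[buyer] -= price
            balance[seller] += price
            completed += 1
    return completed

print("national currency only  :", run_trades(credit_limit=0))
print("with complementary credit:", run_trades(credit_limit=20))
```

    With a credit line, trades that would fail for lack of cash still complete, which is the restriction-lifting effect the abstract describes.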

  13. A simple computational model for the analysis of 2-D solute migration experiments

    International Nuclear Information System (INIS)

    Villar, Heldio Pereira

    1996-01-01

    A preliminary model for the simulation of 2-D migration patterns is presented. This computer model adopts a novel approach to the solution of the advection-dispersion equation in two dimensions through finite differences. The soil column is divided into a number of thin columns. The 1-D advection-dispersion equation is applied in the direction of flow and, using the same time increment, the 1-D diffusion equation is applied perpendicularly to the flow. The results thus obtained were compared to those of two migration experiments with two different soils. (author)
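    A minimal sketch of the splitting scheme as described, with explicit finite differences and illustrative parameter values (not those of the reported experiments):

```python
import numpy as np

# grid: x along the flow, y across it (illustrative values, not from the paper)
nx, ny, dx, dy, dt = 100, 40, 0.01, 0.01, 1e-4
v, DL, DT = 1.0, 1e-3, 1e-4    # pore velocity, longitudinal/transverse dispersion
C = np.zeros((nx, ny))
C[0, ny // 2] = 1.0            # point source at the inlet

for _ in range(500):
    # step 1: 1-D advection-dispersion along the flow direction (x)
    adv = -v * (C - np.roll(C, 1, axis=0)) / dx                     # upwind advection
    disp = DL * (np.roll(C, -1, axis=0) - 2 * C + np.roll(C, 1, axis=0)) / dx**2
    C = C + dt * (adv + disp)
    # step 2: 1-D diffusion perpendicular to the flow (y), same time increment
    C = C + dt * DT * (np.roll(C, -1, axis=1) - 2 * C + np.roll(C, 1, axis=1)) / dy**2
    # (np.roll wraps the boundaries, which is acceptable for this short-time sketch)

print(f"total mass: {C.sum():.4f}")   # approximately conserved
```

    Each time step thus applies the 1-D advection-dispersion operator along the flow and the 1-D diffusion operator across it, which is the operator-splitting idea the abstract describes.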

  14. A distributed, graphical user interface based, computer control system for atomic physics experiments.

    Science.gov (United States)

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.
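    The buffer-shortening argument can be made concrete with a little arithmetic: a fixed-rate clock needs one output sample per tick for the entire sequence, whereas a variable-frequency clock needs samples only at state changes. The sketch below is schematic and unrelated to the actual National Instruments interface.

```python
# a digital control sequence: (time in seconds, output state)
events = [(0.0, 0), (1.0, 1), (1.000001, 0), (5.0, 1)]   # long waits plus a 1 us pulse

# fixed 10 MHz clock: one buffer entry per tick over the whole 5 s sequence,
# because the output card consumes one sample on every clock edge
fixed_rate = 10_000_000
fixed_buffer_len = int(5.0 * fixed_rate)

# variable-frequency clock: one buffer entry per state change only
variable_buffer = [(state, t) for t, state in events]

print(f"fixed-rate buffer     : {fixed_buffer_len:,} samples")
print(f"variable-clock buffer : {len(variable_buffer)} samples")
```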

  15. Adsorption of probe molecules in pillared interlayered clays: Experiment and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, A., E-mail: a.gallardo@iqfr.csic.es; Guil, J. M.; Lomba, E.; Almarza, N. G.; Khatib, S. J. [Instituto de Química Física Rocasolano, CSIC, Serrano 119, E-28006 Madrid (Spain); Cabrillo, C.; Sanz, A. [Instituto de Estructura de la Materia, CSIC, Serrano 123, E-28006 Madrid (Spain); Pires, J. [Centro de Química e Bioquímica da Faculdade de Ciências, Universidade de Lisboa, 1749-016 Lisboa (Portugal)

    2014-06-14

    In this paper we investigate the adsorption of various probe molecules in order to characterize the porous structure of a series of pillared interlayered clays (PILC). To that aim, volumetric and microcalorimetric adsorption experiments were performed on various Zr PILC samples using nitrogen, toluene, and mesitylene as probe molecules. For one of the samples, neutron scattering experiments were also performed using toluene as adsorbate. Various structural models are proposed and tested by means of a comprehensive computer simulation study, using both geometric and percolation analysis in combination with Grand Canonical Monte Carlo simulations in order to model the volumetric and microcalorimetric isotherms. On the basis of this analysis, we propose a series of structural models that aim at accounting for the adsorption experimental behavior, and make possible a microscopic interpretation of the role played by the different interactions and steric effects in the adsorption processes in these rather complex disordered microporous systems.

  16. Investigation of the Feasibility of Utilizing Gamma Emission Computed Tomography in Evaluating Fission Product Migration in Irradiated TRISO Fuel Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jason M. Harp; Paul A. Demkowicz

    2014-10-01

    In the High Temperature Gas-Cooled Reactor (HTGR), the TRISO particle fuel serves as the primary fission product containment. However, the large number of TRISO particles present in proposed HTGRs dictates that there will be a small fraction (~10^-4 to 10^-5) of as-manufactured and in-pile particle failures that will lead to some fission product release. The matrix material surrounding the TRISO particles in fuel compacts and the structural graphite holding the TRISO particles in place can also serve as sinks for containing any released fission products. However, data on the migration of solid fission products through these materials are lacking. One of the primary goals of the AGR-3/4 experiment is to study fission product migration from failed TRISO particles in prototypic HTGR components such as structural graphite and compact matrix material. In this work, the potential for a Gamma Emission Computed Tomography (GECT) technique to non-destructively examine the fission product distribution in AGR-3/4 components and other irradiation experiments is explored. Specifically, the feasibility of using the Idaho National Laboratory (INL) Hot Fuels Examination Facility (HFEF) Precision Gamma Scanner (PGS) system for this GECT application is considered. To test the feasibility, the response of the PGS system to idealized fission product distributions has been simulated using Monte Carlo radiation transport simulations. Previous work that applied similar techniques during the AGR-1 experiment is also discussed, as well as planned uses of the GECT technique during the post-irradiation examination of the AGR-2 experiment. The GECT technique has also been applied to other irradiated nuclear fuel systems currently available in the HFEF hot cell, including oxide fuel pins, metallic fuel pins, and monolithic plate fuel.

  17. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    International Nuclear Information System (INIS)

    Sartori, E.

    2001-01-01

    When studying applications in nuclear technology, we need to understand and be able to predict the behavior of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the results of the interplay of a large number of different basic events, i.e. the macroscopic effects. In order to build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: the basic nuclear or chemical data; the computer codes; and the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available, and the roles that the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC), and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers. (author)

  18. Bayesian model calibration of computational models in velocimetry diagnosed dynamic compression experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hund, Lauren [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    Dynamic compression experiments are being performed on complicated materials using increasingly complex drivers. The data produced in these experiments are beginning to reach a regime where traditional analysis techniques break down, requiring the solution of an inverse problem. A common measurement in dynamic experiments is an interface velocity as a function of time, and often this functional output can be simulated using a hydrodynamics code. Bayesian model calibration is a statistical framework to estimate inputs into a computational model in the presence of multiple uncertainties, making it well suited to measurements of this type. In this article, we apply Bayesian model calibration to high pressure (250 GPa) ramp compression measurements in tantalum. We address several issues specific to this calibration including the functional nature of the output as well as parameter and model discrepancy identifiability. Specifically, we propose scaling the likelihood function by an effective sample size rather than modeling the autocorrelation function to accommodate the functional output and propose sensitivity analyses using the notion of 'modularization' to assess the impact of experiment-specific nuisance input parameters on estimates of material properties. We conclude that the proposed Bayesian model calibration procedure results in simple, fast, and valid inferences on the equation of state parameters for tantalum.
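    The likelihood-scaling idea can be sketched numerically: estimate an effective sample size from the integrated autocorrelation of the residuals, then temper the Gaussian log-likelihood by n_eff/n. This is an illustrative reconstruction, not the authors' code.

```python
import numpy as np

def effective_sample_size(r):
    """n_eff = n / (1 + 2 * sum of positive-lag autocorrelations)."""
    n = len(r)
    r = r - r.mean()
    acf = np.correlate(r, r, mode="full")[n - 1:] / (r.var() * n)
    s = 0.0
    for rho in acf[1:]:
        if rho < 0:                 # truncate at the first negative lag
            break
        s += rho
    return n / (1 + 2 * s)

def scaled_gaussian_loglik(resid, sigma):
    """Independent-Gaussian log-likelihood, tempered by n_eff / n."""
    n = len(resid)
    loglik = -0.5 * np.sum(resid**2) / sigma**2 - n * np.log(sigma)
    return loglik * effective_sample_size(resid) / n

resid = 0.01 * np.cumsum(np.random.default_rng(0).normal(size=500))  # correlated
print(scaled_gaussian_loglik(resid, sigma=0.5))
```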

  19. Results from the First Two Flights of the Static Computer Memory Integrity Testing Experiment

    Science.gov (United States)

    Hancock, Thomas M., III

    1999-01-01

    This paper details the scientific objectives, experiment design, data collection method, and post-flight analysis following the first two flights of the Static Computer Memory Integrity Testing (SCMIT) experiment. SCMIT is designed to detect soft-event upsets in passive magnetic memory. A soft-event upset is a change in the logic state of active or passive forms of magnetic memory, commonly referred to as a "bitflip". In its mildest form, a soft-event upset can cause software exceptions, unexpected events, spacecraft safing (ending data collection), or corrupted fault protection and error recovery capabilities. In its most severe form, loss of the mission or spacecraft can occur. Analysis after the first flight (in 1991 during STS-40) identified possible soft-event upsets in 25% of the experiment detectors. Post-flight analysis after the second flight (in 1997 on STS-87) failed to find any evidence of soft-event upsets. The SCMIT experiment is currently scheduled for a third flight in December 1999 on STS-101.
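    The detection principle is simple to state in code: write a known pattern before flight and compare it bit by bit after recovery. The sketch below is schematic (SCMIT used passive magnetic memory, not byte arrays in software).

```python
def count_bitflips(before: bytes, after: bytes):
    """Return (byte index, bit index) pairs where the stored pattern changed."""
    flips = []
    for i, (b, a) in enumerate(zip(before, after)):
        diff = b ^ a                        # XOR exposes the changed bits
        for bit in range(8):
            if diff & (1 << bit):
                flips.append((i, bit))
    return flips

pattern = bytes([0xAA] * 1024)              # 10101010... known test pattern
readback = bytearray(pattern)
readback[100] ^= 0x04                       # simulate one soft-event upset
print(count_bitflips(pattern, bytes(readback)))   # -> [(100, 2)]
```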

  20. Networks of conscious experience: computational neuroscience in understanding life, death, and consciousness.

    Science.gov (United States)

    Leisman, Gerry; Koch, Paul

    2009-01-01

    We demonstrate brain locations that appear to correlate with consciousness without being directly responsible for it. Technology reveals that brain activity is associated with consciousness but is not equivalent to it. We examine how consciousness occurs at critical levels of complexity. Conventional explanations portray consciousness as an emergent property of classical computer-like activities in the brain's neural networks. Prevailing views in this camp are that patterns of neural network activities correlate with mental states, that synchronous network oscillations in the thalamus and cerebral cortex temporally bind information, and that consciousness emerges as a novel property of computational complexity among neurons. A hard-wired theory is enigmatic for explaining consciousness because the nature of subjective experience, or 'qualia' ('inner life'), is a "hard problem" to understand; binding spatially distributed brain activity into unitary objects and a coherent sense of self, or 'oneness', is difficult to explain, as is the transition from pre-conscious to conscious states. Consciousness is non-computable and involves factors that are neither random nor algorithmic; consciousness cannot be simulated. Explanations are also needed for free will and for subjective time flow. Convention argues that neurons and their chemical synapses are the fundamental units of information in the brain, and that conscious experience emerges when a critical level of complexity is reached in the brain's neural networks. The basic idea is that the mind is a computer functioning in the brain. In fitting the brain to a computational view, such explanations omit incompatible neurophysiological details, including widespread apparent randomness at all levels of neural processes (is it really noise, or underlying levels of complexity?); glial cells (which account for some 80% of the brain); dendritic-dendritic processing; electrotonic gap junctions; cytoplasmic/cytoskeletal activities; living

  1. Interpolation Environment of Tensor Mathematics at the Corpuscular Stage of Computational Experiments in Hydromechanics

    Science.gov (United States)

    Bogdanov, Alexander; Degtyarev, Alexander; Khramushin, Vasily; Shichkina, Yulia

    2018-02-01

    The stages of direct computational experiments in hydromechanics based on tensor mathematics tools are represented by conditionally independent mathematical models, separating the calculations in accordance with the physical processes. The continual stage of numerical modeling is constructed on a small time interval in a stationary grid space; here, continuity conditions and energy conservation are coordinated. Then, at the subsequent corpuscular stage of the computational experiment, the kinematic parameters of mass centers and the surface stresses at the boundaries of the grid cells are used to model the free unsteady motions of volume cells, which are treated as independent particles. These particles can be subject to vortex and discontinuous interactions when restructuring of free boundaries and internal rheological states takes place. The transition from one stage to the other is provided by the interpolation operations of tensor mathematics. This interpolation environment formalizes the use of physical laws in the modeling of continuous media and provides control of the rheological state and of the conditions for the existence of discontinuous solutions: rigid and free boundaries, vortex layers, and their turbulent or empirical generalizations.

  2. Visuospatial skills and computer game experience influence the performance of virtual endoscopy.

    Science.gov (United States)

    Enochsson, Lars; Isaksson, Bengt; Tour, René; Kjellin, Ann; Hedman, Leif; Wredmark, Torsten; Tsai-Felländer, Li

    2004-11-01

    Advanced medical simulators have been introduced to facilitate surgical and endoscopic training and thereby improve patient safety. Residents trained in the Procedicus Minimally Invasive Surgical Trainer-Virtual Reality (MIST-VR) laparoscopic simulator perform laparoscopic cholecystectomy more safely and faster than a control group. Little has been reported on whether factors such as gender, computer experience, and visuospatial tests can predict performance with a medical simulator. Our aim was to investigate whether such factors influence the performance of simulated gastroscopy. Seventeen medical students were asked about their computer gaming experience. Before virtual endoscopy, they performed the visuospatial test PicSOr, which discriminates the ability of the tested person to create a three-dimensional image from a two-dimensional presentation. Each student performed one gastroscopy (level 1, case 1) in the GI Mentor II simulator (Simbionix), and several variables related to performance were registered. The percentage of time spent with a clear view in the endoscope correlated well with performance on the PicSOr test (r = 0.56). Experience of computer games also seems to affect the outcome.

  3. A Bayesian Approach to the Design and Analysis of Computer Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Currin, C.

    1988-01-01

    We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors, and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of "sites" (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of a thermal energy storage device and another on an integrated circuit simulator.
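    A minimal numerical sketch of the approach, with an illustrative squared-exponential covariance and a toy "computer code" (neither taken from the report):

```python
import numpy as np

def k(a, b, length=0.3):
    """Stationary squared-exponential covariance."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

# outputs of an "expensive code" evaluated at a few design sites
X = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.sin(2 * np.pi * X)                   # stand-in for the computer model

Xs = np.linspace(0.0, 1.0, 9)               # prediction sites
K = k(X, X) + 1e-10 * np.eye(len(X))        # jitter for numerical stability
Ks = k(Xs, X)
mean = Ks @ np.linalg.solve(K, y)           # posterior mean interpolates the data
cov = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # zero at the design sites

for x, m, s in zip(Xs, mean, std):
    print(f"x={x:.3f}  mean={m:+.3f}  sd={s:.3f}")
```

    The posterior standard deviation vanishes at the design sites, reflecting that the code is deterministic; the entropy criterion then favors designs that are most informative about y at the prediction sites T*.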

  4. Unraveling the electrolyte properties of Na3SbS4 through computation and experiment

    Science.gov (United States)

    Rush, Larry E.; Hood, Zachary D.; Holzwarth, N. A. W.

    2017-12-01

    Solid-state sodium electrolytes are expected to improve next-generation batteries on the basis of favorable energy density and reduced cost. Na3SbS4 represents a new solid-state ion conductor with high ionic conductivities in the mS/cm range. Here, we explore the tetragonal phase of Na3SbS4 and its interface with a metallic sodium anode using a combination of experiments and first-principles calculations. The computed Na-ion vacancy migration energies of 0.1 eV are smaller than the value inferred from experiment, suggesting that grain boundaries or other factors dominate the experimental systems. Analysis of symmetric cells of the electrolyte (Na/Na3SbS4/Na) shows that a conductive solid electrolyte interphase forms. Computer simulations infer that the interface is likely to be related to Na3SbS3, involving the conversion of the tetrahedral SbS4^3- ions of the bulk electrolyte into trigonal pyramidal SbS3^3- ions at the interface.

  5. Research and Teaching: Computational Methods in General Chemistry--Perceptions of Programming, Prior Experience, and Student Outcomes

    Science.gov (United States)

    Wheeler, Lindsay B.; Chiu, Jennie L.; Grisham, Charles M.

    2016-01-01

    This article explores how integrating computational tools into a general chemistry laboratory course can influence student perceptions of programming and investigates relationships among student perceptions, prior experience, and student outcomes.

  6. The Influence of Trainee Gaming Experience and Computer Self-Efficacy on Learner Outcomes of Videogame-Based Learning Environments

    National Research Council Canada - National Science Library

    Orvis, Karin A; Orvis, Kara L; Belanich, James; Mullin, Laura N

    2005-01-01

    .... The purpose of the current research was to investigate the influence of two trainee characteristics, prior videogame experience and computer self-efficacy, on learner outcomes of a videogame-based training environment...

  7. Experiment Dashboard - a generic, scalable solution for monitoring of the LHC computing activities, distributed sites and services

    International Nuclear Information System (INIS)

    Andreeva, J; Cinquilli, M; Dieguez, D; Dzhunov, I; Karavakis, E; Karhula, P; Kenyon, M; Kokoszkiewicz, L; Nowotka, M; Ro, G; Saiz, P; Tuckett, D; Sargsyan, L; Schovancova, J

    2012-01-01

    The Experiment Dashboard system provides common solutions for monitoring job processing, data transfers and site/service usability. Over the last seven years, it has proved to play a crucial role in the monitoring of the LHC computing activities, distributed sites and services. It has been one of the key elements during the commissioning of the distributed computing systems of the LHC experiments. The first years of data taking represented a serious test for the Experiment Dashboard in terms of functionality, scalability and performance. Given that the usage of the Experiment Dashboard applications has been steadily increasing over time, it can be asserted that all the objectives were fully accomplished.

  8. Engaging Women in Computer Science and Engineering: Promising Practices for Promoting Gender Equity in Undergraduate Research Experiences

    Science.gov (United States)

    Kim, Karen A.; Fann, Amy J.; Misa-Escalante, Kimberly O.

    2011-01-01

    Building on research that identifies and addresses issues of women's underrepresentation in computing, this article describes promising practices in undergraduate research experiences that promote women's long-term interest in computer science and engineering. Specifically, this article explores whether and how REU programs include programmatic…

  9. More Ideas for Monitoring Biological Experiments with the BBC Computer: Absorption Spectra, Yeast Growth, Enzyme Reactions and Animal Behaviour.

    Science.gov (United States)

    Openshaw, Peter

    1988-01-01

    Presented are five ideas for A-level biology experiments using a laboratory computer interface. Topics investigated include photosynthesis, yeast growth, animal movements, pulse rates, and oxygen consumption and production by organisms. Includes instructions specific to the BBC computer system. (CW)

  10. Teachers' Experiences of Using Eye Gaze-Controlled Computers for Pupils with Severe Motor Impairments and without Speech

    Science.gov (United States)

    Rytterström, Patrik; Borgestig, Maria; Hemmingsson, Helena

    2016-01-01

    The purpose of this study is to explore teachers' experiences of using eye gaze-controlled computers with pupils with severe disabilities. Technology to control a computer with eye gaze is a fast growing field and has promising implications for people with severe disabilities. This is a new assistive technology and a new learning situation for…

  11. Possibility of Computer Experiment in Study of Animal Spermatozoa Heterogeneity

    Directory of Open Access Journals (Sweden)

    L. V. Gorbunov, Y. M. Mazharova

    2016-04-01

    A simulation model for evaluating the survival and fertilizing capacity of animal spermatozoa was developed, taking into account the initial condition of the sperm and the effectiveness of the cryopreservation stages. The model is based on an analytical expression that reflects the main determinants of reproductive cell survival in onto-, techno- and phylogenesis. The decrease in spermatozoa resistance depends on a number of biological factors (the animal species, the physiological condition of sperm donor and recipient, the ejaculate quality) and technological factors (the effectiveness of the methods of cell cryopreservation and egg insemination). The discrepancy between the cell motility results obtained by calculation and by experiment amounted to less than 2% in our own experiments and less than 5% for data taken from the literature. A feature of the model is the complete independence of the effectiveness of the studied techniques from the heterogeneity of the animal sperm. The computer experiment showed that the difference between the values of initial motility and fertilizing capacity of sperm varies from 50 to 100% depending on the difference in biological parameters, while the index of the effectiveness of the selected technique introduces an error of about 1%. Comparative analysis of alternative technologies of spermatozoa cryopreservation identified the stages with the greatest impact on efficiency: cryoprotectant use, freezing mode, and the survival and fertilizing capacity of the object. The use of computer modeling makes it possible to greatly reduce the spread in spermatozoa preservation values obtained in different experiments, and thus to reduce the time and costs needed to obtain reliable results.

  12. Robotics as an integration subject in computer science university studies. The experience of the University of Almeria

    Directory of Open Access Journals (Sweden)

    Manuela Berenguel Soria

    2012-11-01

    This work presents a global view of the role of robotics in computer science studies, mainly in university degrees. The main motivations for the use of robotics in these studies are the following: robotics makes it possible to put into practice many fundamental computer science topics; it is a multidisciplinary area which completes the basic knowledge of any computer science student; it facilitates the practice and learning of basic competences of any engineer (for instance, teamwork); and there is a wide market looking for people with robotics knowledge. These ideas are discussed from our own experience at the University of Almeria, acquired through the studies of Computer Science Technical Engineering, Computer Science Engineering, the Computer Science Degree and the Computer Science Postgraduate programme.

  13. Computer-assisted comparison of analysis and test results in transportation experiments

    International Nuclear Information System (INIS)

    Knight, R.D.; Ammerman, D.J.; Koski, J.A.

    1998-01-01

    As a part of its ongoing research efforts, Sandia National Laboratories' Transportation Surety Center investigates the integrity of various containment methods for hazardous materials transport, subject to anomalous structural and thermal events such as free-fall impacts, collisions, and fires in both open and confined areas. Since it is not possible to conduct field experiments for every set of possible conditions under which an actual transportation accident might occur, accurate modeling methods must be developed which will yield reliable simulations of the effects of accident events under various scenarios. This requires computer software which is capable of assimilating and processing data from experiments performed as benchmarks, as well as data obtained from numerical models that simulate the experiment. Software tools which can present all of these results in a meaningful and useful way to the analyst are a critical aspect of this process. The purpose of this work is to provide software resources on a long term basis, and to ensure that the data visualization capabilities of the Center keep pace with advancing technology. This will provide leverage for its modeling and analysis abilities in a rapidly evolving hardware/software environment

  14. Investigation of Coal-biomass Catalytic Gasification using Experiments, Reaction Kinetics and Computational Fluid Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, Francine [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Agblevor, Foster [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Klein, Michael [Univ. of Delaware, Newark, DE (United States); Sheikhi, Reza [Northeastern Univ., Boston, MA (United States)

    2015-12-31

    A collaborative effort involving experiments, kinetic modeling, and computational fluid dynamics (CFD) was used to understand the co-gasification of coal-biomass mixtures. The overall goal of the work was to determine the key reactive properties of coal-biomass mixed fuels. Sub-bituminous coal was mixed with biomass feedstocks to determine the fluidization and gasification characteristics of hybrid poplar wood, switchgrass and corn stover. It was found that corn stover and poplar wood were the best feedstocks to use with coal. The novel approach of this project was the use of a red mud catalyst to improve gasification and lower gasification temperatures. An important result was the reduced agglomeration of the biomass when using the catalyst. An outcome of this work was the characterization of the chemical kinetics and reaction mechanisms of the co-gasification fuels, and the development of a set of models that can be integrated into other modeling environments. The multiphase flow code MFIX was used to simulate and predict the hydrodynamics and co-gasification, and the results were validated against the experiments. Reaction kinetics modeling was used to develop a smaller set of reactions, tractable for CFD calculations, that represented the experiments. Finally, an efficient tool, MCHARS, was developed and coupled with MFIX to efficiently simulate the complex reaction kinetics.

  15. Scalability Dilemma and Statistic Multiplexed Computing — A Theory and Experiment

    Directory of Open Access Journals (Sweden)

    Justin Yuan Shi

    2017-08-01

    For the last three decades, end-to-end computing paradigms, such as MPI (Message Passing Interface), RPC (Remote Procedure Call) and RMI (Remote Method Invocation), have been the de facto paradigms for distributed and parallel programming. Despite their successes, applications built using these paradigms suffer because the probability of a crash grows in proportion to the application's size. Checkpoint/restore and backup/recovery are the only means to save otherwise lost critical information. The scalability dilemma is the practical challenge that the probability of data loss increases as the application scales in size. The theoretical significance of this practical challenge is that it undermines the fundamental structure of the scientific discovery process and of mission-critical services in production today. In 1997, the direct use of the end-to-end reference model in distributed programming was recognized as a fallacy, and the scalability dilemma was predicted; however, this voice was overrun by the passage of time. Today, rapidly growing volumes of digitized data demand solutions to increasingly critical scalability challenges. Computing architecture scalability, although loosely defined, is now front and center in large-scale computing efforts. Constrained only by the economic law of diminishing returns, this paper proposes a narrow definition of a Scalable Computing Service (SCS). Three scalability tests are also proposed in order to distinguish service architecture flaws from poor application programming. Scalable data-intensive service requires additional treatment; the data storage is therefore assumed reliable in this paper. A single-sided Statistic Multiplexed Computing (SMC) paradigm is proposed, and a UVR (Unidirectional Virtual Ring) SMC architecture is examined under the SCS tests. SMC was designed to circumvent the well-known impossibility of end-to-end paradigms. It relies on the proven statistic multiplexing principle to deliver reliable service

  16. ATLAS Distributed Computing Experience and Performance During the LHC Run-2

    Science.gov (United States)

    Filipčič, A.; ATLAS Collaboration

    2017-10-01

    ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic CPU and data intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of the new model was demonstrated through the delivery of analysis datasets to users just one week after data taking, by completing the calibration loop, Tier-0 processing and train production steps promptly. The great flexibility of the new system also makes it possible to execute part of the Tier-0 processing on the grid when Tier-0 resources experience a backlog during high data-taking periods. The introduction of the data lifetime model, where each dataset is assigned a finite lifetime (with extensions possible for frequently accessed data), was made possible by Rucio. Thanks to this the storage crises experienced in Run-1 have not reappeared during Run-2. In addition, the distinction between Tier-1 and Tier-2 disk storage, now largely artificial given the quality of Tier-2 resources and their networking, has been removed through the introduction of dynamic ATLAS clouds that group the storage endpoint nucleus and its close-by execution satellite sites. All stable

  17. From mainframe to Web-based: 30 years of experience in computer-aided instruction of pharmacology.

    Science.gov (United States)

    Kerecsen, Laszlo; Pazdernik, Thomas L

    2002-07-01

    This review describes 30 years of experience at the University of Kansas Medical Center in using computers in the teaching of pharmacology to medical students and other health professionals. The Computer-Assisted Teaching System contains both Computer-Assisted Instruction (CAI) and Computer-Managed Instruction (CMI). The system has evolved from mainframe to microprocessors to the current World Wide Web system. The greatest challenge has been to meet the changes in technologies and teaching approaches. The system has been well received by students and has provided the faculty with the means of providing a novel approach to teaching pharmacology.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  19. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  20. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Science.gov (United States)

    Jongeneel, C Victor; Achinike-Oduaran, Ovokeraye; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Akanle, Bola; Aron, Shaun; Ashano, Efejiro; Bendou, Hocine; Botha, Gerrit; Chimusa, Emile; Choudhury, Ananyo; Donthu, Ravikiran; Drnevich, Jenny; Falola, Oluwadamila; Fields, Christopher J; Hazelhurst, Scott; Hendry, Liesl; Isewon, Itunuoluwa; Khetani, Radhika S; Kumuthini, Judit; Kimuda, Magambo Phillip; Magosi, Lerato; Mainzer, Liudmila Sergeevna; Maslamoney, Suresh; Mbiyavanga, Mamana; Meintjes, Ayton; Mugutso, Danny; Mpangase, Phelelani; Munthali, Richard; Nembaware, Victoria; Ndhlovu, Andrew; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Panji, Sumir; Pillay, Venesa; Rendon, Gloria; Sengupta, Dhriti; Mulder, Nicola

    2017-06-01

    The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  1. Experiments and computations on coaxial swirling jets with centerbody in an axisymmetric combustor

    International Nuclear Information System (INIS)

    Chao, Y.C.; Ho, W.C.; Lin, S.K.

    1987-01-01

    Experiments and computations of turbulent, confined, coannular swirling flows have been performed in a model combustor. Numerical results are obtained by means of a revised two-equation model of turbulence. The combustor consists of two confined, concentric, swirling jets and a centerbody at the center of the inlet. Results are reported for cold flow conditions under co- and counter-swirl. The numerical results agree with the experimental data under both conditions. The size of the central recirculation zone is dominated by the strength of the outer swirl. A two-cell recirculation zone may be formed due to the presence of the swirler hub. The mechanism of interaction between the separation bubble at the hub of the swirler and the central recirculation zone due to vortex breakdown is also investigated. 18 references

  2. Structure and dynamics of gas phase ions: Interplay between experiments and computations in IRMPD spectroscopy

    Science.gov (United States)

    Coletti, Cecilia; Corinti, Davide; Paciotti, Roberto; Re, Nazzareno; Crestoni, Maria Elisa; Fornarini, Simonetta

    2017-11-01

    The investigation of the molecular structure and dynamics of ions in the gas phase is a topic of increasing interest, due to the role such species play in many areas of chemistry and physics, not to mention that they often represent elusive intermediates in more complex reaction mechanisms. Infrared Multiple Photon Dissociation (IRMPD) spectroscopy is today one of the most advanced techniques for this purpose, because of its high sensitivity to even small structural changes. The interpretation of IRMPD spectra strongly relies on high-level quantum mechanical computations, so that a close interplay between the two is needed for a detailed understanding of the structural and kinetic properties that can be gathered from the many applications of this powerful technique. Recent advances in experiment and theory in this field are illustrated here, with emphasis on recent progress in elucidating the mechanism of action of cisplatin, one of the most widely used anticancer drugs.

  3. Structural transformations at the initial stages of fragmentation of plastically deformed polycrystals: A computer experiment

    Science.gov (United States)

    Rybin, V. V.; Perevezentsev, V. N.; Svirina, Yu. V.

    2017-05-01

    Results are presented for a computer experiment comprising concurrent micro-, meso-, and macroscopic studies of the evolution of the dislocation structure in a large domain of a grain (adjacent to one of its junctions) after constant-rate macroplastic deformation to an extent corresponding to the onset of the stage of developed plastic deformation. The types of dislocation-density and dislocation-charge distributions, as well as the amounts and degrees of inhomogeneity of the local plastic deformation, have been analyzed. The type of dislocation rearrangement at the junctions and fractures of high-angle grain boundaries has been established that is responsible for the formation of the first dangling dislocation boundaries, the mesodefects that trigger fragmentation.

  4. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    International Nuclear Information System (INIS)

    Nash, T.

    1989-05-01

    This paper describes the present and expected future development of distributed-memory parallel computers for high energy physics experiments. It covers the use of event-parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXes. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC systems, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point-to-point, rather than bussed, communication will be required. Developments in this direction are described. 6 figs

  5. Inequality measures perform differently in global and local assessments: An exploratory computational experiment

    Science.gov (United States)

    Chiang, Yen-Sheng

    2015-11-01

    Inequality measures are widely used in both academia and the public media to help us understand how incomes and wealth are distributed. They can be used to assess the distribution of a whole society (global inequality) as well as the inequality of actors' referent networks (local inequality). How different is local inequality from global inequality? Formalizing the structure of reference groups as a network, the paper conducted a computational experiment to see how the structure of complex networks influences the difference between global and local inequality as assessed by a selection of inequality measures. It was found that local inequality tends to be higher than global inequality when the population size is large; the network is dense and heterophilously assorted, and the income distribution is less dispersed. The implications of the simulation findings are discussed.
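    The global-versus-local contrast is easy to reproduce computationally. The sketch below compares the Gini coefficient of a whole toy population with the average Gini over small referent networks on a ring; the setup is illustrative and not the paper's experiment.

```python
import random

def gini(xs):
    """Gini coefficient via the mean absolute difference."""
    n, mu = len(xs), sum(xs) / len(xs)
    mad = sum(abs(a - b) for a in xs for b in xs) / n**2
    return mad / (2 * mu)

random.seed(1)
incomes = [random.lognormvariate(0, 0.5) for _ in range(200)]

# global inequality: assessed over the whole population at once
print("global Gini    :", round(gini(incomes), 3))

# local inequality: each actor's referent network = self + 2 ring neighbours
n = len(incomes)
local = [gini([incomes[(i - 1) % n], incomes[i], incomes[(i + 1) % n]])
         for i in range(n)]
print("mean local Gini:", round(sum(local) / n, 3))
```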

  6. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Directory of Open Access Journals (Sweden)

    C Victor Jongeneel

    2017-06-01

    The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  7. Computational experiences with variable modulus, elastic-plastic, and viscoelastic concrete models

    International Nuclear Information System (INIS)

    Anderson, C.A.

    1981-01-01

    Six years ago the Reactor Safety Research Division of the Nuclear Regulatory Commission (NRC) approached the Los Alamos National Laboratory to develop a comprehensive concrete structural analysis code to predict the static and dynamic behavior of Prestressed Concrete Reactor Vessels (PCRVs) that serve as the containment structure of a High-Temperature Gas-Cooled Reactor. The PCRV is a complex concrete structure that must be modeled in three dimensions and possesses other complicating features, such as a steel liner for the reactor cavity and woven cables embedded vertically in the PCRV and wound circumferentially on the outside of the PCRV. The cables, or tendons, are used for prestressing the reactor vessel. In addition to developing the computational capability to predict inelastic three-dimensional concrete structural behavior, the code's response was verified against documented experiments on concrete structural behavior. This code development/verification effort is described

  8. Experiment and computation: a combined approach to study the van der Waals complexes

    Directory of Open Access Journals (Sweden)

    Surin L.A.

    2017-01-01

    A review of recent results on the millimetre-wave spectroscopy of weakly bound van der Waals complexes, mostly those containing H2 and He, is presented. In our work, we compared the experimental spectra to theoretical bound-state results, thus providing a critical test of the quality of the M–H2 and M–He potential energy surfaces (PESs), which are a key issue for reliable computations of the collisional excitation and de-excitation of molecules (M = CO, NH3, H2O) in the dense interstellar medium. The intermolecular interactions with He and H2 also play an important role in the high resolution spectroscopy of helium or para-hydrogen clusters doped with a probe molecule (CO, HCN). Such experiments are aimed at detecting the superfluid response of molecular rotation in He and p-H2 clusters.

  9. Comparing Experiment and Computation of Hypersonic Laminar Boundary Layers with Isolated Roughness

    Science.gov (United States)

    Bathel, Brett F.; Iyer, Prahladh S.; Mahesh, Krishnan; Danehy, Paul M.; Inman, Jennifer A.; Jones, Stephen B.; Johansen, Craig T.

    2014-01-01

    Streamwise velocity profile behavior in a hypersonic laminar boundary layer in the presence of an isolated roughness element is presented for an edge Mach number of 8.2. Two different roughness element types are considered: a 2-mm tall, 4-mm diameter cylinder, and a 2-mm radius hemisphere. Measurements of the streamwise velocity behavior using nitric oxide (NO) planar laser-induced fluorescence (PLIF) molecular tagging velocimetry (MTV) have been performed on a 20-degree wedge model. The top surface of this model acts as a flat-plate and is oriented at 5 degrees with respect to the freestream flow. Computations using direct numerical simulation (DNS) of these flows have been performed and are compared to the measured velocity profiles. Particular attention is given to the characteristics of velocity profiles immediately upstream and downstream of the roughness elements. In these regions, the streamwise flow can experience strong deceleration or acceleration. An analysis in which experimentally measured MTV profile displacements are compared with DNS particle displacements is performed to determine if the assumption of constant velocity over the duration of the MTV measurement is valid. This assumption is typically made when reporting MTV-measured velocity profiles, and may result in significant errors when comparing MTV measurements to computations in regions with strong deceleration or acceleration. The DNS computations with the cylindrical roughness element presented in this paper were performed with and without air injection from a rectangular slot upstream of the cylinder. This was done to determine the extent to which gas seeding in the MTV measurements perturbs the boundary layer flowfield.

  10. Sequential designs for sensitivity analysis of functional inputs in computer experiments

    International Nuclear Information System (INIS)

    Fruth, J.; Roustant, O.; Kuhnt, S.

    2015-01-01

    Computer experiments are nowadays commonly used to analyze industrial processes aimed at achieving a desired outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable; a schematic sketch of the bifurcation step follows the highlights below. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time. - Highlights: • Sensitivity analysis method for functional and scalar inputs is presented. • We focus on the discovery of most influential parts of the functional domain. • We investigate an economical sequential methodology based on piecewise constant functions. • Normalized sensitivity indices are introduced and investigated theoretically. • Successful application to sheet metal forming on two functional inputs
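    The group-screening step at the heart of sequential bifurcation can be sketched for scalar factors as below; treating each constant piece of a discretized functional input as one such factor is, in essence, the extension the paper investigates. The code assumes additive effects of known sign (the classic bifurcation assumption) and is illustrative rather than the authors' method.

```python
def bifurcate(f, lo, hi, idx, threshold=0.1):
    """Recursively locate influential factors of black-box f by group screening."""
    x = lo[:]
    for i in idx:
        x[i] = hi[i]                  # switch the whole group to its high level
    effect = f(x) - f(lo)
    if abs(effect) < threshold:
        return []                     # the whole group is negligible: prune it
    if len(idx) == 1:
        return idx                    # a single influential factor is isolated
    mid = len(idx) // 2
    return (bifurcate(f, lo, hi, idx[:mid], threshold)
            + bifurcate(f, lo, hi, idx[mid:], threshold))

# toy additive model in which only factors 2 and 5 matter
f = lambda x: 3.0 * x[2] + 1.5 * x[5]
print(bifurcate(f, lo=[0.0] * 8, hi=[1.0] * 8, idx=list(range(8))))   # -> [2, 5]
```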

  11. Successful experiences in the application of Concept Maps in Engineering in Computing, Mexico

    Directory of Open Access Journals (Sweden)

    Beatriz Guardian Soto

    2013-02-01

    Full Text Available Today there is an enormous amount of work related to new models and styles of learning and instruction in the field of engineering. In the case of the engineering degree in computing taught at the Mexico National Polytechnic Institute (IPN), there is a working group led by an internationally recognized expert whose successful work is reflected in this text through experiences gained over the last 8 years with students and teachers, thus generating the requirements and tools for the globalised world and the knowledge society in which we find ourselves. Lessons learned are in subjects such as theory of automata (TA), compilers (Cs), analysis of algorithms (AA), networks (R), Artificial Intelligence (AI), computer programming (P), degree project (PT) and strategic planning (PE) mainly, among others, to facilitate the understanding of concepts and applications by the student. Through the teaching strategy using concept maps developed by J. Novak, results have been favorable in dynamism, understanding and the generation of meaningful learning in the long term, providing solid elements for professional practice. Proposals obtained from teachers and exercises developed by teachers and students are listed.

  12. Modelling of Multi-Agent Systems: Experiences with Membrane Computing and Future Challenges

    Directory of Open Access Journals (Sweden)

    Petros Kefalas

    2010-08-01

    Full Text Available Formal modelling of Multi-Agent Systems (MAS) is a challenging task due to high complexity, interaction, parallelism and continuous change of roles and organisation between agents. In this paper we record our research experience on formal modelling of MAS. We review our research throughout the last decade, by describing the problems we have encountered and the decisions we have made towards resolving them and providing solutions. Much of this work involved membrane computing and classes of P Systems, such as Tissue and Population P Systems, targeted to the modelling of MAS whose dynamic structure is a prominent characteristic. More particularly, social insects (such as colonies of ants, bees, etc.), biology-inspired swarms and systems with emergent behaviour are indicative examples for which we developed formal MAS models. Here, we aim to review our work and disseminate our findings to fellow researchers who might face similar challenges and, furthermore, to discuss important issues for advancing research on the application of membrane computing in MAS modelling.

  13. Quantum Information, computation and cryptography. An introductory survey of theory, technology and experiments

    Energy Technology Data Exchange (ETDEWEB)

    Benatti, Fabio [Trieste Univ., Miramare (Italy). Dipt. Fisica Teorica; Fannes, Mark [Leuven Univ. (Belgium). Inst. voor Theoretische Fysica; Floreanini, Roberto [INFN, Trieste (Italy). Dipt. di Fisica Teorica; Petritis, Dimitri (eds.) [Rennes 1 Univ., 35 (France). Inst. de Recherche Mathematique de Rennes

    2010-07-01

    This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five chapters present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation. (orig.)

  14. Experience in nuclear materials accountancy, including the use of computers, in the UKAEA

    International Nuclear Information System (INIS)

    Anderson, A.R.; Adamson, A.S.; Good, P.T.; Terrey, D.R.

    1976-01-01

    The UKAEA have operated systems of nuclear materials accountancy in research and development establishments handling large quantities of material for over 20 years. In the course of that time changing requirements for nuclear materials control and increasing quantities of materials have required that accountancy systems be modified and altered to improve either the fundamental system or manpower utilization. The same accountancy principles are applied throughout the Authority but procedures at the different establishments vary according to the nature of their specific requirements; there is much in the cumulative experience of the UKAEA which could prove of value to other organizations concerned with nuclear materials accountancy or safeguards. This paper reviews the present accountancy system in the UKAEA and summarizes its advantages. Details are given of specific experience and solutions which have been found to overcome difficulties or to strengthen previous weak points. Areas discussed include the use of measurements, the establishment of measurement points (which is relevant to the designation of MBAs), the importance of regular physical stock-taking, and the benefits stemming from the existence of a separate accountancy section independent of operational management at large establishments. Some experience of a dual system of accountancy and criticality control is reported, and the present status of computerization of nuclear material accounts is summarized. Important aspects of the relationship between management systems of accountancy and safeguards' requirements are discussed briefly. (author)

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  16. When STAR meets the Clouds-Virtualization and Cloud Computing Experiences

    International Nuclear Information System (INIS)

    Lauret, J; Hajdu, L; Walker, M; Balewski, J; Goasguen, S; Stout, L; Fenn, M; Keahey, K

    2011-01-01

    In recent years, Cloud computing has become a very attractive paradigm and popular model for accessing distributed resources. The Cloud has emerged as the next big trend. The burst of platforms and projects providing Cloud resources and interfaces, at the very same time that Grid projects are entering a production phase in their life cycle, has however raised the question of the best approach to handling distributed resources. In particular, do Cloud resources scale at the levels shown by Grids? Do they perform at the same level? What is their overhead on the IT teams and infrastructure? Rather than seeing the two as orthogonal, the STAR experiment has viewed them as complementary and has studied merging the best of the two worlds, with Grid middleware providing the aggregation of both Cloud and traditional resources. Since its first use of Cloud resources on Amazon EC2 in 2008/2009 using a Nimbus/EC2 interface, the STAR software team has tested and experimented with many novel approaches: from a traditional, native EC2 approach to the Virtual Organization Cluster (VOC) at Clemson University and Condor/VM on the GLOW resources at the University of Wisconsin. The STAR team is also planning to run as part of the DOE/Magellan project. In this paper, we will present an overview of our findings from using truly opportunistic resources and scaling out two orders of magnitude in both tests and practical usage.

  17. Computer simulation of void formation in residual gas atom free metals by dual beam irradiation experiments

    International Nuclear Information System (INIS)

    Shimomura, Y.; Nishiguchi, R.; La Rubia, T.D. de; Guinan, M.W.

    1992-01-01

    In our recent experiments (1), we found that voids nucleate at vacancy clusters which trap gas atoms such as hydrogen and helium in ion- and neutron-irradiated copper. A molecular dynamics computer simulation, which implements an empirical embedded atom method to calculate the forces that act on atoms in metals, suggests that void nucleation occurs in pure copper at six- and seven-vacancy clusters. The structure of six- and seven-vacancy clusters in copper fluctuates between a stacking fault tetrahedron and a void. When a hydrogen atom is trapped at a six- or seven-vacancy void, the void can keep its structure for an appreciably long time; that is, it does not relax to a stacking fault tetrahedron and can grow into a large void. In order to explore the detailed atomistics of void formation, it is emphasized that dual-beam irradiation experiments that utilize beams of gas atoms and self-ions should be carried out with residual-gas-atom-free metal specimens. (author)

  18. Computational Experiment Approach to Controlled Evolution of Procurement Pattern in Cluster Supply Chain

    Directory of Open Access Journals (Sweden)

    Xiao Xue

    2015-01-01

    Full Text Available Companies have become aware of the benefits of developing Cluster Supply Chains (CSCs), and they are spending a great deal of time and money attempting to develop the new business pattern. Yet the traditional techniques for identifying CSCs have strong theoretical antecedents but seem to have little traction in the field. We believe this is because the standard techniques fail to capture evolution over time and do not provide useful intervention measures for reaching goals. To address these problems, we introduce an agent-based modeling approach to evaluate CSCs. Taking collaborative procurement as the research object, our approach is composed of three parts: model construction, model instantiation, and computational experiment. We use the approach to explore the service charging policy problem in collaborative procurement. Three kinds of service charging policies are compared in the same experiment environment. Finally, "Fixed Cost" is identified as the optimal policy under a stable market environment. The case study helps to illustrate the workflow of applying the approach and provides valuable decision support to industry.
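
    A minimal sketch of the computational-experiment idea follows, with an invented agent decision rule and made-up cost parameters (the paper's actual model is far richer); it compares a flat service charge against a volume-proportional one by counting how often agents find joining worthwhile:

```python
import random

# Toy stand-in for the paper's computational-experiment loop: suppliers
# decide whether to join collaborative procurement under different service
# charging policies. All parameter values are invented for illustration.
def run_experiment(policy, n_agents=100, rounds=50, seed=0):
    rng = random.Random(seed)
    joined = 0
    for _ in range(rounds):
        for _ in range(n_agents):
            volume = rng.uniform(10.0, 100.0)        # agent's order volume
            benefit = 0.15 * volume                   # assumed pooling discount
            if policy == "fixed":
                fee = 5.0                             # flat service charge
            elif policy == "proportional":
                fee = 0.08 * volume                   # volume-based charge
            else:
                raise ValueError(policy)
            if benefit - fee > 0:                     # join when profitable
                joined += 1
    return joined / (n_agents * rounds)

for policy in ("fixed", "proportional"):
    print(policy, run_experiment(policy))
```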

  19. Evaluating First Experiences with an Educational Computer Game: A multi-Method Approach

    Directory of Open Access Journals (Sweden)

    Marianna Obrist

    2011-10-01

    Full Text Available This paper presents our evaluation approach for a specific case study, namely the evaluation of an early prototype of an educational game with children aged between 12 and 14 years. The main goal of this initial evaluation study was to explore children's first impressions and experiences of the game on the one hand, and to assess the students' ideas and wishes for the further development of the game on the other hand. The main challenge for the evaluation activities was the selection of the appropriate methodological approach, taking into account children as a special user group. We opted for a combination of different, mainly qualitative and explorative methods that have been reported as beneficial for work with children in the human-computer interaction (HCI) field. By presenting our multi-method approach, in particular the different steps and procedure within our study, other researchers can get inspiration for follow-up activities when evaluating games with children, as well as benefit from our experiences in exploring more collaborative methods and methodological combinations.

  20. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Taking those errors explicitly into account, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out in which radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from the radar data. Comparing the quality of areal rainfall estimation by RCs with rain gauges and reference data helps to investigate the benefit of the RCs. The value of this additional source of data is assessed not only for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from laboratory experiments, the results show that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, when testing larger uncertainties for RCs, they were observed to be useful up to a certain level for areal rainfall estimation and discharge simulation.
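
    The trade-off between a few accurate gauges and many noisy RainCars can be illustrated with a toy Monte Carlo experiment; the synthetic field, sensor counts, and error levels below are assumptions for illustration, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "radar" reference field on a 100 x 100 grid (mm/h); in the study
# the reference is real radar rainfall, so this field is only illustrative.
x, y = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
truth = 5.0 + 3.0 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)
areal_true = truth.mean()

def sample_mean(n_sensors, rel_error):
    """Average n point samples, each with multiplicative sensor error."""
    ii = rng.integers(0, 100, n_sensors)
    jj = rng.integers(0, 100, n_sensors)
    obs = truth[ii, jj] * (1 + rng.normal(0, rel_error, n_sensors))
    return obs.mean()

# Few accurate gauges vs. many noisy RainCars (error levels are assumptions)
gauges = np.array([sample_mean(5, 0.05) for _ in range(1000)])
raincars = np.array([sample_mean(200, 0.30) for _ in range(1000)])
print("true areal mean:", round(areal_true, 3))
print("gauge RMSE  :", round(np.sqrt(np.mean((gauges - areal_true) ** 2)), 3))
print("RainCar RMSE:", round(np.sqrt(np.mean((raincars - areal_true) ** 2)), 3))
```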

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  2. Computational Fluid Dynamics Modeling and Validating Experiments of Airflow in a Data Center

    Directory of Open Access Journals (Sweden)

    Emelie Wibron

    2018-03-01

    Full Text Available The worldwide demand for data storage continues to increase and both the number and the size of data centers are expanding rapidly. Energy efficiency is an important factor to consider in data centers since the total energy consumption is huge. The servers must be cooled and the performance of the cooling system depends on the flow field of the air. Computational Fluid Dynamics (CFD) can provide detailed information about the airflow in both existing data centers and proposed data center configurations before they are built. However, the simulations must be carried out with quality and trust. The k–ε model is the most common choice to model the turbulent airflow in data centers. The aim of this study is to examine the performance of more advanced turbulence models, not previously investigated for CFD modeling of data centers. The considered turbulence models are the k–ε model, the Reynolds Stress Model (RSM) and Detached Eddy Simulations (DES). The commercial code ANSYS CFX 16.0 is used to perform the simulations and experimental values are used for validation. It is clarified that the flow fields for the different turbulence models deviate at locations that are not in the close proximity of the main components in the data center. The k–ε model fails to predict low velocity regions. RSM and DES produce very similar results and, based on the solution times, it is recommended to use RSM to model the turbulent airflow in data centers.

  3. Computational design of auxotrophy-dependent microbial biosensors for combinatorial metabolic engineering experiments.

    Science.gov (United States)

    Tepper, Naama; Shlomi, Tomer

    2011-01-21

    Combinatorial approaches in metabolic engineering work by generating genetic diversity in a microbial population followed by screening for strains with improved phenotypes. One of the most common goals in this field is the generation of a high-rate chemical-producing strain. A major hurdle with this approach is that many chemicals do not have easy-to-recognize attributes, making their screening expensive and time consuming. To address this problem, it was previously suggested to use microbial biosensors to facilitate the detection and quantification of chemicals of interest. Here, we present novel computational methods to: (i) rationally design microbial biosensors for chemicals of interest, based on substrate auxotrophy, that would enable their high-throughput screening; (ii) predict engineering strategies for coupling the synthesis of a chemical of interest with the production of a proxy metabolite for which high-throughput screening is possible via a designed biosensor. The biosensor design method is validated based on known genetic modifications in an array of E. coli strains auxotrophic to various amino acids. Predicted chemical production rates achievable via the biosensor-based approach are shown to potentially improve upon those predicted by current rational strain design approaches. (A Matlab implementation of the biosensor design method is available via http://www.cs.technion.ac.il/~tomersh/tools).
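
    The growth-production coupling idea can be illustrated with a toy flux-balance linear program. The two-metabolite network and the coupling constraint below are invented for illustration only; they are not the authors' method, network, or Matlab implementation:

```python
import numpy as np
from scipy.optimize import linprog

# Minimal flux-balance toy. Reactions (columns of S):
#   v0: uptake -> A,  v1: A -> B,  v2: B -> biomass,  v3: B -> product
S = np.array([
    [1, -1,  0,  0],   # metabolite A balance (steady state)
    [0,  1, -1, -1],   # metabolite B balance (steady state)
])
bounds = [(0, 10), (0, None), (0, None), (0, None)]
c = np.array([0, 0, -1, 0])   # linprog minimizes, so maximize biomass v2

# Wild type: growth is possible without making any product
wt = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)

# "Coupled" strain: an assumed engineering constraint v2 <= 2 * v3,
# i.e. growth is only possible if product is also made.
A_ub = np.array([[0, 0, 1, -2]])
coupled = linprog(c, A_ub=A_ub, b_ub=[0], A_eq=S, b_eq=np.zeros(2),
                  bounds=bounds)

print("wild type: growth =", -wt.fun, " product =", wt.x[3])
print("coupled  : growth =", -coupled.fun, " product =", coupled.x[3])
```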

  4. Initial experience with computed tomographic colonography applied for noncolorectal cancerous conditions

    International Nuclear Information System (INIS)

    Ichikawa, Tamaki; Kawada, Shuichi; Hirata, Satoru; Ikeda, Shu; Sato, Yuuki; Imai, Yutaka

    2011-01-01

    The aim of this study was to retrospectively assess the performance of computed tomography colonography (CTC) for noncolorectal cancerous conditions. A total of 44 patients with noncolorectal cancerous conditions underwent CTC. We reviewed the indications for CTC or the present illness and evaluated the CTC imaging findings. We assessed whether diagnosis by CTC reduced conventional colonoscopic examinations. A total of 47 examinations were performed in 44 patients. The indications for CTC or the present illness were as follows: 15 patients with impossible or incomplete colonoscopy, 7 with diverticular disease, 6 with malignancy (noncolorectal cancer), 6 with Crohn's disease, 4 suspected to have a submucosal tumor on colonoscopy, 2 with ischemic colitis, and 4 with various other diseases. Colonic findings were diagnosed on CTC in 36 examinations, and extracolonic findings were identified in 35 of 44 patients. In all, 17 patients had undergone colonoscopy previously, 9 (52.9%) of whom did not require further colonoscopy after CTC. Five patients underwent colonoscopy after CTC. The indications for CTC were varied among patients with noncolorectal cancerous conditions. CTC examinations could be performed safely. Unlike colonoscopy or CT without preparation, CTC revealed colonic and extracolonic findings and may reduce the indication for colonoscopy in patients with noncolorectal cancerous conditions. (author)

  5. The influence of computer experience on visuo-motor control: implications for visuo-motor testing in Parkinson's disease

    NARCIS (Netherlands)

    Stoffers, D.; Berendse, H.W.; Deijen, J.B.; Wolters, E.C.M.J.

    2002-01-01

    Abnormalities in visuo-motor control have repeatedly been reported in Parkinson's disease (PD) patients. In the more recent studies, tasks measuring visuo-motor performance are usually computerised tasks requiring the use of a mouse-like manipulandum. In healthy subjects, previous computer mouse

  6. FCJ-133 The Scripted Spaces of Urban Ubiquitous Computing: The experience, poetics, and politics of public scripted space

    Directory of Open Access Journals (Sweden)

    Christian Ulrik Andersen

    2011-12-01

    Full Text Available This article proposes and introduces the concept of ‘scripted space’ as a new perspective on ubiquitous computing in urban environments. Drawing on urban history, computer games, and a workshop study of the city of Lund, the article discusses the experience of digitally scripted spaces and their relation to the history of public spaces. In conclusion, the article discusses the potential for employing scripted spaces as a reinvigoration of urban public space.

  7. Experience with on-demand physics simulations on the Sun Microsystems computing facility (SunGrid) at network.com

    Energy Technology Data Exchange (ETDEWEB)

    Lauret, J; Potekhin, M; Carcassi, G; Shamash, A; Valia, R [Brookhaven National Laboratory, Upton, NY11973 (United States); Sun Microsystems, Inc. 4150 Network Circle, Santa Clara, CA95054 (United States)], E-mail: jeromel@bnl.gov

    2008-07-15

    The simulation program of the STAR experiment at the Relativistic Heavy Ion Collider (Brookhaven National Laboratory) is growing in scope and responsiveness to the needs of the research community. In addition, there is a significant ongoing R&D activity focused on future upgrades of the STAR detector, which also requires extensive simulation support. Beyond the local computing facility, the Open Science Grid (OSG) resources have been successfully used in STAR. However, the explosive growth of both computational needs and the available computing power, combined with the distributed nature of the latter, dictates that all available options be considered - from open source to commercial grids. The computing facility of Sun Microsystems (the SunGrid) aims to deliver enterprise computing power and resources over the Internet, enabling developers, researchers, scientists and businesses to optimize performance and speed time to results without investment in IT infrastructure.

  8. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  12. In Search of the Physics: The Interplay of Experiment and Computation in Airframe Noise Research: Flap-Edge Noise

    Science.gov (United States)

    Streett, C. L.; Lockard, D. P.; Singer, B. A.; Khorrami, M. R.; Choudhari, M. M.

    2003-01-01

    The LaRC investigative process for airframe noise has proven to be a useful guide for elucidating the physics of flow-induced noise generation over the last five years. This process, relying on a close interplay between experiment and computation, is described and demonstrated here on the archetypal problem of flap-edge noise. Some detailed results from both experiment and computation are shown to illustrate the process, and a conjectured description of the multi-source physics seen in this problem is offered.

  13. Development of a real-time monitoring system and integration of different computer system in LHD experiments using IP multicast

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, Masahiko; Nakamura, Yukio [National Inst. for Fusion Science, Toki, Gifu (Japan); Teramachi, Yasuaki [Polytechnic Univ., Sagamihara, Kanagawa (Japan); Okumura, Haruhiko [Matsusaka Univ., Matsusaka, Mie (Japan); Yamaguchi, Satarou [Chubu Univ., Kasugai, Aichi (Japan)

    2002-10-01

    There are several different computer systems in the LHD (Large Helical Device) experiment, and the coordination of these computers is therefore key to performing the experiment. A real-time monitoring system is also important because long discharges are needed in the LHD experiment. In order to meet these two requirements, the technique of IP multicast was adopted. The authors have developed three new systems: the first is the real-time monitoring system, the second is the delivery system for the shot number, and the third is the real-time notification system for plasma data registration. The first system can deliver the real-time monitoring data to the LHD experimental LAN through the firewall of the LHD control LAN at NIFS. The other two systems are used to achieve tight integration of the different computers in the LHD plasma experiment. We conclude from various experiences that IP multicast is very useful both in the LHD experiment and in future large plasma experiments. (author)

  14. Development of a real-time monitoring system and integration of different computer system in LHD experiments using IP multicast

    International Nuclear Information System (INIS)

    Emoto, Masahiko; Nakamura, Yukio; Teramachi, Yasuaki; Okumura, Haruhiko; Yamaguchi, Satarou

    2002-01-01

    There are several different computer systems in the LHD (Large Helical Device) experiment, and the coordination of these computers is therefore key to performing the experiment. A real-time monitoring system is also important because long discharges are needed in the LHD experiment. In order to meet these two requirements, the technique of IP multicast was adopted. The authors have developed three new systems: the first is the real-time monitoring system, the second is the delivery system for the shot number, and the third is the real-time notification system for plasma data registration. The first system can deliver the real-time monitoring data to the LHD experimental LAN through the firewall of the LHD control LAN at NIFS. The other two systems are used to achieve tight integration of the different computers in the LHD plasma experiment. We conclude from various experiences that IP multicast is very useful both in the LHD experiment and in future large plasma experiments. (author)
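
    For readers unfamiliar with the mechanism behind these two records, a minimal IP multicast sender/receiver using Python's standard socket API is sketched below; the group address and port are placeholders, not the LHD systems' actual configuration:

```python
import socket
import struct

GROUP, PORT = "239.1.1.1", 5007   # assumed multicast group/port

def send(message: bytes):
    """Publish one monitoring datagram to every subscribed listener."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    s.sendto(message, (GROUP, PORT))

def receive():
    """Join the multicast group and block until one datagram arrives."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    # Tell the kernel to add this socket to the multicast group
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    data, addr = s.recvfrom(4096)
    return data, addr
```

    The appeal for monitoring is that the sender transmits each datagram once, and the network fans it out to any number of subscribers, so adding a new monitoring client costs the data source nothing.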

  15. Numerical Experiments on the Computation of Ground Surface Temperature in an Atmospheric Circulation Model

    Science.gov (United States)

    ... computation of the ground surface temperature. It is hoped that this discussion will contribute to the improvement of the accuracy of computed ground surface temperature in the simulation of climatic changes.

  16. Simulation of a simple RCCS experiment with RELAP5-3D system code and computational fluid dynamics computer program

    International Nuclear Information System (INIS)

    Vaghetto, R.; Wei, H.; Hassan, Y.A.

    2011-01-01

    A small-scale experimental facility was designed to study the thermal hydraulic phenomena in the Reactor Cavity Cooling System (RCCS). The facility was scaled down from the full-scale RCCS system by applying scaling laws. A set of RELAP5-3D simulations was performed to confirm the scaling calculations and to refine and optimize the facility's configuration, instrumentation selection, and layout. Computational Fluid Dynamics (CFD) calculations using StarCCM+ were performed in order to study the flow patterns and two-phase water behavior in selected locations of the facility where complex flow structures are expected to occur. (author)

  17. Computers in medical education 2. Use of a computer package to supplement the clinical experience in a surgical clerkship: an objective evaluation.

    Science.gov (United States)

    Devitt, P; Cehic, D; Palmer, E

    1998-06-01

    Student teaching of surgery has been devolved from the university in an effort to increase and broaden undergraduate clinical experience. In order to ensure uniformity of learning, we have defined learning objectives and provided a computer-based package to supplement clinical teaching. A study was undertaken to evaluate the place of computer-based learning in a clinical environment. Twelve modules were provided for study during a 6-week attachment. These covered clinical problems related to cardiology, neurosurgery and gastrointestinal haemorrhage. Eighty-four fourth-year students undertook a pre- and post-test assessment on these three topics as well as on acute abdominal pain. No extra learning material on the latter topic was provided during the attachment. While all students showed significant improvement in performance on the post-test assessment, those who had access to the computer material performed significantly better than the controls. Within the topics, students in both groups performed equally well on the post-test assessment of acute abdominal pain, but the control group's performance was significantly weaker on the topic of gastrointestinal haemorrhage, suggesting that the bulk of learning on this subject came from the computer material and little from the clinical attachment. This type of learning resource can be used to supplement students' clinical experience and at the same time monitor what they learn during clinical clerkships and identify areas of weakness.
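
    The evaluation design reduces to comparing pre-to-post score gains between the computer-material group and the controls. A minimal sketch with simulated score vectors (the real study used its own 84 students, scoring, and statistics) might look like:

```python
import numpy as np
from scipy import stats

# Hypothetical gain vectors standing in for the pre/post assessments; the
# group sizes and score distributions below are invented for illustration.
rng = np.random.default_rng(7)
gain_computer = rng.normal(12.0, 5.0, 42)   # post-test minus pre-test scores
gain_control = rng.normal(7.0, 5.0, 42)

# Two-sample t-test: did the computer group improve more than the controls?
t, p = stats.ttest_ind(gain_computer, gain_control)
print(f"t = {t:.2f}, p = {p:.4f}")
```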

  18. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  19. INFN Tier-1 experiences with Castor-2 in CMS computing challenges

    CERN Document Server

    AUTHOR|(CDS)2108873

    2007-01-01

    The CMS combined Computing, Software and Analysis challenge of 2006 (CSA06) is a 50 million event exercise to test the workflow and dataflow associated with the data handling model of CMS. It was designed to be a fully Grid-enabled, 25% capacity exercise of what is needed for CMS operations in 2008. All CMS Tier1’s participated, and the INFN Tier-1 - located at CNAF, Bologna, Italy - joined with a production Castor-2 installation as a Hierarchical Storage Manager solution to address data storage, data access and custodial responsibility. After the prompt reconstruction phase at the Tier-0, the data was distributed to all participating Tier-1’s, and calibration/alignment, re-reconstruction and skimming jobs ran at the Tier-1’s. Output of the skimming jobs was propagated to the Tier-2’s, to allow physics analysis job submissions. The experience collected by the INFN Tier-1 storage group during the pre-challenge Monte Carlo production, the preparation and the running of the CSA06 exercise - as well as the Ti...

  20. Computational experience with sequential and parallel, preconditioned Jacobi-Davidson for large, sparse symmetric matrices

    International Nuclear Information System (INIS)

    Bergamaschi, Luca; Pini, Giorgio; Sartoretto, Flavio

    2003-01-01

    The Jacobi-Davidson (JD) algorithm was recently proposed for evaluating a number of the eigenvalues of a matrix. JD goes beyond pure Krylov-space techniques; it cleverly expands its search space by solving the so-called correction equation, thus in principle providing a more powerful method. Preconditioning the Jacobi-Davidson correction equation is mandatory when large, sparse matrices are analyzed. We considered several preconditioners: classical block-Jacobi and IC(0), together with approximate inverse (AINV or FSAI) preconditioners. The rationale for using approximate inverse preconditioners is their high parallelization potential, combined with their efficiency in accelerating the iterative solution of the correction equation. Analysis was carried out on the sequential performance of preconditioned JD for the spectral decomposition of large, sparse matrices, which originate in the numerical integration of partial differential equations arising in physical and engineering problems. It was found that JD is highly sensitive to preconditioning, and it can display an irregular convergence behavior. We parallelized JD by data-splitting techniques, combining them with techniques to reduce the amount of communication data. Our own parallel, preconditioned code was executed on a dedicated parallel machine, and we present the results of our experiments. Our JD code provides an appreciable parallel degree of computation. Its performance was also compared with those of PARPACK and parallel DACG
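
    As rough orientation for the class of methods discussed, the sketch below implements the plain Davidson iteration with the classical diagonal preconditioner; full Jacobi-Davidson instead solves a projected correction equation, which is where preconditioners like block-Jacobi, IC(0), and AINV/FSAI enter. This is a simplified cousin of the paper's method, not its implementation:

```python
import numpy as np

def davidson(A, max_iter=30, tol=1e-8):
    """Basic Davidson iteration for the smallest eigenpair of symmetric A,
    expanding the search space with a diagonally preconditioned residual."""
    n = A.shape[0]
    d = np.diag(A)
    V = np.random.default_rng(0).normal(size=(n, 1))
    V /= np.linalg.norm(V)
    theta, u = 0.0, V[:, 0]
    for _ in range(max_iter):
        H = V.T @ A @ V                      # projected (Rayleigh) matrix
        w, S = np.linalg.eigh(H)
        theta, s = w[0], S[:, 0]             # smallest Ritz pair
        u = V @ s
        r = A @ u - theta * u                # residual
        if np.linalg.norm(r) < tol:
            break
        t = r / (d - theta + 1e-12)          # diagonal (Davidson) correction
        t -= V @ (V.T @ t)                   # orthogonalize against V
        t /= np.linalg.norm(t)
        V = np.column_stack([V, t])          # expand the search space
    return theta, u

# Quick check on a diagonally dominant symmetric test matrix
rng = np.random.default_rng(1)
A = np.diag(np.arange(1.0, 101.0)) + 1e-2 * rng.normal(size=(100, 100))
A = 0.5 * (A + A.T)
theta, _ = davidson(A)
print("Davidson:", theta, " exact:", np.linalg.eigvalsh(A)[0])
```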

  1. Experiment on a novel user input for computer interface utilizing tongue input for the severely disabled.

    Science.gov (United States)

    Kencana, Andy Prima; Heng, John

    2008-11-01

    This paper introduces a novel passive tongue control and tracking device. The device is intended to be used by severely disabled or quadriplegic persons. The main advantage of this device compared to other existing tongue tracking devices is that the sensor employed is passive, which means no powered electrical sensor needs to be inserted into the user's mouth, and hence there are no trailing wires. This haptic interface device employs inductive sensors to track the position of the user's tongue. The device is able to perform the two main PC input functions, those of the keyboard and the mouse. The results show that this device allows a severely disabled person to have some control over his or her environment, such as turning on and off or controlling daily electrical devices or appliances, or to serve as a viable PC Human Computer Interface (HCI) by tongue control. The operating principle and set-up of such a novel passive tongue HCI have been established with successful laboratory trials and experiments. Further clinical trials will be required to test the device on disabled persons before it is ready for future commercial development.

  2. The growth of language: Universal Grammar, experience, and principles of computation.

    Science.gov (United States)

    Yang, Charles; Crain, Stephen; Berwick, Robert C; Chomsky, Noam; Bolhuis, Johan J

    2017-10-01

    Human infants develop language remarkably rapidly and without overt instruction. We argue that the distinctive ontogenesis of child language arises from the interplay of three factors: domain-specific principles of language (Universal Grammar), external experience, and properties of non-linguistic domains of cognition including general learning mechanisms and principles of efficient computation. We review developmental evidence that children make use of hierarchically composed structures ('Merge') from the earliest stages and at all levels of linguistic organization. At the same time, longitudinal trajectories of development show sensitivity to the quantity of specific patterns in the input, which suggests the use of probabilistic processes as well as inductive learning mechanisms that are suitable for the psychological constraints on language acquisition. By considering the place of language in human biology and evolution, we propose an approach that integrates principles from Universal Grammar and constraints from other domains of cognition. We outline some initial results of this approach as well as challenges for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Computational Experiments on the Step and Frequency Responses of a Three-Axis Thermal Accelerometer

    Directory of Open Access Journals (Sweden)

    Yoshifumi Ogami

    2017-11-01

    Full Text Available The sensor response has been reported to become highly nonlinear when the acceleration applied to a thermal accelerometer is very large, so the same response can be observed for two accelerations with different magnitudes and opposite signs. Some papers have reported the frequency response for horizontal acceleration to be a first-order system, while others have reported it to be a second-order system. The response for vertical acceleration has not been studied. In this study, computational experiments were performed to examine the step and frequency responses of a three-axis thermal accelerometer. The results showed that monitoring the temperatures at two positions and making use of cross-axis sensitivity allows a unique acceleration to be determined even when the range of the vertical acceleration is very large (e.g., −10,000 to 10,000 g). The frequency response was shown to be a second-order system for horizontal acceleration and a third-order system for vertical acceleration.
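
    The second-order behavior reported for the horizontal axis can be visualized from the standard second-order magnitude response; the natural frequency and damping ratio below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

# Second-order low-pass magnitude |H| = 1 / sqrt((1 - w^2)^2 + (2*zeta*w)^2),
# with w the frequency normalized by the natural frequency fn.
fn, zeta = 200.0, 0.7          # assumed natural frequency (Hz), damping ratio
f = np.logspace(0, 4, 400)     # sweep 1 Hz .. 10 kHz
w = f / fn
mag = 1.0 / np.sqrt((1 - w**2) ** 2 + (2 * zeta * w) ** 2)

# -3 dB bandwidth: first frequency where the response drops below 1/sqrt(2)
f3db = f[np.argmax(mag < 1 / np.sqrt(2))]
print(f"-3 dB rolloff near {f3db:.0f} Hz")
```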

  4. Comparisons of LES and RANS Computations with PIV Experiments on a Cylindrical Cavity Flow

    Directory of Open Access Journals (Sweden)

    Wen-Tao Su

    2013-01-01

    Full Text Available A comparison study of numerical computations by large eddy simulation (LES) and Reynolds-averaged Navier-Stokes (RANS) methods against experiments on a cylindrical cavity flow was conducted in this paper. Numerical simulations and particle image velocimetry (PIV) measurements were performed for two Reynolds numbers of the flow at a constant aspect ratio of H/R = 2.4 (R is the radius of the cylindrical cavity, and H is the liquid level). The three components of velocity were extracted from 100 sequential PIV-measured velocity frames with averaging, in order to illustrate the axial jet flow evolution and the circulation distribution in the radial direction. The results show that LES can reproduce well the fine structure inside the swirling motions in both the meridional and horizontal planes, as well as the distributions of the velocity components and the circulation, in good agreement with experimental results, while the RANS method only provided a rough trend of the inner vortex structure. Based on the analysis of velocity profiles at various locations, this indicates that LES is more suitable for predicting the complex flow characteristics inside complicated three-dimensional geometries.

  5. Computer-aided digitization of graphical mass flow data from the 1/5-scale Mark I BWR pressure suppression experiment

    International Nuclear Information System (INIS)

    Holman, G.S.; McCauley, E.W.

    1979-01-01

    Periodically in the analysis of engineering data, it becomes necessary to use graphical output as the sole source of accurate numerical data for use in subsequent calculations. Such was our experience in the extended analysis of data from the 1/5-scale Mark I boiling water reactor pressure suppression experiment (PSE). The original numerical results of the extensive computer calculations performed at the time of the actual PSE tests, which were required for the later extended analysis program, had not been retained as archival records. We were therefore required to recover the previously calculated data, either by a complete recalculation or from the available computer graphics records. Time constraints suggested recovery from the graphics records as the more viable approach. This report describes two different approaches to recovering digital data from graphics records. One, combining hardware and software techniques immediately available to us at LLL, proved to be inadequate for our purposes. The other approach required the development of pure software techniques that interfaced with LLL computer graphics to unpack digital coordinate information directly from graphics files. As a result of this effort, we were able to recover the required data with no significant loss in the accuracy of the original calculations

  6. Experiments on Utilization of JAPAN/MARC by a Personal Computer

    Science.gov (United States)

    Asakura, Syuzo

    With JAPAN/MARC realized on a personal computer, the classification of books and the collection of bibliographic data assisted by JAPAN/MARC become easy. In this paper, an experimental method of transferring JAPAN/MARC from tape to MS-DOS floppy disk is described in detail. A standard record form for a personal computer and an exchange record form for database languages are proposed. The results are summarized as follows: JAPAN/MARC becomes available to a personal computer, and the new record forms make it easy to use JAPAN/MARC on a personal computer and to exchange bibliographic data with other personal computer systems for a library.

  7. Elucidating reactivity regimes in cyclopentane oxidation: Jet stirred reactor experiments, computational chemistry, and kinetic modeling

    KAUST Repository

    Rachidi, Mariam El

    2016-06-23

    This study is concerned with the identification and quantification of species generated during the combustion of cyclopentane in a jet stirred reactor (JSR). Experiments were carried out for temperatures between 740 and 1250 K, equivalence ratios from 0.5 to 3.0, and at an operating pressure of 10 atm. The fuel concentration was kept at 0.1% and the residence time of the fuel/O2/N2 mixture was maintained at 0.7 s. The reactant, product, and intermediate species concentration profiles were measured using gas chromatography and Fourier transform infrared spectroscopy. The concentration profiles of cyclopentane indicate inhibition of reactivity between 850 and 1000 K for ϕ = 2.0 and ϕ = 3.0. This behavior is interesting, as it has not been observed previously for other fuel molecules, cyclic or non-cyclic. A kinetic model including both low- and high-temperature reaction pathways was developed and used to simulate the JSR experiments. The pressure-dependent rate coefficients of all relevant reactions lying on the PES of cyclopentyl + O2, as well as the C-C and C-H scission reactions of the cyclopentyl radical, were calculated at the UCCSD(T)-F12b/cc-pVTZ-F12//M06-2X/6-311++G(d,p) level of theory. The simulations reproduced the unique reactivity trend of cyclopentane and the measured concentration profiles of intermediate and product species. Sensitivity and reaction path analyses indicate that this reactivity trend may be attributed to differences in the reactivity of the allyl radical at different conditions, and it is highly sensitive to the C-C/C-H scission branching ratio of the cyclopentyl radical decomposition.
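
    Rate coefficients like those computed here are conventionally tabulated in modified Arrhenius form, k(T) = A T^n exp(-Ea/RT). The sketch below evaluates that form over the experimental temperature range; A, n, and Ea are placeholders, not the paper's fitted values:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def k_mod_arrhenius(T, A, n, Ea):
    """Modified Arrhenius rate coefficient k(T) = A * T^n * exp(-Ea / (R*T)).
    The parameters below are illustrative placeholders only."""
    return A * T**n * np.exp(-Ea / (R * T))

# Evaluate across the JSR temperature range (units depend on the reaction)
T = np.array([740.0, 900.0, 1100.0, 1250.0])
print(k_mod_arrhenius(T, A=1.0e10, n=1.2, Ea=120e3))
```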

  8. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  13. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  14. Using the computer-driven VR environment to promote experiences of natural world immersion

    Science.gov (United States)

    Frank, Lisa A.

    2013-03-01

    In December, 2011, over 800 people experienced the exhibit, :"der"//pattern for a virtual environment, created for the fully immersive CAVETM at the University of Wisconsin-Madison. This exhibition took my nature-based photographic work and reinterpreted it for virtual reality (VR).Varied responses such as: "It's like a moment of joy," or "I had to see it twice," or "I'm still thinking about it weeks later" were common. Although an implied goal of my 2D artwork is to create a connection that makes viewers more aware of what it means to be a part of the natural world, these six VR environments opened up an unexpected area of inquiry that my 2D work has not. Even as the experience was mediated by machines, there was a softening at the interface between technology and human sensibility. Somehow, for some people, through the unlikely auspices of a computer-driven environment, the project spoke to a human essence that they connected with in a way that went beyond all expectations and felt completely out of my hands. Other interesting behaviors were noted: in some scenarios some spoke of intense anxiety, acrophobia, claustrophobia-even fear of death when the scene took them underground. These environments were believable enough to cause extreme responses and disorientation for some people; were fun, pleasant and wonder-filled for most; and were liberating, poetic and meditative for many others. The exhibition seemed to promote imaginative skills, creativity, emotional insight, and environmental sensitivity. It also revealed the CAVETM to be a powerful tool that can encourage uniquely productive experiences. Quite by accident, I watched as these nature-based environments revealed and articulated an essential relationship between the human spirit and the physical world. The CAVETM is certainly not a natural space, but there is clear potential to explore virtual environments as a path to better and deeper connections between people and nature. We've long associated contact

  15. Gaining Efficiency of Computational Experiments in Modeling the Flight Vehicle Movement

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2017-01-01

    Full Text Available The paper considers one of the important aspects of gaining efficiency in computational experiments, namely grid optimization. Solving this problem ultimately leads to a more refined system, because multivariate simulation is the basis for applying optimization methods against specified criteria and for identifying problems in the functioning of technical systems. The paper discusses a class of moving objects, representing a body of revolution, which, for one reason or another, endures deformation of its casing. Analyses using the author's techniques have shown complex functional dependencies in the aerodynamic characteristics of the studied class of deformed objects. A literature review on new ways of organizing calculations, data storage and transfer is presented, together with an analysis of methods for forming grids, including those used in initial calculations and in the visualization of information. In addition to regular grids, unstructured grids are considered, including those for dynamic spatial-temporal information. Attention is drawn to the problem of efficient retrieval of information. The paper discusses relevant capabilities for handling large data volumes, including OLAP technology, multidimensional cubes (Data Cube) and, finally, an integrated Data Mining approach. Despite the huge number of successful modern approaches to the problems of forming, storing and processing multidimensional data, it should be noted that computationally these tools are quite expensive. The expenditure for using such special tools often exceeds the cost of the computational experiments themselves. In this regard, it was recognized that it is unnecessary to abandon the use of traditional tools, and the focus should instead be on directly increasing their efficiency. Within the framework of the applied problem under consideration, such a tool was the formation of optimal grids. The optimal grid was understood to be a grid in the N

  16. Individualized computer-aided education in mammography based on user modeling: concept and preliminary experiments.

    Science.gov (United States)

    Mazurowski, Maciej A; Baker, Jay A; Barnhart, Huiman X; Tourassi, Georgia D

    2010-03-01

    The authors propose the framework for an individualized adaptive computer-aided educational system in mammography that is based on user modeling. The underlying hypothesis is that user models can be developed to capture the individual error-making patterns of radiologists-in-training. In this pilot study, the authors test the above hypothesis for the task of breast cancer diagnosis in mammograms. The concept of a user model was formalized as the function that relates image features to the likelihood/extent of the diagnostic error made by a radiologist-in-training, and therefore to the level of difficulty that a case will pose to the radiologist-in-training (or "user"). Then, machine learning algorithms were implemented to build such user models. Specifically, the authors explored k-nearest neighbor, artificial neural networks, and multiple regression for the task of building the model, using observer data collected from ten radiology residents at Duke University Medical Center for the problem of breast mass diagnosis in mammograms. For each resident, a user-specific model was constructed that predicts the user's expected level of difficulty for each presented case based on two BI-RADS image features. In the experiments, a leave-one-out data handling scheme was applied to assign each case to a low-predicted-difficulty or a high-predicted-difficulty group for each resident, based on each of the three user models. To evaluate whether the user model is useful in predicting difficulty, the authors performed statistical tests using the generalized estimating equations approach to determine whether the mean actual error differs between the low-predicted-difficulty group and the high-predicted-difficulty group. When the results for all observers were pooled together, the actual errors made by residents were statistically significantly higher for cases in the high-predicted-difficulty group than for cases in the low-predicted-difficulty group for all modeling
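
    The abstract names k-nearest neighbor among the algorithms but gives no implementation detail; a toy sketch of the idea, assuming the user model maps the two BI-RADS features to a scalar error score and using the leave-one-out handling described above (all data below are hypothetical):

      import numpy as np

      def knn_predict_difficulty(features, errors, query, k=3):
          # Predict expected error on `query` as the mean error over the k
          # nearest previously-read cases (Euclidean distance in feature space).
          dists = np.linalg.norm(features - query, axis=1)
          nearest = np.argsort(dists)[:k]
          return errors[nearest].mean()

      # Hypothetical readings: two BI-RADS features per case, observed error per case.
      features = np.array([[1, 3], [2, 4], [4, 1], [5, 2], [3, 3]], dtype=float)
      errors = np.array([0.1, 0.2, 0.8, 0.7, 0.4])

      # Leave-one-out: predict each case's difficulty from the remaining readings.
      for i in range(len(features)):
          mask = np.arange(len(features)) != i
          pred = knn_predict_difficulty(features[mask], errors[mask], features[i])
          group = "high" if pred >= np.median(errors[mask]) else "low"
          print(f"case {i}: predicted difficulty {pred:.2f} -> {group}-predicted-difficulty group")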

  17. Optically stimulated luminescence sensitivity changes in quartz due to repeated use in single aliquot readout: Experiments and computer simulations

    DEFF Research Database (Denmark)

    McKeever, S.W.S.; Bøtter-Jensen, L.; Agersnap Larsen, N.

    1996-01-01

    As part of a study to examine sensitivity changes in single aliquot techniques using optically stimulated luminescence (OSL) a series of experiments has been conducted with single aliquots of natural quartz, and the data compared with the results of computer simulations of the type of processes...

  18. Influences of Gender and Computer Gaming Experience in Occupational Desktop Virtual Environments: A Cross-Case Analysis Study

    Science.gov (United States)

    Ausburn, Lynna J.; Ausburn, Floyd B.; Kroutter, Paul J.

    2013-01-01

    This study used a cross-case analysis methodology to compare four line-of-inquiry studies of desktop virtual environments (DVEs) to examine the relationships of gender and computer gaming experience to learning performance and perceptions. Comparison was made of learning patterns in a general non-technical DVE with patterns in technically complex,…

  19. Using Educational Games and Simulation Software in a Computer Science Course: Learning Achievements and Student Flow Experiences

    Science.gov (United States)

    Liu, Tsung-Yu

    2016-01-01

    This study investigates how educational games impact students' academic performance and multimedia flow experiences in a computer science course. A curriculum consisting of five basic learning units (stack, queue, sort, tree traversal, and binary search tree) was conducted for 110 university students during one semester. Two groups…

  20. Experiences in evaluating outcomes in tool-based, competence building education in dynamical systems using symbolic computer algebra

    DEFF Research Database (Denmark)

    Perram, John; Andersen, Morten; Ellerkilde, Lars

    2005-01-01

    This paper discusses experience with alternative assessment strategies for an introductory course in dynamical systems, where the use of computer algebra and calculus is fully integrated into the learning process, so that the standard written examination would not be appropriate. Instead, students...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation is now deployed at CERN, complementing the GlideInWMS factory located in the US. There is new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  2. Previously unknown organomagnesium compounds in astrochemical context

    OpenAIRE

    Ruf, Alexander

    2018-01-01

    We describe the detection of dihydroxymagnesium carboxylates (CHOMg) in an astrochemical context. CHOMg was detected in meteorites via ultrahigh-resolving chemical analytics and represents a novel, previously unreported chemical class. Its chemical stability was therefore probed via quantum chemical computations, in combination with experimental fragmentation techniques. The results suggest the putative formation of green-chemical OH-Grignard-type molecules and trigger fundamental questions within chemica...

  3. Modeling error in assessment of mammographic image features for improved computer-aided mammography training: initial experience

    Science.gov (United States)

    Mazurowski, Maciej A.; Tourassi, Georgia D.

    2011-03-01

    In this study we investigate the hypothesis that there exist patterns in the erroneous assessment of BI-RADS image features among radiology trainees when performing diagnostic interpretation of mammograms. We also investigate whether these error-making patterns can be captured by individual user models. To test our hypothesis we propose a user modeling algorithm that uses the previous readings of a trainee to identify whether certain BI-RADS feature values (e.g., the "spiculated" value of the "margin" feature) are associated with a higher than usual likelihood that the feature will be assessed incorrectly. In our experiments we used readings of 3 radiology residents and 7 breast imaging experts for 33 breast masses for the following BI-RADS features: parenchyma density, mass margin, mass shape and mass density. The expert readings were considered the gold standard. Rule-based individual user models were developed and tested using a leave-one-out cross-validation scheme. Our experimental evaluation showed that the individual user models are accurate in identifying cases for which errors are more likely to be made. The user models captured regularities in error making for all 3 residents. This finding supports our hypothesis about the existence of individual error-making patterns in the assessment of mammographic image features using the BI-RADS lexicon. Explicit user models identifying the weaknesses of each resident could be of great use when developing and adapting a personalized training plan to meet the resident's individual needs. Such an approach fits well with the framework of adaptive computer-aided educational systems in mammography we have proposed before.
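
    The rule-based user models are not specified beyond the association of certain feature values with higher-than-usual error likelihood; one plausible minimal reading is a per-value error-rate rule learned from a trainee's previous readings (feature names and data below are hypothetical):

      from collections import defaultdict

      def learn_rules(readings, baseline_margin=0.10):
          # readings: list of (feature, value, was_error) tuples for one trainee.
          # Flags feature values whose error rate exceeds the trainee's overall
          # error rate by more than `baseline_margin`.
          overall = sum(e for _, _, e in readings) / len(readings)
          counts = defaultdict(lambda: [0, 0])  # (feature, value) -> [errors, total]
          for feature, value, was_error in readings:
              counts[(feature, value)][0] += was_error
              counts[(feature, value)][1] += 1
          return {fv: err / tot for fv, (err, tot) in counts.items()
                  if err / tot > overall + baseline_margin}

      readings = [  # hypothetical trainee history
          ("margin", "spiculated", 1), ("margin", "spiculated", 1),
          ("margin", "circumscribed", 0), ("shape", "irregular", 1),
          ("shape", "round", 0), ("shape", "round", 0),
      ]
      print(learn_rules(readings))
      # {('margin', 'spiculated'): 1.0, ('shape', 'irregular'): 1.0}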

  4. Computed tomography guided needle biopsy: experience from 1,300 procedures

    Energy Technology Data Exchange (ETDEWEB)

    Chojniak, Rubens; Isberner, Rony Klaus; Viana, Luciana Marinho; Yu, Liao Shin; Aita, Alessandro Amorim; Soares, Fernando Augusto [Hospital do Cancer A.C. Camargo, Sao Paulo, SP (Brazil). Dept. de Radiologia e Patologia

    2006-01-15

    Context and objective: computed tomography (CT) guided biopsy is widely accepted as effective and safe for diagnosis in many settings. Accuracy depends on target organ and needle type. Cutting needles present advantages over fine needles. This study presents experience from CT guided biopsies performed at an oncology center. Design and setting: retrospective study at Hospital do Cancer A. C. Camargo, Sao Paulo.Methods: 1,300 consecutive CT guided biopsies performed between July 1994 and February 2000 were analyzed. Nodules or masses were suspected as primary malignancy in 845 cases (65%) or metastatic lesion in 455 (35%). 628 lesions were thoracic, 281 abdominal, 208 retroperitoneal, 134 musculoskeletal and 49 head/neck. All biopsies were performed by one radiologist or under his supervision: 765 (59%) with 22-gauge fine-needle/aspiration technique and 535 (41%) with automated 16 or 18-gauge cutting-needle biopsy. Results: adequate samples were obtained in 70-92% of fine-needle and 93-100% of cutting-needle biopsies. The specific diagnosis rates were 54-67% for fine-needle and 82-100% for cutting-needle biopsies, according to biopsy site. For any site, sample adequacy and specific diagnosis rate were always better for cutting-needle biopsy. Among 530 lung biopsies, there were 84 pneumothorax (16%) and two hemothorax (0.3%) cases, with thoracic drainage in 24 (4.9%). Among abdominal and retroperitoneal biopsies, there were two cases of major bleeding and one of peritonitis. Conclusion: both types of needle showed satisfactory results, but cutting-needle biopsy should be used when specific diagnosis is desired without greater incidence of complications. (author)

  5. Highlights from the previous volumes

    Science.gov (United States)

    Vergini Eduardo, G.; Pan, Y.; al., Vardi R. et; al., Akkermans Eric et; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time; Superconductivity and magnetic order in the half-Heusler compound ErPdBi; An experimental evidence-based computational paradigm for new logic-gates in neuronal activity; Universality in the symmetric exclusion process and diffusive systems

  6. Laboratory Grouping Based on Previous Courses.

    Science.gov (United States)

    Doemling, Donald B.; Bowman, Douglas C.

    1981-01-01

    In a five-year study, second-year human physiology students were grouped for laboratory according to previous physiology and laboratory experience. No significant differences in course or board examination performance were found, though correlations were found between predental grade-point averages and grouping. (MSE)

  7. The Effect of Interactive Computer Animations Accompanied with Experiments on Grade 6th Students’ Achievements and Attitudes toward Science

    Directory of Open Access Journals (Sweden)

    E. Akpınar

    2007-06-01

    Full Text Available The aim of the present study was to investigate the effect of instruction including interactive computer animations accompanied by experiments, compared with traditionally designed instruction, on 6th grade students' physics achievement and attitudes toward science. In this study, a quasi-experimental pretest-posttest design was used. As data collection instruments, a physics achievement test and an attitude scale toward science were administered to the experimental and control groups. In the experimental group, the materials were used while doing the experiments and the students then did the same experiments interactively on the computer. In the control group, the experiments were done only with the materials and the students did not use the computer during the experiments. The findings indicated that there was no significant difference between the groups with respect to achievement before the treatment, and that there was a significant difference in favor of the experimental group after the treatment. The means of attitude toward science showed no significant difference between the groups before or after the treatment.

  8. Static Computer Memory Integrity Testing (SCMIT): An experiment flown on STS-40 as part of GAS payload G-616

    Science.gov (United States)

    Hancock, Thomas

    1993-01-01

    This experiment investigated the integrity of static computer memory (floppy disk media) when exposed to the environment of low earth orbit. The experiment attempted to record soft-event upsets (bit-flips) in static computer memory. Typical conditions that exist in low earth orbit that may cause soft-event upsets include: cosmic rays, low level background radiation, charged fields, static charges, and the earth's magnetic field. Over the years several spacecraft have been affected by soft-event upsets (bit-flips), and these events have caused a loss of data or affected spacecraft guidance and control. This paper describes a commercial spin-off that is being developed from the experiment.

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  10. Mind the Sheep! User Experience Evaluation & Brain-Computer Interface Games

    NARCIS (Netherlands)

    Gürkök, Hayrettin

    2012-01-01

    A brain-computer interface (BCI) infers our actions (e.g. a movement), intentions (e.g. preparation for a movement) and psychological states (e.g. emotion, attention) by interpreting our brain signals. It uses the inferences it makes to manipulate a computer. Although BCIs have long been used

  11. Experiences Using an Open Source Software Library to Teach Computer Vision Subjects

    Science.gov (United States)

    Cazorla, Miguel; Viejo, Diego

    2015-01-01

    Machine vision is an important subject in computer science and engineering degrees. For laboratory experimentation, it is desirable to have a complete and easy-to-use tool. In this work we present a Java library oriented to teaching computer vision. We have designed and built the library from scratch with emphasis on readability and…

  12. Our experience in the diagnosis of aortic dissection by multislice computed tomography

    International Nuclear Information System (INIS)

    Llerena Rojas, Luis R; Mendoza Rodriguez, Vladimir; Olivares Aquiles, Eddy

    2011-01-01

    Aortic dissection (AD) is the most frequent and life-threatening acute aortic syndrome. Currently the most widely used method for studying the aorta is multislice computed tomography. The purpose of this paper is to present the most relevant findings in 22 patients with AD consecutively studied by multislice computed tomography

  13. Technological Metaphors and Moral Education: The Hacker Ethic and the Computational Experience

    Science.gov (United States)

    Warnick, Bryan R.

    2004-01-01

    This essay is an attempt to understand how technological metaphors, particularly computer metaphors, are relevant to moral education. After discussing various types of technological metaphors, it is argued that technological metaphors enter moral thought through their "functional descriptions." The computer metaphor is then explored by turning to…

  14. Using Tablet PCs in Classroom for Teaching Human-Computer Interaction: An Experience in High Education

    Science.gov (United States)

    da Silva, André Constantino; Marques, Daniela; de Oliveira, Rodolfo Francisco; Noda, Edgar

    2014-01-01

    The use of computers in the teaching and learning process has been investigated by many researchers, and nowadays, due to the available diversity of computing devices, tablets have become popular in the classroom too. So what are the advantages and disadvantages of using tablets in the classroom? How can we shape teaching and learning activities to get the best of…

  15. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, our opinion is that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher who is generally satisfied with abstract but accurate displays for analysis purposes and the decision maker who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease in transferring and to facilitate the linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results

  16. Synthetic event-related potentials: a computational bridge between neurolinguistic models and experiments.

    Science.gov (United States)

    Barrès, Victor; Simons, Arthur; Arbib, Michael

    2013-01-01

    Our previous work developed Synthetic Brain Imaging to link neural and schema network models of cognition and behavior to PET and fMRI studies of brain function. We here extend this approach to Synthetic Event-Related Potentials (Synthetic ERP). Although the method is of general applicability, we focus on ERP correlates of language processing in the human brain. The method has two components: Phase 1: To generate cortical electro-magnetic source activity from neural or schema network models; and Phase 2: To generate known neurolinguistic ERP data (ERP scalp voltage topographies and waveforms) from putative cortical source distributions and activities within a realistic anatomical model of the human brain and head. To illustrate the challenges of Phase 2 of the methodology, spatiotemporal information from Friederici's 2002 model of auditory language comprehension was used to define cortical regions and time courses of activation for implementation within a forward model of ERP data. The cortical regions from the 2002 model were modeled using atlas-based masks overlaid on the MNI high definition single subject cortical mesh. The electromagnetic contribution of each region was modeled using current dipoles whose position and orientation were constrained by the cortical geometry. In linking neural network computation via EEG forward modeling to empirical results in neurolinguistics, we emphasize the need for neural network models to link their architecture to geometrically sound models of the cortical surface, and the need for conceptual models to refine and adopt brain-atlas based approaches to allow precise brain anchoring of their modules. The detailed analysis of Phase 2 sets the stage for a brief introduction to Phase 1 of the program, including the case for a schema-theoretic approach to language production and perception presented in detail elsewhere. Unlike Dynamic Causal Modeling (DCM) and Bojak's mean field model, Synthetic ERP builds on models of networks
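
    Phase 2 rests on a standard EEG forward model: each cortical region contributes current dipoles whose scalp potential is computed within a head model. As a deliberately simplified sketch (an infinite homogeneous conductor, rather than the realistic head geometry the paper calls for), the potential of a single current dipole can be computed as follows; all numbers are hypothetical:

      import numpy as np

      def dipole_potential(r, r0, p, sigma=0.33):
          # Potential at electrode position r (m) from a current dipole with
          # moment p (A*m) at r0, in an infinite homogeneous conductor of
          # conductivity sigma (S/m): V = p.(r - r0) / (4*pi*sigma*|r - r0|^3).
          # Realistic Synthetic ERP modeling replaces this with a BEM/FEM head model.
          d = np.asarray(r, float) - np.asarray(r0, float)
          return p @ d / (4.0 * np.pi * sigma * np.linalg.norm(d) ** 3)

      # A dipole 8 cm below an electrode, oriented toward it.
      electrode = np.array([0.0, 0.0, 0.10])
      source = np.array([0.0, 0.0, 0.02])
      moment = np.array([0.0, 0.0, 20e-9])  # 20 nA*m, a typical ERP source scale
      print(f"{dipole_potential(electrode, source, moment) * 1e6:.2f} microvolts")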

  17. Enhancing the Programming Experience for First-Year Engineering Students through Hands-On Integrated Computer Experiences

    Science.gov (United States)

    Canfield, Stephen L.; Ghafoor, Sheikh; Abdelrahman, Mohamed

    2012-01-01

    This paper describes the redesign and implementation of the course, "Introduction to Programming for Engineers" using microcontroller (MCU) hardware as the programming target. The objective of this effort is to improve the programming competency for engineering students by more closely relating the initial programming experience to the student's…

  18. Hanging out in the game café : Contextualising co-located computer game play practices and experiences

    OpenAIRE

    Jonsson, Fatima

    2012-01-01

    What social practices are people involved in when they stay at a game café? What kind of social setting is the game café? What are parents' attitudes towards playing computer games at home and in public? What are the media representations of co-located game playing in public? What are the sensory experiences of co-located game play in public? This dissertation gives a descriptive and analytical account of the contexts and meanings of playing co-located computer games in public set...

  19. Golimumab in patients with active rheumatoid arthritis who have previous experience with tumour necrosis factor inhibitors: results of a long-term extension of the randomised, double-blind, placebo-controlled GO-AFTER study through week 160

    NARCIS (Netherlands)

    Smolen, Josef S.; Kay, Jonathan; Landewé, Robert B. M.; Matteson, Eric L.; Gaylis, Norman; Wollenhaupt, Jurgen; Murphy, Frederick T.; Zhou, Yiying; Hsia, Elizabeth C.; Doyle, Mittie K.

    2012-01-01

    The aim of this study was to assess long-term golimumab therapy in patients with rheumatoid arthritis (RA) who discontinued previous tumour necrosis factor alpha (TNFα) inhibitor(s) for any reason. Results through week 24 of this multicentre, randomised, double-blind, placebo-controlled study of

  20. Computer models of dipole magnets of a series 'VULCAN' for the ALICE experiment

    International Nuclear Information System (INIS)

    Vodop'yanov, A.S.; Shishov, Yu.A.; Yuldasheva, M.B.; Yuldashev, O.I.

    1998-01-01

    The paper is devoted to the construction of computer models for three magnets of the 'VULCAN' series in the framework of a differential approach for two scalar potentials. The distinctive property of these magnets is that they are 'warm' and their coils are of conic saddle shape. An algorithm for creating a computer model of the coils is suggested. The coil field is computed by the Biot-Savart law, and some of the integrals are evaluated with the help of analytical formulas. To compute three-dimensional magnetic fields by the finite element method with local accuracy control, two new algorithms are suggested. The first is based on a comparison of the fields computed by means of linear and quadratic shape functions. The second is based on a comparison of the field computed with the help of linear shape functions and a local classical solution. The distributions of the local accuracy control characteristics within the working part of the third magnet, and the other results of the computations, are presented
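
    The abstract states that the coil field is computed by the Biot-Savart law, with some integrals evaluated analytically; a minimal numerical sketch of the same law, discretizing an arbitrary coil polyline into straight segments (the conic saddle-shaped geometry itself is not reproduced here):

      import numpy as np

      MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

      def biot_savart(points, current, r):
          # B field (T) at r from a polyline coil given by `points` (N,3) carrying
          # `current` (A); each straight segment contributes
          # dB = mu0/(4*pi) * I * dl x r_vec / |r_vec|^3, evaluated at its midpoint.
          B = np.zeros(3)
          for a, b in zip(points[:-1], points[1:]):
              dl = b - a
              rv = r - 0.5 * (a + b)
              B += MU0 / (4 * np.pi) * current * np.cross(dl, rv) / np.linalg.norm(rv) ** 3
          return B

      # Check against the analytic on-axis field of a unit circular loop: B = mu0*I/(2*R).
      theta = np.linspace(0, 2 * np.pi, 2001)
      loop = np.column_stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)])
      print(biot_savart(loop, current=100.0, r=np.zeros(3)))  # ~ [0, 0, 6.28e-5]
      print(MU0 * 100.0 / 2.0)                                # 6.2832e-5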

  1. Experience in programming Assembly language of CDC CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Caldeira, A.D.

    1987-10-01

    Aiming to optimize the processing time of the BCG computer code on the CDC CYBER 170/750 computer, the INTERP subroutine was converted from FORTRAN-V to Assembly language. The BCG code was developed for solving the neutron transport equation by an iterative method, and the INTERP subroutine is the innermost loop of the code, carrying out 5 types of interpolation. The central processor unit Assembly language of the CDC CYBER 170/750 computer and its application in implementing the interpolation subroutine of the BCG code are described. (M.C.K.)

  2. Tensor Arithmetic, Geometric and Mathematic Principles of Fluid Mechanics in Implementation of Direct Computational Experiments

    Directory of Open Access Journals (Sweden)

    Bogdanov Alexander

    2016-01-01

    Full Text Available The architecture of a digital computing system determines the technical foundation of a unified mathematical language for exact arithmetic-logical description of phenomena and laws of continuum mechanics for applications in fluid mechanics and theoretical physics. The deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.

  3. EXPERIENCE OF USING CLOUD COMPUTING IN NETWORK PRODUCTS FOR SCHOOL EDUCATION

    Directory of Open Access Journals (Sweden)

    L. Sokolova

    2011-05-01

    Full Text Available We study data on the use of websites in the middle grades of secondary school and their influence on the formation of students' information culture and their level of training. The sites use Google's "cloud computing" technology, are accessible from any internet-connected computer, and do not require the use of the resources of the computer itself. The sites are devoid of any advertising, and do not require periodic backup, protection, or general operation by a system administrator. This simplifies their use in the educational process for schools of different levels. A statistical analysis of the sites was performed and the main trends in their use were identified.

  4. Stereotactic biopsy aided by a computer graphics workstation: experience with 200 consecutive cases.

    Science.gov (United States)

    Ulm, A J; Bova, F J; Friedman, W A

    2001-12-01

    The advent of modern computer technology has made it possible to examine not just the target point, but the entire trajectory in planning for stereotactic biopsies. Two hundred consecutive biopsies were performed by one surgeon, utilizing a computer graphics workstation. The target point, entry point, and complete trajectory were carefully scrutinized and adjusted to minimize potential complications. Pathologically abnormal tissue was obtained in 197 cases (98.5%). There was no mortality in this series. Symptomatic hemorrhages occurred in 4 cases (2%). Computer graphics workstations facilitate safe and effective biopsies in virtually any brain area.

  5. New Chicago-Indiana computer network will handle dataflow from world's largest scientific experiment

    CERN Multimedia

    2006-01-01

    "Massive quantities of data will soon begin flowing from the largest scientific instrument ever built into an international netword of computer centers, including one operated jointly by the University of Chicago and Indiana University." (1,5 page)

  6. Integration of computer technology into the medical curriculum: the King's experience

    Directory of Open Access Journals (Sweden)

    Vickie Aitken

    1997-12-01

    Full Text Available Recently, there have been major changes in the requirements of medical education which have set the scene for the revision of medical curricula (Towle, 1991; GMC, 1993). As part of the new curriculum at King's, the opportunity has been taken to integrate computer technology into the course through Computer-Assisted Learning (CAL), and to train graduates in core IT skills. Although the use of computers in the medical curriculum has up to now been limited, recent studies have shown encouraging steps forward (see Boelen, 1995). One area where there has been particular interest is the use of notebook computers to allow students increased access to IT facilities (Maulitz et al., 1996).

  7. Computational experience with a dual backtrack algorithm for identifying frequencies likely to create intermodulation problems

    Science.gov (United States)

    Morito, S.; Salkin, H. M.; Mathur, K.

    1981-02-01

    This paper describes the results of a computational study using a particular enumeration procedure, called a backtrack algorithm, to find the lowest order of radio-frequency intermodulation. The average lowest order and its standard deviation, the average computer time and its standard deviation, along with other relevant statistics, are obtained for a series of randomly generated problems with sets of five to 75 threat or source frequencies. The effects of other parameters, such as the guard band, the maximum number of concurrent threats, and the size of the frequency band, on the lowest order of intermodulation are examined by varying them during the computations. Statistics for these computer runs, along with those relating to terminating the algorithm when the lowest 'acceptable' order is reached, are presented in some detail. Brief conclusions follow a listing of the results.
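
    The dual backtrack algorithm itself is not reproduced in the abstract; as a naive brute-force baseline for the same problem, the lowest order of an intermodulation product (order = sum of the absolute coefficients) falling within a guard band of a victim frequency can be found as follows (frequencies and parameters are hypothetical):

      import itertools

      def lowest_intermod_order(sources, victim, guard, max_order=7, max_terms=2):
          # Return the lowest order m = sum(|a_i|) such that some combination
          # sum(a_i * f_i) of `max_terms` source frequencies lands within
          # `guard` of `victim`; None if none exists up to `max_order`.
          for order in range(2, max_order + 1):
              for freqs in itertools.combinations(sources, max_terms):
                  for coeffs in itertools.product(range(-order, order + 1), repeat=max_terms):
                      if sum(abs(c) for c in coeffs) != order:
                          continue
                      if abs(sum(c * f for c, f in zip(coeffs, freqs)) - victim) <= guard:
                          return order, freqs, coeffs
          return None

      # Hypothetical source/victim frequencies in MHz.
      hit = lowest_intermod_order([150.0, 155.2], victim=144.8, guard=0.05)
      print(hit)  # (3, (150.0, 155.2), (2, -1)): 2*150.0 - 155.2 = 144.8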

  8. Assessment of medical communication skills by computer: assessment method and student experiences

    NARCIS (Netherlands)

    Hulsman, R. L.; Mollema, E. D.; Hoos, A. M.; de Haes, J. C. J. M.; Donnison-Speijer, J. D.

    2004-01-01

    BACKGROUND A computer-assisted assessment (CAA) program for communication skills designated ACT was developed using the objective structured video examination (OSVE) format. This method features assessment of cognitive scripts underlying communication behaviour, a broad range of communication

  9. Foreign Experience in the Use of Computer Games in Teaching Children

    Directory of Open Access Journals (Sweden)

    Grigoryev I.S.,

    2017-01-01

    Full Text Available Computer games, as one of the most interesting phenomena related to computerization, are the subject of many foreign and domestic psychological studies. The article presents the characteristics of the following international research directions on computer (video) games: first, the use of computer games in education; second, the influence of computer games on children's cognitive domain, as well as on the formation of different skills. Such studies, however, do not consider computer games as an object in themselves, and stop only at specific aspects of attention or perception. We discuss the question of a common conceptual and methodological basis for the construction of research that would classify and interpret the individual studies in this area. The article lists the various effects, both positive and negative, of computer games on the mental development of the player, as well as their significant developmental and educational potential.

  10. Laminar Boundary-Layer Instabilities on Hypersonic Cones: Computations for Benchmark Experiments

    National Research Council Canada - National Science Library

    Robarge, Tyler W; Schneider, Steven P

    2005-01-01

    .... The STABL code package and its PSE-Chem stability solver are used to compute first and second mode instabilities for both sharp and blunt cones at wind tunnel conditions, with laminar mean flows...

  11. A Beowulf-class computing cluster for the Monte Carlo production of the LHCb experiment

    CERN Document Server

    Avoni, G; Bertin, A; Bruschi, M; Capponi, M; Carbone, A; Collamati, A; De Castro, S; Fabbri, Franco Luigi; Faccioli, P; Galli, D; Giacobbe, B; Lax, I; Marconi, U; Massa, I; Piccinini, M; Poli, M; Semprini-Cesari, N; Spighi, R; Vagnoni, V M; Vecchi, S; Villa, M; Vitale, A; Zoccoli, A

    2003-01-01

    The computing cluster built at Bologna to provide the LHCb Collaboration with a powerful Monte Carlo production tool is presented. It is a performance oriented Beowulf-class cluster, made of rack mounted commodity components, designed to minimize operational support requirements and to provide full and continuous availability of the computing resources. In this paper we describe the architecture of the cluster, and discuss the technical solutions adopted for each specialized sub-system.

  12. Computer systems experiences of users with and without disabilities an evaluation guide for professionals

    CERN Document Server

    Borsci, Simone; Federici, Stefano; Mele, Maria Laura

    2013-01-01

    This book provides the necessary tools for the evaluation of the interaction between the user who is disabled and the computer system that was designed to assist that person. The book creates an evaluation process that is able to assess the user's satisfaction with a developed system. Presenting a new theoretical perspective in the human computer interaction evaluation of disabled persons, it takes into account all of the individuals involved in the evaluation process.

  13. More than 2 years' experience with computer-aided irradiation planning in clinical routine

    International Nuclear Information System (INIS)

    Heller, H.; Rathje, J.

    1976-01-01

    This is a report on an irradiation planning system which has been used for about 2 years in the department of radiotherapy of the general hospital in Altona. The hardware and software, as well as the mathematical model for the description of the dose distribution, are described. The compromise between the required accuracy of the irradiation plan and the investment in computing effort and computer time is discussed. (orig./LN) [de

  14. Optical conoscopy of distorted uniaxial liquid crystals: computer simulation and experiment

    OpenAIRE

    Yu.A.Nastishin; O.B.Dovgyi; O.G.Vlokh

    2001-01-01

    We propose an algorithm to compute the conoscopic pattern for distorted uniaxial liquid crystal cells. The computed conoscopic figures for several cells (homeotropic, planar, twist, hybrid, hybrid under an external field) are compared to the corresponding experimental conoscopic patterns. We demonstrate that conoscopy can be used for the characterization of distorted nematic cells with director deformations which cannot be detected and unambiguously characterized by direct microscopy ...

  15. An Analysis of Creative Process Learning in Computer Game Activities Through Player Experiences

    OpenAIRE

    Wilawan Inchamnan

    2016-01-01

    This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning outcomes is described. Creative components were measured by examining task motivation and domain-relevant and creativity-relevant skill factors. The r...

  16. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    Energy Technology Data Exchange (ETDEWEB)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens [A. C. Camargo Cancer Center, Sao Paulo, SP (Brazil); Universidade Federal do Espirito Santo (HUCAM/UFES), Vitoria, ES (Brazil). Hospital Universitario Cassiano Antonio de Morais. Radiologia e Diagnostico por Imagem

    2017-03-15

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  17. Modeling Warm Dense Matter Experiments using the 3D ALE-AMR Code and the Move Toward Exascale Computing

    International Nuclear Information System (INIS)

    Koniges, A.; Eder, E.; Liu, W.; Barnard, J.; Friedman, A.; Logan, G.; Fisher, A.; Masers, N.; Bertozzi, A.

    2011-01-01

    The Neutralized Drift Compression Experiment II (NDCX II) is an induction accelerator planned for initial commissioning in 2012. The final design calls for a 3 MeV, Li+ ion beam, delivered in a bunch with characteristic pulse duration of 1 ns, and transverse dimension of order 1 mm. The NDCX II will be used in studies of material in the warm dense matter (WDM) regime, and ion beam/hydrodynamic coupling experiments relevant to heavy ion based inertial fusion energy. We discuss recent efforts to adapt the 3D ALE-AMR code to model WDM experiments on NDCX II. The code, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR), has physics models that include ion deposition, radiation hydrodynamics, thermal diffusion, anisotropic material strength with material time history, and advanced models for fragmentation. Experiments at NDCX-II will explore the process of bubble and droplet formation (two-phase expansion) of superheated metal solids using ion beams. Experiments at higher temperatures will explore equation of state and heavy ion fusion beam-to-target energy coupling efficiency. Ion beams allow precise control of local beam energy deposition, providing uniform volumetric heating on a timescale shorter than that of hydrodynamic expansion. The ALE-AMR code does not have any export control restrictions and is currently running at the National Energy Research Scientific Computing Center (NERSC) at LBNL, and has been shown to scale well to thousands of CPUs. New surface tension models are being implemented and applied to WDM experiments. Some of the approaches use a diffuse interface surface tension model that is based on the advective Cahn-Hilliard equations, which allows for droplet breakup in divergent velocity fields without the need for imposed perturbations. Other approaches require seeding or similar mechanisms for droplet breakup. We also briefly discuss the effects of the move to exascale computing and related computational changes on general modeling codes in fusion.
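
    The diffuse-interface surface tension model mentioned here builds on the advective Cahn-Hilliard equations; in their standard form (the code may use a variant) they read:

      \frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi
          = \nabla\cdot\left(M\,\nabla\mu\right),
      \qquad
      \mu = f'(\phi) - \epsilon^{2}\,\nabla^{2}\phi,

    where \phi is the phase field, \mathbf{u} the advecting velocity, M the mobility, f(\phi) a double-well free energy, and \epsilon the interface width. The smooth variation of \phi across the diffuse interface carries the surface tension force, which is why droplets can break up in divergent velocity fields without imposed perturbations.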

  18. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    International Nuclear Information System (INIS)

    Virtanen, E.; Haapalehto, T.; Kouhia, J.

    1997-01-01

    Three experiments were conducted to study the behaviour of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes, RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modeled and the rest of the facility was given as a boundary condition. The results show that both codes reproduce the behaviour of the primary side of the steam generator well. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than was measured in the experiments. (orig.)

  19. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.; Haapalehto, T. [Lappeenranta Univ. of Technology, Lappeenranta (Finland); Kouhia, J. [VTT Energy, Nuclear Energy, Lappeenranta (Finland)

    1995-09-01

    Three experiments were conducted to study the behaviour of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes, RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modelled and the rest of the facility was given as a boundary condition. The results show that both codes reproduce the behaviour of the primary side of the steam generator well. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than was measured in the experiments.

  20. Potential impact of DOM accumulation on fCO2 and carbonate ion computations in ocean acidification experiments

    Directory of Open Access Journals (Sweden)

    A. Oschlies

    2012-10-01

    Full Text Available The internal consistency of measurements and computations of components of the CO2 system, namely total alkalinity (AT), total dissolved carbon dioxide (CT), CO2 fugacity (fCO2), and pH, has been confirmed repeatedly in open-ocean studies when the CO2 system had been overdetermined. Differences between measured and computed properties, such as ΔfCO2 (= [fCO2(measured) − fCO2(computed from AT and CT)] / fCO2(measured) × 100), are usually below 5%. Recently, Hoppe et al. (2012) provided evidence of significantly larger ΔfCO2 in some experimental setups. These observations are currently not well understood. Here we discuss a case from a series of phytoplankton culture experiments with ΔfCO2 of up to about 25%. ΔfCO2 varied systematically during the course of these experiments and showed a clear correlation with the accumulation of dissolved organic matter (DOM). Culture and mesocosm experiments are often carried out under high initial nutrient concentrations, yielding high biomass concentrations that in turn often lead to a substantial build-up of DOM. In such experiments, DOM can reach concentrations much higher than typically observed in the open ocean. To the extent that DOM includes organic acids and bases, it will contribute to the alkalinity of the seawater contained in the experimental device. Our analysis suggests that whenever substantial amounts of DOM are produced during the experiment, standard computer programmes used to compute CO2 fugacity can underestimate true fCO2 significantly when the computation is based on AT and CT. Unless the effect of DOM-alkalinity can be accounted for, this might lead to significant errors in the interpretation of the system under consideration with respect to the experimentally applied CO2 perturbation. Errors in the inferred fCO2 can misguide the development of parameterisations used in simulations with global carbon cycle models in future CO2 scenarios. Overdetermination of the CO2 system in
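
    With the definition above, the consistency check is a one-line computation; a toy example with made-up values of the magnitude discussed in the abstract:

      def delta_fco2(measured, computed):
          # Percent difference between measured fCO2 and fCO2 computed from AT and CT.
          return (measured - computed) / measured * 100.0

      # Hypothetical microatmosphere values of the size discussed in the abstract:
      print(delta_fco2(measured=400.0, computed=300.0))  # 25.0 (%)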

  1. Reconstruction and identification of electrons in the Atlas experiment. Setup of a Tier 2 of the computing grid

    International Nuclear Information System (INIS)

    Derue, F.

    2008-03-01

    The origin of the mass of elementary particles is linked to the electroweak symmetry breaking mechanism. Its study will be one of the main efforts of the Atlas experiment at the Large Hadron Collider at CERN, starting in 2008. In most cases, studies will be limited by our knowledge of the detector performance, such as the precision of the energy reconstruction or the efficiency of particle identification. This manuscript presents work dedicated to the reconstruction of electrons in the Atlas experiment, using simulated data and data taken during the combined test beam of 2004. The analysis of Atlas data requires a huge amount of computing and storage resources, which led to the development of a worldwide computing grid. (author)

  2. A reminder on millisecond timing accuracy and potential replication failure in computer-based psychology experiments: An open letter.

    Science.gov (United States)

    Plant, Richard R

    2016-03-01

    There is an ongoing 'replication crisis' across the field of psychology, in which researchers, funders, and members of the public are questioning the results of some scientific studies and the validity of the data they are based upon. However, few have considered that a growing proportion of research in modern psychology is conducted using a computer. Could it simply be that the hardware and software, or experiment generator, used to run the experiment is itself a cause of millisecond timing error and subsequent replication failure? This article serves as a reminder that millisecond timing accuracy in psychology studies remains an important issue and that care needs to be taken to ensure that studies can be replicated on current computer hardware and software.
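
    As a first-order illustration of the point being made, the granularity of the host machine's software clock can be probed directly (a sketch using only the Python standard library; note that real stimulus/response timing error also involves display refresh, input polling, and OS scheduling, which is the article's broader concern):

      import time

      def timer_resolution(samples=100_000):
          # Estimate (an upper bound on) the smallest observable increment
          # of the high-resolution clock by polling it in a tight loop.
          smallest = float("inf")
          t0 = time.perf_counter()
          for _ in range(samples):
              t1 = time.perf_counter()
              if t1 != t0:
                  smallest = min(smallest, t1 - t0)
                  t0 = t1
          return smallest

      print(f"observed clock granularity: {timer_resolution() * 1e6:.3f} microseconds")
      print(f"reported resolution: {time.get_clock_info('perf_counter').resolution} s")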

  3. H and other transfers in enzymes and in solution: theory and computations, a unified view. 2. Applications to experiment and computations.

    Science.gov (United States)

    Marcus, R A

    2007-06-21

    Equations obtained in part I for the free-energy barrier to one-step enzymatic reactions between bound reactants are discussed. The rate is expressed in terms of λo (protein reorganization energy), ΔG° (standard free energy of reaction of the H-transfer step), a bond breaking/bond forming term, w (work terms), and an H-transmission property. Two alternative approximations for the coupling of the bond breaking/bond forming term and the protein are distinguished experimentally in favorable cases by the ΔG° at which the maximum deuterium kinetic isotope effect occurs. Plots of log rate versus ΔG° and properties such as ΔS* and ΔS° are discussed. The weak or zero temperature dependence of the kinetic isotope effect for wild-type enzymes operating under physiological conditions is interpreted in terms of a vanishing (or isotopically insensitive) w plus transfer from the lowest H-state. Static and dynamic protein flexibility is discussed. While the many correlations accessible for electron transfers are not available for H-transfers in enzymes, a combination of experiment, computation, and analytical approaches can assist in evaluating the utility of the present equations and in suggesting further experiments and computations. A protein reorganization energy λo is obtained in the literature from the extended valence bond formalism, where diabatic electronic states are used. A method is suggested for extracting it when a bond distance difference coordinate is used instead. The results may provide a bridge between the two approaches.
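
    The equations of part I are not reproduced in the abstract; for orientation, the classical Marcus form that the quantities named above (λo, ΔG°, w) extend is

      k \propto \kappa_{H}\,\exp\!\left(-\frac{\Delta G^{*}}{k_{B} T}\right),
      \qquad
      \Delta G^{*} = w + \frac{\left(\lambda_{o} + \Delta G^{o}\right)^{2}}{4\,\lambda_{o}},

    where κH stands for the H-transmission property; the paper's actual expressions add the bond breaking/bond forming term to this barrier.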

  4. Modeling warm dense matter experiments using the 3D ALE-AMR code and the move toward exascale computing

    Directory of Open Access Journals (Sweden)

    Koniges Alice

    2013-11-01

    Full Text Available The Neutralized Drift Compression Experiment II (NDCX II) is an induction accelerator planned for initial commissioning in 2012. The final design calls for a 3 MeV, Li+ ion beam, delivered in a bunch with characteristic pulse duration of 1 ns, and transverse dimension of order 1 mm. The NDCX II will be used in studies of material in the warm dense matter (WDM) regime, and ion beam/hydrodynamic coupling experiments relevant to heavy ion based inertial fusion energy. We discuss recent efforts to adapt the 3D ALE-AMR code to model WDM experiments on NDCX II. The code, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR), has physics models that include ion deposition, radiation hydrodynamics, thermal diffusion, anisotropic material strength with material time history, and advanced models for fragmentation. Experiments at NDCX-II will explore the process of bubble and droplet formation (two-phase expansion) of superheated metal solids using ion beams. Experiments at higher temperatures will explore equation of state and heavy ion fusion beam-to-target energy coupling efficiency. Ion beams allow precise control of local beam energy deposition, providing uniform volumetric heating on a timescale shorter than that of hydrodynamic expansion. We also briefly discuss the effects of the move to exascale computing and related computational changes on general modeling codes in fusion.

  5. ISLAM PROJECT: Interface between the signals from various experiments of a Van Graaff accelerator and PDP 11/44 computer

    International Nuclear Information System (INIS)

    Martinez Piquer, T. A.; Yuste Santos, C.

    1986-01-01

    This paper describes an interface between the signals from an in-beam experiment of a Van de Graaff accelerator and a PDP 11/44 computer. The information corresponding to one spectrum is taken from one digital voltammeter and is processed by means of equipment controlled by an M6809 microprocessor. The software package has been developed in assembly language and has a size of 1/2 K. (Author) 12 refs

  6. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Science.gov (United States)

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  7. Integrating psychoeducation in a basic computer skills course for people suffering from social anxiety: participants' experiences

    Directory of Open Access Journals (Sweden)

    Löhr HD

    2011-08-01

    Full Text Available Hildegard D Löhr (1,2), Jan H Rosenvinge (1,3), Rolf Wynn (2,4); 1 Division of General Psychiatry, University Hospital of North Norway; 2 Telemedicine Research Group, Department of Clinical Medicine, Faculty of Health Sciences; 3 Department of Psychology, Faculty of Health Sciences, University of Tromsø; 4 Division of Addiction and Specialized Psychiatry, University Hospital of North Norway, Tromsø, Norway. Abstract: We describe a psychoeducational program integrated in a basic computer skills course for participants suffering from social anxiety. The two main aims of the course were that the participants learn basic computer skills and that they learn to cope better with social anxiety. Computer skills were taught by a qualified teacher. Psychoeducation and cognitive therapy skills, covering topics such as anxiety coping, self-acceptance, and self-regulation, were taught by a clinical psychologist. Thirteen of 16 participants completed the course, which lasted 11 weeks. A qualitative analysis was performed, drawing on observations during the course and on interviews with the participants. The participants were positive about the integration of psychoeducation sessions in the computer course, and described positive outcomes for both elements, including improved computer skills, improved self-esteem, and reduced social anxiety. Most participants were motivated to undertake further occupational rehabilitation after the course. Keywords: cognitive therapy, information technology, occupational rehabilitation, psychoeducation, self-help, social anxiety

  8. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2017-12-01

    Full Text Available Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  9. Adubarroz: a brazilian experience for fertilization and liming recommendation of irrigated rice via computational system

    Directory of Open Access Journals (Sweden)

    Felipe de Campos Carmona

    Full Text Available ABSTRACT: Recommendations for fertilizing irrigated rice in southern Brazil have been constantly evolving over the years. In this process, factors such as the development cycle of the varieties and the sowing period have gained influence. Thus, computational tools that take these and other important aspects into account can potentiate the response of rice to fertilization. This study describes the computer program "ADUBARROZ". The software provides recommendations of fertilizer rates and liming requirements for irrigated rice, based on information entered by the user. The system takes into account various factors that regulate the crop's response to fertilization. A final report is produced with a graphical representation of input management over time.

  10. [Personal experience with the use of a computer in an intensive care unit].

    Science.gov (United States)

    Gismondi, A; Colonna, S S; Micalella, F

    1982-01-01

    In January 1977, the computerized system Hewlett-Packard HP 5600 A was set up in the Resuscitation Center of the "V. Fazzi" Hospital. It is able to perform the following functions: management of the monitored data, management of the laboratory data, management of the staff remarks, execution of special calculations, and logging. The advantages shown by the presence of a computer in a Resuscitation Center justify, in the opinion of the Authors, the expense and the organizational problems. In fact, the monitoring and computation of both the vital and physical data and of the laboratory parameters are essential in the therapies selected for a critically ill patient.

  11. A high speed, selective multi-ADC to computer data transfer interface, for nuclear physics experiments

    International Nuclear Information System (INIS)

    Arctaedius, T.; Ekstroem, R.E.

    1986-08-01

    A link connecting up to fifteen Analog to Digital Converters with a computer, through a Direct Memory Access interface, is described. The interface decides which of the connected ADCs participated in an event, and transfers the output data from these to the computer, accompanied by a 2-byte word identifying the participating ADCs. This data format can be recorded on tape without further transformation, and is easy to unfold in the off-line analysis. Data transfer is accomplished in less than a few microseconds, which is made possible by the use of high-speed TTL circuits. (authors)
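
    The 2-byte identification word presumably encodes the participating converters as a bitmask over the fifteen possible ADCs (the abstract does not give the exact encoding); unfolding such a record offline is then straightforward:

      def participating_adcs(word):
          # Decode a 16-bit event word into the list of participating ADC numbers,
          # assuming (hypothetically) that bit i set <=> ADC i+1 fired.
          return [i + 1 for i in range(15) if word & (1 << i)]

      # Example event record: identification word followed by one value per flagged ADC.
      event = [0b0000_0000_0010_0101, 1023, 87, 4095]  # ADCs 1, 3, 6 fired
      adcs = participating_adcs(event[0])
      print(dict(zip(adcs, event[1:])))  # {1: 1023, 3: 87, 6: 4095}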

  12. Distributed management of scientific projects - An analysis of two computer-conferencing experiments at NASA

    Science.gov (United States)

    Vallee, J.; Gibbs, B.

    1976-01-01

    Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.

  13. Psychophysiological Assessment Of Fear Experience In Response To Sound During Computer Video Gameplay

    DEFF Research Database (Denmark)

    Garner, Tom Alexander; Grimshaw, Mark

    2013-01-01

    The potential value of a looping biometric feedback system as a key component of adaptive computer video games is significant. Psychophysiological measures are essential to the development of an automated emotion recognition program, capable of interpreting physiological data into models of affect ... and systematically altering the game environment in response. This article presents empirical data whose analysis advocates electrodermal activity and electromyography as suitable physiological measures to work effectively within a computer video game-based biometric feedback loop, within which sound...

  14. ATLAS Distributed Computing experience and performance during the LHC Run-2

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00081160; The ATLAS collaboration

    2016-01-01

    ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of the Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic CPU and data intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of...

  15. (The evolution of) post-secondary education: a computational model and experiments

    Czech Academy of Sciences Publication Activity Database

    Ortmann, Andreas; Slobodyan, Sergey

    -, č. 355 (2008), s. 1-46 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:MSM0021620846 Keywords : post-secondary education * for-profit higher education providers * computational simulations Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp355.pdf

  16. (The evolution of) post-secondary education: a computational model and experiments

    Czech Academy of Sciences Publication Activity Database

    Ortmann, Andreas; Slobodyan, Sergey

    -, č. 355 (2008), s. 1-46 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : post-secondary education * for-profit higher education providers * computational simulations Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp355.pdf

  17. ATLAS Distributed Computing experience and performance during the LHC Run-2

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00081160; The ATLAS collaboration

    2017-01-01

    ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic CPU and data intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of the...

  18. Status of the Grid Computing for the ALICE Experiment in the Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Adamová, Dagmar; Chudoba, Jiří; Kouba, T.; Lorenzo, P.M.; Saiz, P.; Švec, Jan; Hampl, Josef

    2010-01-01

    Roč. 219, č. 7 (2010), s. 1-9 E-ISSN 1742-6596 Institutional research plan: CEZ:AV0Z10480505; CEZ:AV0Z10100521 Keywords : accelerators * PARTICLE PHYSICS * computer data analysis Subject RIV: BF - Elementary Particles and High Energy Physics

  19. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  20. COMPUTATIONAL EXPERIENCE IN SOLVING LARGE LINEAR MATRIX EQUATIONS FOR AUTOMATIC CONTROL

    Directory of Open Access Journals (Sweden)

    Vasile Sima

    2004-12-01

    Full Text Available State-of-the-art, uni-processor linear matrix equation solvers for automatic control computations are investigated and compared for various problem sizes. General-purpose SLICOT solvers are the most efficient ones for small-size problems, but they cannot compete for larger problems with specialized solvers designed for certain problem classes.
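
    A typical instance of such a linear matrix equation is the continuous-time Lyapunov equation AX + XA^T = -Q. As a hedged illustration only — using SciPy's general-purpose solver rather than the SLICOT routines benchmarked here, with an arbitrary stable test matrix — a sketch might look like:

        import numpy as np
        from scipy import linalg

        rng = np.random.default_rng(0)
        n = 100
        A = -np.eye(n) + 0.05 * rng.standard_normal((n, n))  # stable test matrix
        Q = np.eye(n)
        # solve A X + X A^T = -Q (continuous-time Lyapunov equation)
        X = linalg.solve_continuous_lyapunov(A, -Q)
        print(np.allclose(A @ X + X @ A.T, -Q))              # residual check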

  1. African American Faculty Women Experiences of Underrepresentation in Computer Technology Positions in Higher Education

    Science.gov (United States)

    King, Dolores

    2013-01-01

    African American women are underrepresented in computer technology disciplines in institutions of higher education throughout the United States. Although equitable gender representation is progressing in most fields, much less information is available on why institutions are still lagging in workforce diversity, a problem which can be lessened by…

  2. Experiences of Student Mathematics-Teachers in Computer-Based Mathematics Learning Environment

    Science.gov (United States)

    Karatas, Ilhan

    2011-01-01

    Computer technology in mathematics education enabled students to find many opportunities for investigating mathematical relationships, hypothesizing, and making generalizations. These opportunities were provided to pre-service teachers through a faculty course. At the end of the course, the teachers were assigned project tasks involving…

  3. CEGB philosophy and experience with fault-tolerant micro-computer application for power plant controls

    International Nuclear Information System (INIS)

    Clinch, D.A.L.

    1986-01-01

    From the mid-1960s until the late 1970s, automatic modulating control of the main boiler plant on CEGB fossil-fired power stations was largely implemented with hard-wired electronic equipment. Mid-way through this period, the CEGB formulated a set of design requirements for this type of equipment; these laid particular emphasis on the fault tolerance of a control system and specified the nature of the interfaces with a control desk and with plant regulators. However, the automatic control of an Advanced Gas-Cooled Reactor (AGR) is based upon measured values derived by processing a large number of thermocouple signals. This is more readily implemented digitally than with hard-wired equipment. Essential to the operation of an AGR power station is a data processing (DP) computer for monitoring the plant; so the first group of AGR power stations, designed in the 1960s, employed their DP computers for modulating control. Since the late 1970s, automatic modulating control of major plants, for new power stations and for re-fits on established power stations, has been implemented with micro-computers. Wherever practicable, the policy formulated earlier for hard-wired equipment has been retained, particularly in respect of the interfaces. This policy forms the foundation of the fault tolerance of these micro-computer systems

  4. Solution of the Schrodinger Equation for One-Dimensional Anharmonic Potentials: An Undergraduate Computational Experiment

    Science.gov (United States)

    Beddard, Godfrey S.

    2011-01-01

    A method of solving the Schrodinger equation using a basis set expansion is described and used to calculate energy levels and wavefunctions of the hindered rotation of ethane and the ring puckering of cyclopentene. The calculations were performed using a computer algebra package and are straightforward enough for undergraduates to…
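
    The core idea — expand the Hamiltonian in a known basis and diagonalize a truncated matrix — is easy to sketch numerically. The following illustration (not the article's computer-algebra treatment) handles a quartic anharmonic oscillator V = x^2/2 + lam*x^4 in the harmonic-oscillator basis, with hbar = m = omega = 1 and illustrative parameters:

        import numpy as np

        def anharmonic_levels(lam=0.1, nbasis=60, nlevels=5):
            n = np.arange(nbasis)
            # position operator in the harmonic-oscillator basis (tridiagonal)
            X = np.zeros((nbasis, nbasis))
            off = np.sqrt((n[:-1] + 1) / 2.0)   # <n|x|n+1> matrix elements
            X[n[:-1], n[:-1] + 1] = off
            X[n[:-1] + 1, n[:-1]] = off
            # H = H_harmonic + lam * x^4, with H_harmonic = diag(n + 1/2)
            H = np.diag(n + 0.5) + lam * np.linalg.matrix_power(X, 4)
            return np.linalg.eigvalsh(H)[:nlevels]

        print(anharmonic_levels())  # lowest levels; they converge as nbasis grows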

  5. Computer aided instruction. Preliminary experience in the Radiological Sciences Institute of the University of Milan

    International Nuclear Information System (INIS)

    Gardani, G.; Bertoli, M.A.; Bellomi, M.

    1987-01-01

    Computerised instruction means teaching by computer using a program that alternates information with self-checking multiple-choice questions. This system was used to create a fully computerized lesson on the diagnosis and treatment of breast cancer, which was then tested on a small group of medical students attending the Radiology School of the Milan University Institute of Radiological Sciences. At the end of the test, the students were asked to complete a questionnaire, which was then analysed. The computer lesson consisted of 66 text messages and 21 self-checking questions. It aroused considerable interest, though the most common reason was curiosity about a novel system. The degree of fatigue caused was modest, despite the fact that the computer lesson was at least as demanding as a traditional lesson, if not more so. The level of learning was considered high and was optimised by the use of self-checking questions, which were considered an essential element. However, no student agreed to sit an official examination, even interactively, using the computer

  6. Computing and data handling requirements for SSC [Superconducting Super Collider] and LHC [Large Hadron Collider] experiments

    International Nuclear Information System (INIS)

    Lankford, A.J.

    1990-05-01

    A number of issues for computing and data handling in the online environment at future high-luminosity, high-energy colliders, such as the Superconducting Super Collider (SSC) and Large Hadron Collider (LHC), are outlined. Requirements for trigger processing, data acquisition, and online processing are discussed. Some aspects of possible solutions are sketched. 6 refs., 3 figs

  7. Computer-Based Legal Education at the University of Illinois: A Report of Two Years' Experience

    Science.gov (United States)

    Maggs, Peter B.; Morgan, Thomas D.

    1975-01-01

    Describes experimentation with the Plato IV computer-assisted method of teaching law at the University of Illinois College of Law: development and testing of programs for teaching Future Interests and Offer and Acceptance, and law-related work currently being done on Plato. Potential, limitations, and student enthusiasm are summarized. (JT)

  8. Computing Camps for Girls : A First-Time Experience at the University of Limerick

    NARCIS (Netherlands)

    McInerney, Clare; Lamprecht, A.L.; Margaria, Tiziana

    2018-01-01

    Increasing the number of females in ICT-related university courses has been a major concern for several years. In 2015, we offered a girls-only computing summer camp for the first time, as a new component in our education and outreach activities to foster students’ interest in our discipline. In

  9. COMPUTING EXPERIMENT BY DEFINITION OF AERODYNAMIC CHARACTERISTICS OF A CYLINDRICAL BEAM WITH A STRAKE

    Directory of Open Access Journals (Sweden)

    V. A. Ivchin

    2014-01-01

    Full Text Available Using computational fluid dynamics (CFD) methods, the aerodynamic effects of a strake mounted on a tail boom of cylindrical cross-section are investigated, and the constant and variable forces acting on the boom are estimated as a function of the strake's angular position.

  10. Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges

    Science.gov (United States)

    Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil

    2017-01-01

    The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…

  11. Effects of Pictures, Age, and Experience on Learning To Use a Computer Program.

    Science.gov (United States)

    Van Der Meij, Hans; Gellevij, Mark

    2002-01-01

    Examines the effects of pictures (screen captures and input devices) on documentation for older novice computer users (aged 26 to 69 years). Argues that screen captures help reduce memory load and support visual scanning for older users. Finds no support for the prediction that the presence of pictures would make some manuals more effective than…

  12. Cloud Computing Technologies in Writing Class: Factors Influencing Students' Learning Experience

    Science.gov (United States)

    Wang, Jenny

    2017-01-01

    The proposed interactive online group within the cloud computing technologies as a main contribution of this paper provides easy and simple access to the cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and teacher on after-class group writing assignment activities. Therefore, this study…

  13. Physical Computing for STEAM Education: Maker-Educators' Experiences in an Online Graduate Course

    Science.gov (United States)

    Hsu, Yu-Chang; Ching, Yu-Hui; Baldwin, Sally

    2018-01-01

    This research explored how K-16 educators learned physical computing, and developed as maker-educators in an online graduate course. With peer support and instructor guidance, these educators designed maker projects using Scratch and Makey Makey, and developed educational maker proposals with plans of teaching the topics of their choice in STEAM…

  14. Do aggressive people play violent computer games in a more aggressive way? Individual difference and idiosyncratic game-playing experience.

    Science.gov (United States)

    Peng, Wei; Liu, Ming; Mou, Yi

    2008-04-01

    This study investigates whether individual difference influences the idiosyncratic experience of game playing. In particular, we examine the relationship between the game player's physically aggressive personality and the aggressiveness of the player's game playing in violence-oriented video games. Screen video streams of 40 individual participants' game playing were captured and content analyzed. Participants' physical aggression was measured before the game play. The results suggest that people with a more physically aggressive personality engage in a more aggressive style of playing, after controlling for differences in gender and previous gaming experience. Implications of these findings and directions for future studies are discussed.

  15. Characterization of Aerodynamic Interactions with the Mars Science Laboratory Reaction Control System Using Computation and Experiment

    Science.gov (United States)

    Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of MSL's flight data from entry showed that the capsule flew much as predicted. This paper describes how the MSL aerodynamics team used engineering analyses, computational codes, and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns, and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided. Then, a brief description is presented of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin of validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.

  16. Computer control of the titanium getter system on the tandem mirror experiment-upgrade (TMX-U)

    International Nuclear Information System (INIS)

    McAlice, A.J.; Bork, R.G.; Clower, C.A.; Moore, T.L.; Lang, D.D.; Pico, R.E.

    1983-01-01

    Gettering has been a standard technique for achieving high-quality vacuum in fusion experiments for some time. On Lawrence Livermore National Laboratory's Tandem Mirror Experiment (TMX-U), an extensive gettering system is utilized with liquid-nitrogen-cooled panels to provide fast pumping during each physics experiment. The getter wires are an 85% titanium, 15% tantalum alloy directly heated by an electrical current. TMX-U has 162 getter power-supply channels; each channel supplies approximately 106 A of regulated power to each getter for a 60-s cycle. In the vacuum vessel, the getter wires are organized into poles, or arrays. On each pole there are six getter wires, each cabled to the exterior of the vessel. This arrangement allows the power supplies to be switched from getter wire to getter wire as the individual wires deteriorate after 200 to 300 gettering cycles. To control the getter power supplies, we will install a computer system to operate the system and document the performance of each getter circuit. This computer system will control the 162 power supplies via a Computer Automated Measurement and Control (CAMAC) architecture with a fiber-optic serial highway. Getter wire history will be stored on the built-in 10-megabyte disc drive, with new entries backed up daily on a floppy disc. Overall, this system will allow positive tracking of getter wire condition, document the total gettering performance, and predict getter maintenance/changeover cycles. How we will employ the computer system to enhance the getter system is the subject of this paper

  17. The Mailbox Computer System for the IAEA verification experiment on HEU downblending at the Portsmouth Gaseous Diffusion Plant

    International Nuclear Information System (INIS)

    Aronson, A.L.; Gordon, D.M.

    2000-01-01

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a "verification experiment" at the plant with respect to downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of uranium hexafluoride (UF6). During the period December 1997 through July 1998, the IAEA carried out the requested verification experiment. The verification approach used for this experiment included, among other measures, the entry of process-operational data by the facility operator on a near-real-time basis into a "mailbox" computer located within a tamper-indicating enclosure sealed by the IAEA.

  18. Interaction between computational modelling and experiments for vacuum consumable arc remelting

    International Nuclear Information System (INIS)

    Bertram, L.A.; Zanner, F.J.

    1980-01-01

    A combined computational-experimental modelling effort is currently underway to characterize the vacuum consumable arc remelt process. This effort involves the coupling of experimental results with a magnetohydrodynamic flow model which is capable of time-accurate solutions of the interdependent fluid flow-solidification process in the ingot. Models such as this are driven by boundary conditions. Considerable data have been compiled from direct observation of the electrode tip and molten pool surface by means of high-speed photography, in order to gain an understanding of the processes at the pool surface and the appropriate corresponding boundary conditions. The crucible wall/molten metal meniscus conditions are less well understood. Pool volumes are computed at different melting currents and show reasonable agreement with experimentally determined values. Current flow through the ingot is evaluated numerically, and the results indicate that a significant portion of the melt current does not reach the interior of the ingot. A U-6 wt.% Nb alloy was used

  19. Computational organic chemistry: bridging theory and experiment in establishing the mechanisms of chemical reactions.

    Science.gov (United States)

    Cheng, Gui-Juan; Zhang, Xinhao; Chung, Lung Wa; Xu, Liping; Wu, Yun-Dong

    2015-02-11

    Understanding the mechanisms of chemical reactions, especially catalysis, has been an important and active area of computational organic chemistry, and close collaborations between experimentalists and theorists represent a growing trend. This Perspective provides examples of such productive collaborations. The understanding of various reaction mechanisms and the insight gained from these studies are emphasized. The applications of various experimental techniques in elucidation of reaction details as well as the development of various computational techniques to meet the demand of emerging synthetic methods, e.g., C-H activation, organocatalysis, and single electron transfer, are presented along with some conventional developments of mechanistic aspects. Examples of applications are selected to demonstrate the advantages and limitations of these techniques. Some challenges in the mechanistic studies and predictions of reactions are also analyzed.

  20. A Randomized Exchange Algorithm for Computing Optimal Approximate Designs of Experiments

    KAUST Repository

    Harman, Radoslav

    2018-01-17

    We propose a class of subspace ascent methods for computing optimal approximate designs that covers both existing as well as new and more efficient algorithms. Within this class of methods, we construct a simple, randomized exchange algorithm (REX). Numerical comparisons suggest that the performance of REX is comparable or superior to the performance of state-of-the-art methods across a broad range of problem structures and sizes. We focus on the most commonly used criterion of D-optimality that also has applications beyond experimental design, such as the construction of the minimum volume ellipsoid containing a given set of data-points. For D-optimality, we prove that the proposed algorithm converges to the optimum. We also provide formulas for the optimal exchange of weights in the case of the criterion of A-optimality. These formulas enable one to use REX for computing A-optimal and I-optimal designs.
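
    REX itself randomizes exchanges of weight between pairs of candidate design points; the full algorithm is in the paper. As a minimal, hedged illustration of the underlying D-optimal approximate design problem — using the classical multiplicative reweighting algorithm rather than REX — one might write:

        import numpy as np

        def multiplicative_d_optimal(X, iters=500):
            """Approximate D-optimal design weights over the candidate rows of X."""
            n, m = X.shape
            w = np.full(n, 1.0 / n)
            for _ in range(iters):
                M = X.T @ (w[:, None] * X)                 # information matrix
                Minv = np.linalg.inv(M)
                d = np.einsum('ij,jk,ik->i', X, Minv, X)   # variance function
                w *= d / m                                 # multiplicative update
                w /= w.sum()
            return w

    Points whose variance function exceeds the average gain weight; at the optimum, d(x) <= m for every candidate point, with equality on the support (the Kiefer-Wolfowitz equivalence theorem).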

  1. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world had contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  2. Coherent structures in granular crystals from experiment and modelling to computation and mathematical analysis

    CERN Document Server

    Chong, Christopher

    2018-01-01

    This book summarizes a number of fundamental developments at the interface of granular crystals and the mathematical and computational analysis of some of their key localized nonlinear wave solutions. The subject presents a blend of the appeal of granular crystals as a prototypical engineering testbed for a variety of diverse applications, the novelty in the nonlinear physics of its coherent structures, and the tractability of a series of mathematical and computational techniques to analyse them. While the focus is on principal one-dimensional solutions such as shock waves, traveling waves, and discrete breathers, numerous extensions of the discussed patterns, e.g., in two dimensions, chains with defects, and heterogeneous settings, as well as other recent developments, are discussed. The book appeals to researchers in the field, as well as to graduate and advanced undergraduate students. It will be of interest to mathematicians, physicists and engineers alike.

  3. Computer vision-based automatic beverage dispenser prototype for user experience studies

    OpenAIRE

    Merchán, Fernando; Valderrama, Elba; Poveda, Martín

    2017-01-01

    This paper presents several aspects of the implementation of a prototype of an automatic beverage dispenser with computer vision functionalities. The system features touchless technologies, including face recognition for user identification and hand gesture recognition for beverage selection. This prototype is a test platform to explore the acceptance of these technologies by consumers and to compare them with other technologies such as touch screens. We present both the technical aspects of the de...

  4. Hypersonic ramjet experiment project. Phase 1: Computer program description, ramjet and scramjet cycle performance

    Science.gov (United States)

    Jackson, R. J.; Wang, T. T.

    1974-01-01

    A computer program was developed to describe the performance of ramjet and scramjet cycles. The program performs one dimensional calculations of the equilibrium, real-gas internal flow properties of the engine. The program can be used for the following: (1) preliminary design calculation and (2) design analysis of internal flow properties corresponding to stipulated flow areas. Only the combustion of hydrogen in air is considered in this case.
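
    The program performs equilibrium, real-gas computations; by way of contrast, the zeroth-order behaviour of the cycle can be sketched with a calorically perfect ideal gas (a far simpler model than the one described here, with illustrative values only). In that limit, the ram stagnation-temperature ratio fixes the ideal thermal efficiency:

        import numpy as np

        gamma = 1.4
        M0 = np.array([2.0, 3.0, 4.0, 5.0])    # flight Mach numbers
        tau_r = 1 + 0.5 * (gamma - 1) * M0**2  # ram stagnation-temperature ratio
        eta_th = 1 - 1 / tau_r                 # ideal ramjet-cycle thermal efficiency
        for m, e in zip(M0, eta_th):
            print(f"M0 = {m:.1f}: ideal thermal efficiency = {e:.2f}")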

  5. Fan interaction noise reduction using a wake generator: experiments and computational aeroacoustics

    Science.gov (United States)

    Polacsek, C.; Desbois-Lavergne, F.

    2003-08-01

    A control grid (wake generator) aimed at reducing rotor-stator interaction modes in fan engines when mounted upstream of the rotor has been studied here. This device complements other active noise control systems currently proposed. The compressor model of the instrumented ONERA CERF-rig is used to simulate suitable conditions. The design of the grid is drafted out using semi-empirical models for wake and potential flow, and experimentally achieved. Cylindrical rods are able to generate a spinning mode of the same order and similar level as the interaction mode. Mounting the rods on a rotating ring allows for adjusting the phase of the control mode so that an 8 dB sound pressure level (SPL) reduction at the blade passing frequency is achieved when the two modes are out of phase. Experimental results are assessed by a numerical approach using computational fluid dynamics (CFD). A Reynolds averaged Navier-Stokes 2-D solver, developed at ONERA, is used to provide the unsteady force components on blades and vanes required for acoustics. The loading noise source term of the Ffowcs Williams and Hawkings equation is used to model the interaction noise between the sources, and an original coupling to a boundary element method (BEM) code is realized to take account of the inlet geometry effects on acoustic in-duct propagation. Calculations using the classical analytical the Green function of an infinite annular duct are also addressed. Simple formulations written in the frequency domain and expanded into modes are addressed and used to compute an in-duct interaction mode and to compare with the noise reduction obtained during the tests. A fairly good agreement between predicted and measured SPL is found when the inlet geometry effects are part of the solution (by coupling with the BEM). Furthermore, computed aerodynamic penalties due to the rods are found to be negligible. These results partly validate the computation chain and highlight the potential of the wake generator
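
    The control principle — superposing a generated spinning mode on the rotor-stator interaction mode with matched order and amplitude but opposite phase — can be illustrated with a toy sum of two complex mode amplitudes (not the paper's model; values are illustrative):

        import numpy as np

        def residual_spl_db(r, phi):
            """Residual level (dB re the uncontrolled mode) when a control mode of
            amplitude ratio r and relative phase phi is superposed on a unit mode."""
            total = 1.0 + r * np.exp(1j * phi)
            return 20 * np.log10(np.abs(total))

        print(residual_spl_db(0.9, np.pi))        # matched phase: about -20 dB
        print(residual_spl_db(1.0, 0.9 * np.pi))  # phase error limits cancellation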

  6. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    Science.gov (United States)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation essentially is an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and test of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies, and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  7. Computer-aided navigation in dental implantology: 7 years of clinical experience.

    Science.gov (United States)

    Ewers, Rolf; Schicho, Kurt; Truppe, Michael; Seemann, Rudolf; Reichwein, Astrid; Figl, Michael; Wagner, Arne

    2004-03-01

    This long-term study reviews 7 years of research, development, and routine clinical application of computer-aided navigation technology in dental implantology. Benefits and disadvantages of up-to-date technologies are discussed. Over the course of this development, various hardware and software configurations were used. In the initial phase, universally applicable navigation software was adapted for implantology. Since 2001, a special software module for dental implantology has been available. Preoperative planning is performed on the basis of prosthetic aspects and requirements. In routine clinical use, patient and drill positions are intraoperatively registered by means of optoelectronic tracking systems; during preclinical tests, electromagnetic trackers were also used. In 7 years (1995 to 2002), 55 patients with 327 dental implants were successfully positioned with computer-aided navigation technology. The mean number of implants per patient was 6 (minimum, 1; maximum, 11). No complications were observed; the preoperative planning could be realized exactly. The average expenditure of time for the preparation of a surgical intervention with navigation decreased from 2 to 3 days in the initial phase to half a day in routine clinical use with software optimized for dental implantology. The use of computer-aided navigation technology can contribute to considerable quality improvement. Preoperative planning is realized exactly, and intraoperative safety is increased, because damage to nerves or neighboring teeth can be avoided.

  8. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  9. CNRA/CSNI workshop on licensing and operating experience of computer-based I and C systems - Summary and conclusions

    International Nuclear Information System (INIS)

    2002-01-01

    The OECD Workshop on Licensing and Operating Experience of Computer-Based I and C Systems was sponsored by both the Committee on Nuclear Regulatory Activities (CNRA) and the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organised in collaboration with the Czech State Office for Nuclear Safety (SUJB), the Czech Power Board CEZ a.s., I and C Energo a.s. and the Nuclear Research Institute, Rez, near Prague. The objectives of the Workshop were to exchange the experience gained by both regulators and industry in different countries in the licensing and operation of computer-based I and C systems, to discuss the existing differences in licensing approaches in various countries, to consider the safety aspects of their practical use, and to discuss ways of promoting future international co-operation in the given area. The scope of the Workshop included: - review of the progress made since the CNRA/CSNI workshop held in 1996 - current and future regulatory needs and/or requirements for computer-based I and C systems - progress made in software life cycle activities, including verification and validation, and safety/hazards analysis - benefits of applying computer-based I and C systems to improve plant performance and safety. The Technical Sessions and Discussion Sessions covered the following topics: Opening Session: Advances made in the use and planning of computer-based I and C systems; Topic 1: National and international standards and guides for computer-based safety systems; Topic 2: Regulatory aspects; Topic 3: Analysis and assessment of digital I and C systems; Topic 4: Software life cycle activities; Topic 5: Experience with applications, system aspects, potential limits and future trends and needs; Final Session: Workshop summary. The workshop provided a unique opportunity for people with experience in licensing, developing, manufacturing, implementing, maintaining or

  10. Dual-mode computer processing for high resolution DNA thermal denaturation experiments.

    Science.gov (United States)

    Ansevin, A T; Vizard, D L

    1984-02-01

    Two modes of data processing are appropriate in conducting high-resolution thermal denaturation experiments (thermal increments of 0.05 degrees or smaller). In the first mode, a general-purpose microcomputer provides on-line services important to the control and monitoring of the initial experiment, including control of the spectrophotometer and heater, the recording of data, and the display of current hyperchromicities and approximate first derivatives. A subsequent microcomputer program then reads the recorded data files and carries out accurate calculations of derivative denaturation profiles and an estimate of the statistical error of the first derivative at each point. The data collection program handles three samples at a time and was designed to provide optimal results in thermal denaturation experiments with a single-beam spectrophotometer.
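
    As a minimal sketch of the second processing mode — a derivative melting profile dA/dT with a pointwise error estimate, assuming independent absorbance noise of standard deviation sigma_A and central differencing, neither of which the abstract actually specifies — consider:

        import numpy as np

        def derivative_profile(T, A, sigma_A):
            """First-derivative melting profile and its standard error."""
            dAdT = np.gradient(A, T)   # central differences at interior points
            h = np.gradient(T)         # local temperature spacing
            # error of a central difference of two independent readings
            sigma_deriv = np.sqrt(2.0) * sigma_A / (2.0 * h)
            return dAdT, sigma_deriv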

  11. Computational simulation of natural circulation and rewetting experiments using the TRAC/PF1 code

    International Nuclear Information System (INIS)

    Silva, J.D. da.

    1994-05-01

    In this work, the TRAC code was used to simulate natural circulation experiments performed in the first Brazilian integral test facility at COPESP, Sao Paulo, and a rewetting experiment in a single-tube test section carried out at CDTN, Belo Horizonte, Brazil. In the first simulation, the loop behavior was verified in two transient conditions with different thermal powers, namely 20 kW and 120 kW; in the second, the quench front propagation, the liquid mass collected in the carry-over measuring tube, and the wall temperature at different elevations during the flooding experiment were measured. A comparative analysis, for code consistency, shows good agreement between the code results and experimental data, except for the quench front velocity. (author). 15 refs, 19 figs, 12 tabs

  12. ABWR (K-6/7) construction experience (computer-based safety system)

    International Nuclear Information System (INIS)

    Yokomura, T.

    1998-01-01

    TEPCO applied a digital safety system to Kashiwazaki-Kariwa Nuclear Power Station Unit Nos. 6 and 7, the world's first ABWR plants. Although this was the first application of a digital safety logic system in Japan, we were able to complete construction of K-6/7 very successfully and without any delay. TEPCO took the approach of developing substantial experience with digital non-safety systems before undertaking the design of the safety protection system. This paper describes the history, techniques and experience behind achieving a highly reliable digital safety system. (author)

  13. Handling Worldwide LHC Computing Grid Critical Service Incidents : The infrastructure and experience behind nearly 5 years of GGUS ALARMs

    CERN Multimedia

    Dimou, M; Dulov, O; Grein, G

    2013-01-01

    In the Worldwide LHC Computing Grid (WLCG) project, the Tier centres are of paramount importance for storing and accessing experiment data and for running the batch jobs necessary for experiment production activities. Although Tier2 sites provide a significant fraction of the resources, a non-availability of resources at the Tier0 or the Tier1s can seriously harm not only WLCG Operations but also the experiments' workflows and the storage of LHC data, which are very expensive to reproduce. This is why availability requirements for these sites are high and are committed in the WLCG Memorandum of Understanding (MoU). In this talk we describe the workflow of GGUS ALARMs, the only 24/7 mechanism available to LHC experiment experts for reporting problems with their Critical Services to the Tier0 or the Tier1s. Conclusions and experience gained from the detailed drills performed for each such ALARM over the last 4 years are explained, along with the shift over time in the types of problems encountered. The physical infrastructure put in place to ...

  14. Methods of computer experiment in gamma-radiation technologies using new radiation sources

    CERN Document Server

    Bratchenko, M I; Rozhkov, V V

    2001-01-01

    Presented is the methodology of applying computer modelling to the physical substantiation of new irradiation technologies and to the irradiator design workflow. Modelling tasks for irradiation technologies are structured, along with computerized methods for their solution and appropriate types of software. A comparative analysis of available packages for Monte Carlo modelling of electromagnetic processes in media is presented, concerning their application to irradiation-technology problems. The results of code approbation and preliminary data on gamma-radiation absorbed-dose distributions for nuclides of conventional sources and prospective europium-based gamma sources are presented.

  15. Using Application-Domain Knowledge in the Runtime Support of Multi-Experiment Computational Studies

    Science.gov (United States)

    2009-01-01

    Across the fields of science, engineering, and art, the computer user often needs to execute the same software multiple times and only obtains… [The remainder of this record is extraction residue: fragments of a cited report (Technical Report LCC 2004/02, Departamento de Lenguajes y Ciencias de la Computación, Universidad de Málaga, March 2004) and of a broken objective-function equation in the segment lengths l_i = |p_{i+1} - p_i| and inter-segment angles θ_i, which cannot be reconstructed.]

  16. Experience with post-mortem computed tomography in Southern Denmark 2006-11

    DEFF Research Database (Denmark)

    Leth, Peter Mygind

    2013-01-01

    Objectives: (1) To explore the ability of post-mortem computed tomography (PMCT) to establish the cause of death. (2) To investigate the inter-method variation between autopsy and PMCT. (3) To investigate whether PMCT can select cases for autopsy. (4) To investigate the importance of histology. Materials: PMCT and autopsy were performed in 900 forensic cases from Southern Denmark. 4547 diagnoses were registered. Methods: This was a prospective, double-blind investigation. Results: In two-thirds of all cases, PMCT and autopsy agreed on the cause of death. The agreement was highest for injury deaths...

  17. Using Experiment and Computer Modeling to Determine the Off-Axis Magnetic Field of a Solenoid

    Science.gov (United States)

    Lietor-Santos, Juan Jose

    2014-01-01

    The study of the ideal solenoid is a common topic in introductory physics textbooks, and the solenoid is a typical current arrangement in hands-on laboratory experiences where the magnetic field inside it is determined at different currents and at different distances from its center using a magnetic probe. It additionally provides a very simple…
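
    Off the axis there is no simple closed-form expression, but the field can be computed by summing the Biot-Savart contributions of short segments of the winding. A sketch with illustrative coil parameters (not those of the article's apparatus):

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

        def solenoid_field(point, R=0.02, L=0.2, N=200, I=1.0, seg_per_turn=60):
            """Biot-Savart field (T) of an N-turn helix at point (x, y, z); axis = z."""
            t = np.linspace(0.0, 2 * np.pi * N, N * seg_per_turn + 1)
            # helix of radius R and total length L, centred on the origin
            pts = np.stack([R * np.cos(t), R * np.sin(t),
                            L * t / (2 * np.pi * N) - L / 2], axis=1)
            dl = np.diff(pts, axis=0)           # current-segment vectors
            mid = 0.5 * (pts[:-1] + pts[1:])    # segment midpoints
            r = np.asarray(point) - mid         # from each segment to the field point
            rnorm = np.linalg.norm(r, axis=1)
            dB = MU0 * I / (4 * np.pi) * np.cross(dl, r) / rnorm[:, None] ** 3
            return dB.sum(axis=0)

        print(solenoid_field([0.0, 0.0, 0.0]))   # centre: close to mu0 * (N/L) * I
        print(solenoid_field([0.01, 0.0, 0.0]))  # off-axis point inside the bore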

  18. Methods for building an inexpensive computer-controlled olfactometer for temporally-precise experiments

    NARCIS (Netherlands)

    Lundstrom, J.N.; Gordon, A.; Alden, E.C.; Boesveldt, S.; Albrecht, J.

    2010-01-01

    Many human olfactory experiments call for fast and stable stimulus-rise times as well as exact and stable stimulus-onset times. Due to these temporal demands, an olfactometer is often needed. However, an olfactometer is a piece of equipment that either comes with a high price tag or requires a high

  19. A computer-assisted experiment in single-slit diffraction and spatial filtering

    Science.gov (United States)

    Bennett, C. A.

    1990-01-01

    An intermediate-level experiment examines Fraunhofer diffraction and its implications for the image formation of a coherently illuminated object. A commercially available software package is used to provide an environment that allows interactive data acquisition and subsequent numerical and graphical analysis.
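
    For reference, the Fraunhofer single-slit intensity that such software fits is I(θ)/I₀ = sinc²(a sin θ / λ). A minimal numerical sketch, using an illustrative slit width and the He-Ne wavelength rather than the experiment's actual values:

        import numpy as np

        # numpy's sinc is normalized: sinc(x) = sin(pi*x)/(pi*x)
        a, lam = 100e-6, 632.8e-9               # slit width (m), wavelength (m)
        theta = np.linspace(-0.02, 0.02, 2001)  # diffraction angle (rad)
        I = np.sinc(a * np.sin(theta) / lam) ** 2
        print(theta[np.argmax(I)])              # central maximum at theta = 0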

  20. Introduction to focus issue: Mixed mode oscillations: Experiment, computation, and analysis

    DEFF Research Database (Denmark)

    Brøns, Morten; Kaper, T.J.; Rotstein, H.G.

    2008-01-01

    Mixed mode oscillations (MMOs) occur when a dynamical system switches between fast and slow motion and between small and large amplitude. MMOs appear in a variety of systems in nature and may be simple or complex. This focus issue presents a series of articles on theoretical, numerical, and experiment...

  1. Making computers noble. An experiment in automatic analysis of medieval texts

    Directory of Open Access Journals (Sweden)

    Andrea Colli

    2016-02-01

    Full Text Available L’analisi informatica di testi filosofici, la creazione di database, ipertesti o edizioni elettroniche non costituiscono più unicamente una ricerca di frontiera, ma sono da molti anni una risorsa preziosa per gli studi umanistici. Ora, non si tratta di richiedere alle macchine un ulteriore sforzo per comprendere il linguaggio umano, quanto piuttosto di perfezionare gli strumenti affinché esse possano essere a tutti gli effetti collaboratori di ricerca. Questo articolo è concepito come il resoconto di un esperimento finalizzato a documentare come le associazioni lessicali di un gruppo selezionato di testi medievali possa offrire qualche suggerimento in merito ai loro contenuti teorici. Computer analysis of texts, creation of databases hypertexts and digital editions are not the final frontier of research anymore. Quite the contrary, from many years they have been representing a significant contribution to medieval studies. Therefore, we do not mean to make the computer able to grasp the meaning of human language and penetrate its secrets, but rather we aim at improving their tools, so that they will become an even more efficient equipment employed in research activities. This paper is thought as a sort of technical report with the proposed task to verify if an automatic identification of some word associations within a selected groups of medieval writings produces suggestions on the subject of the processed texts, able to be used in a theoretical inquiry.

  2. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention.

    Science.gov (United States)

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2015-09-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exists to assess fidelity; most are derived from observational assessments (Schoenwald and Garland, Psychol Assess 25:146-156, 2013). However, these fidelity measures are resource-intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention, using an innovative mixed-methods approach to fidelity assessment that uses rule-based computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on "joining," which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached 0.83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential to improve the quality of intervention fidelity ratings.
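
    The reliability statistic reported here, Cohen's kappa, corrects raw agreement between two raters for agreement expected by chance. A self-contained sketch (the labels are illustrative, not the study's coding scheme):

        import numpy as np

        def cohens_kappa(rater_a, rater_b):
            """Cohen's kappa for two equal-length label sequences."""
            labels = sorted(set(rater_a) | set(rater_b))
            idx = {lab: i for i, lab in enumerate(labels)}
            cm = np.zeros((len(labels), len(labels)))
            for x, y in zip(rater_a, rater_b):
                cm[idx[x], idx[y]] += 1          # confusion matrix
            n = cm.sum()
            p_obs = np.trace(cm) / n             # observed agreement
            p_exp = (cm.sum(axis=0) @ cm.sum(axis=1)) / n**2  # chance agreement
            return (p_obs - p_exp) / (1 - p_exp)

        print(cohens_kappa(list("hhlhlhhl"), list("hhlhlhll")))

    The same number is returned by sklearn.metrics.cohen_kappa_score for identical inputs.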

  3. A Step Towards A Computing Grid For The LHC Experiments ATLAS Data Challenge 1

    CERN Document Server

    Sturrock, R; Epp, B; Ghete, V M; Kuhn, D; Mello, A G; Caron, B; Vetterli, M C; Karapetian, G V; Martens, K; Agarwal, A; Poffenberger, P R; McPherson, R A; Sobie, R J; Amstrong, S; Benekos, N C; Boisvert, V; Boonekamp, M; Brandt, S; Casado, M P; Elsing, M; Gianotti, F; Goossens, L; Grote, M; Hansen, J B; Mair, K; Nairz, A; Padilla, C; Poppleton, A; Poulard, G; Richter-Was, Elzbieta; Rosati, S; Schörner-Sadenius, T; Wengler, T; Xu, G F; Ping, J L; Chudoba, J; Kosina, J; Lokajícek, M; Svec, J; Tas, P; Hansen, J R; Lytken, E; Nielsen, J L; Wäänänen, A; Tapprogge, Stefan; Calvet, D; Albrand, S; Collot, J; Fulachier, J; Ledroit-Guillon, F; Ohlsson-Malek, F; Viret, S; Wielers, M; Bernardet, K; Corréard, S; Rozanov, A; De Vivie de Régie, J B; Arnault, C; Bourdarios, C; Hrivnác, J; Lechowski, M; Parrour, G; Perus, A; Rousseau, D; Schaffer, A; Unal, G; Derue, F; Chevalier, L; Hassani, S; Laporte, J F; Nicolaidou, R; Pomarède, D; Virchaux, M; Nesvadba, N; Baranov, S; Putzer, A; Khonich, A; Duckeck, G; Schieferdecker, P; Kiryunin, A E; Schieck, J; Lagouri, T; Duchovni, E; Levinson, L; Schrager, D; Negri, G; Bilokon, H; Spogli, L; Barberis, D; Parodi, F; Cataldi, G; Gorini, E; Primavera, M; Spagnolo, S; Cavalli, D; Heldmann, M; Lari, T; Perini, L; Rebatto, D; Resconi, S; Tatarelli, F; Vaccarossa, L; Biglietti, M; Carlino, G; Conventi, F; Doria, A; Merola, L; Polesello, G; Vercesi, V; De Salvo, A; Di Mattia, A; Luminari, L; Nisati, A; Reale, M; Testa, M; Farilla, A; Verducci, M; Cobal, M; Santi, L; Hasegawa, Y; Ishino, M; Mashimo, T; Matsumoto, H; Sakamoto, H; Tanaka, J; Ueda, I; Bentvelsen, Stanislaus Cornelius Maria; Fornaini, A; Gorfine, G; Groep, D; Templon, J; Köster, L J; Konstantinov, A; Myklebust, T; Ould-Saada, F; Bold, T; Kaczmarska, A; Malecki, P; Szymocha, T; Turala, M; Kulchitskii, Yu A; Khoreauli, G; Gromova, N; Tsulaia, V; Minaenko, A A; Rudenko, R; Slabospitskaya, E; Solodkov, A; Gavrilenko, I; Nikitine, N; Sivoklokov, S Yu; Toms, K; Zalite, A; Zalite, Yu; Kervesan, B; Bosman, M; González, S; Sánchez, J; Salt, J; Andersson, N; Nixon, L; Eerola, Paule Anna Mari; Kónya, B; Smirnova, O G; Sandgren, A; Ekelöf, T J C; Ellert, M; Gollub, N; Hellman, S; Lipniacka, A; Corso-Radu, A; Pérez-Réale, V; Lee, S C; CLin, S C; Ren, Z L; Teng, P K; Faulkner, P J W; O'Neale, S W; Watson, A; Brochu, F; Lester, C; Thompson, S; Kennedy, J; Bouhova-Thacker, E; Henderson, R; Jones, R; Kartvelishvili, V G; Smizanska, M; Washbrook, A J; Drohan, J; Konstantinidis, N P; Moyse, E; Salih, S; Loken, J; Baines, J T M; Candlin, D; Candlin, R; Clifft, R; Li, W; McCubbin, N A; George, S; Lowe, A; Buttar, C; Dawson, I; Moraes, A; Tovey, Daniel R; Gieraltowski, J; Malon, D; May, E; LeCompte, T J; Vaniachine, A; Adams, D L; Assamagan, Ketevi A; Baker, R; Deng, W; Fine, V; Fisyak, Yu; Gibbard, B; Ma, H; Nevski, P; Paige, F; Rajagopalan, S; Smith, J; Undrus, A; Wenaus, T; Yu, D; Calafiura, P; Canon, S; Costanzo, D; Hinchliffe, Ian; Lavrijsen, W; Leggett, C; Marino, M; Quarrie, D R; Sakrejda, I; Stravopoulos, G; Tull, C; Loch, P; Youssef, S; Shank, J T; Engh, D; Frank, E; Sen-Gupta, A; Gardner, R; Meritt, F; Smirnov, Y; Huth, J; Grundhoefer, L; Luehring, F C; Goldfarb, S; Severini, H; Skubic, P L; Gao, Y; Ryan, T; De, K; Sosebee, M; McGuigan, P; Ozturk, N

    2004-01-01

    The ATLAS Collaboration at CERN is preparing for the data taking and analysis at the LHC that will start in 2007. Therefore, a series of Data Challenges was started in 2002 whose goals are the validation of the Computing Model, of the complete software suite, of the data model, and to ensure the correctness of the technical choices to be made for the final offline computing environment. A major feature of the first Data Challenge (DC1) was the preparation and the deployment of the software required for the production of large event samples as a worldwide distributed activity. It should be noted that it was not an option to "run the complete production at CERN" even if we had wanted to; the resources were not available at CERN to carry out the production on a reasonable time-scale. The great challenge of organising and carrying out this large-scale production at a significant number of sites around the world had therefore to be faced. However, the benefits of this are manifold: apart from realising the require...

  4. Combining computational models, semantic annotations and simulation experiments in a graph database

    Science.gov (United States)

    Henkel, Ron; Wolkenhauer, Olaf; Waltemath, Dagmar

    2015-01-01

    Model repositories such as the BioModels Database, the CellML Model Repository or JWS Online are frequently accessed to retrieve computational models of biological systems. However, their storage concepts support only restricted types of queries, and not all data inside the repositories can be retrieved. In this article we present a storage concept that meets this challenge. It is grounded on a graph database, reflects the models' structure, incorporates semantic annotations and simulation descriptions, and ultimately connects different types of model-related data. The connections between heterogeneous model-related data and bio-ontologies enable efficient search via biological facts and grant access to new model features. The introduced concept notably improves the access of computational models and associated simulations in a model repository. This has positive effects on tasks such as model search, retrieval, ranking, matching and filtering. Furthermore, our work for the first time enables CellML- and Systems Biology Markup Language-encoded models to be effectively maintained in one database. We show how these models can be linked via annotations and queried. Database URL: https://sems.uni-rostock.de/projects/masymos/ PMID:25754863
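
    In a graph store of this kind, a search via biological facts amounts to a graph pattern match. A hypothetical sketch using the Neo4j Python driver — the node labels, relationship type, and property names below are invented for illustration and are not the schema of the database described here:

        from neo4j import GraphDatabase

        driver = GraphDatabase.driver("bolt://localhost:7687",
                                      auth=("neo4j", "password"))
        # find models annotated with a given ontology term (illustrative schema)
        query = (
            "MATCH (m:Model)-[:HAS_ANNOTATION]->(a:Annotation {term: $term}) "
            "RETURN m.name AS name"
        )
        with driver.session() as session:
            for record in session.run(query, term="glycolysis"):
                print(record["name"])
        driver.close()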

  5. The journey of a sandwich: computer-based laboratory experiments about the human digestive system in high school biology teaching.

    Science.gov (United States)

    Sorgo, Andrej; Hajdinjak, Zdravka; Briski, Darko

    2008-03-01

    Teaching high school students about the digestive system can be a challenge for a teacher when s/he wants to overcome rote learning of facts without a deeper understanding of the physiological processes inside the alimentary tract. A series of model experiments illustrating the journey of a sandwich was introduced into teaching high school biology. Using a computer equipped with a commercially available data-acquisition system and a couple of sensors, it was possible to illustrate the basic underlying physical and chemical principles of digestion to the students. Students were able to investigate, through hands-on activities, the chewing force of the jaws, importance of the mechanical breakdown of food, enzymatic activity of pepsin and amylase, antibacterial activity of hydrochloric acid, and importance of the villi for absorption. Students found the experiments interesting and helpful for understanding the digestive process. Furthermore, the results from testing indicated that the students had a deeper understanding of the physiological processes.

  6. Using High-Fidelity Computational Fluid Dynamics to Help Design a Wind Turbine Wake Measurement Experiment

    International Nuclear Information System (INIS)

    Churchfield, M; Wang, Q; Scholbrock, A; Herges, T; Mikkelsen, T; Sjöholm, M

    2016-01-01

    We describe the process of using large-eddy simulations of wind turbine wake flow to help design a wake measurement campaign. The main goal of the experiment is to measure wakes and wake deflection that result from intentional yaw misalignment under a variety of atmospheric conditions at the Scaled Wind Farm Technology facility operated by Sandia National Laboratories in Lubbock, Texas. Prior simulation studies have shown that wake deflection may be used for wind-plant control that maximizes plant power output. In this study, simulations are performed to characterize wake deflection and general behavior before the experiment is performed to ensure better upfront planning. Beyond characterizing the expected wake behavior, we also use the large-eddy simulation to test a virtual version of the lidar we plan to use to measure the wake and better understand our lidar scan strategy options. This work is an excellent example of a “simulation-in-the-loop” measurement campaign. (paper)

  7. Planar hydrodynamic instability computations and experiments with rugby-shaped hohlraums at the Omega laser

    International Nuclear Information System (INIS)

    Vandenboomgaerde, M; Liberatore, S; Galmiche, D; Casner, A; Huser, G; Jadaud, J P; Villette, B

    2008-01-01

    The implosion of an inertial confinement fusion (ICF) capsule is very sensitive to the growth of perturbations to its sphericity. Controlling the feeding of such perturbations and their transport ('feedthrough') through the ablator is a key point in reaching ignition. Since 2002, experiments have been designed and performed on the Omega laser facility in order to study these phenomena in planar geometry. A new 'rugby-shaped' hohlraum was used. We present experimental results and comparisons with numerical simulations.

  8. Planar hydrodynamic instability computations and experiments with rugby-shaped hohlraums at the Omega laser

    Science.gov (United States)

    Vandenboomgaerde, M.; Liberatore, S.; Galmiche, D.; Casner, A.; Huser, G.; Jadaud, J. P.; Villette, B.

    2008-05-01

    The implosion of an inertial confinement fusion (ICF) capsule is very sensitive to the growth of perturbations to its sphericity. Controlling the feeding of such perturbations and their transport ('feedthrough') through the ablator is a key point in reaching ignition. Since 2002 [1, 2], experiments have been designed and performed on the Omega laser facility in order to study these phenomena in planar geometry. A new 'rugby-shaped' hohlraum was used [3, 4]. We present experimental results and comparisons with numerical simulations.

  9. MarsSedEx III: linking Computational Fluid Dynamics (CFD) and reduced gravity experiments

    Science.gov (United States)

    Kuhn, N. J.; Kuhn, B.; Gartmann, A.

    2015-12-01

    Experiments conducted during the MarsSedEx I and II reduced-gravity campaigns showed that using empirical models of sediment transport developed for Earth on Mars violates the underlying fluid dynamics. The error is caused by the interaction between running water and sediment particles, which affect each other in a positive feedback loop. As a consequence, the actual flow conditions around a particle cannot be represented by drag coefficients derived on Earth. This study examines the implications of such gravity effects on sediment movement on Mars, with special emphasis on the limits of sandstones and conglomerates formed on Earth as analogues for sedimentation on Mars. Furthermore, options for correcting the errors using a combination of CFD and recent experiments conducted during the MarsSedEx III campaign are presented.
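
    The gravity dependence at issue can be illustrated with a simple terminal-velocity calculation: gravity enters both the force balance and, through the Reynolds number, the drag coefficient itself. The sketch below is a generic illustration using the Schiller-Naumann drag correlation; all parameter values are assumptions for illustration, not MarsSedEx data.

        # Terminal settling velocity of a grain in water under Earth vs Mars
        # gravity. Schiller-Naumann drag (Re < ~800); values are illustrative.
        def settling_velocity(g, d=2e-4, rho_p=2650.0, rho_f=1000.0, mu=1e-3):
            """Fixed-point iteration on the force balance
            (pi/6) d^3 (rho_p - rho_f) g = 0.5 rho_f Cd (pi/4) d^2 v^2."""
            v = 1e-3  # initial guess, m/s
            for _ in range(200):
                re = rho_f * v * d / mu                    # particle Reynolds number
                cd = 24.0 / re * (1.0 + 0.15 * re**0.687)  # Schiller-Naumann
                v = (4.0 * (rho_p - rho_f) * g * d / (3.0 * cd * rho_f)) ** 0.5
            return v

        print("Earth:", settling_velocity(9.81))  # ~2-3 cm/s for a 200-micron grain
        print("Mars :", settling_velocity(3.71))  # slower, at a different Reynolds number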

  10. Introduction to Focus Issue: Rhythms and Dynamic Transitions in Neurological Disease: Modeling, Computation, and Experiment

    International Nuclear Information System (INIS)

    Kaper, Tasso J.; Kramer, Mark A.; Rotstein, Horacio G.

    2013-01-01

    Rhythmic neuronal oscillations across a broad range of frequencies, as well as spatiotemporal phenomena such as waves and bumps, have been observed in various areas of the brain and proposed as critical to brain function. While there is a long and distinguished history of studying rhythms in nerve cells and neuronal networks in healthy organisms, the association of rhythms with disease, and their analysis, are more recent developments. Indeed, it is now thought that certain aspects of diseases of the nervous system, such as epilepsy, schizophrenia, Parkinson's, and sleep disorders, are associated with transitions or disruptions of neurological rhythms. This focus issue brings together articles presenting modeling, computational, analytical, and experimental perspectives on rhythms and the dynamic transitions between them that are associated with various diseases.

  11. Automatic data-acquisition and communications computer network for fusion experiments

    International Nuclear Information System (INIS)

    Kemper, C.O.

    1981-01-01

    A network of more than twenty computers serves the data acquisition, archiving, and analysis requirements of the ISX, EBT, and beam-line test facilities at the Fusion Division of Oak Ridge National Laboratory. The network includes PDP-8, PDP-12, PDP-11, PDP-10, and Interdata 8-32 processors, and is unified by a variety of high-speed serial and parallel communications channels. While some processors are dedicated to experimental data acquisition, and others to later analysis and theoretical work, many perform a combination of acquisition, real-time analysis and display, and archiving and communications functions. A network software system has been developed which runs in each processor and automatically transports data files from the point of acquisition to the point or points of analysis, display, and storage, providing conversion and formatting functions as required.

  12. Introduction to Focus Issue: Rhythms and Dynamic Transitions in Neurological Disease: Modeling, Computation, and Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kaper, Tasso J., E-mail: tasso@bu.edu; Kramer, Mark A., E-mail: mak@bu.edu [Department of Mathematics and Statistics, Boston University, Boston, Massachusetts 02215 (United States); Rotstein, Horacio G., E-mail: horacio@njit.edu [Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, New Jersey 07102 (United States)

    2013-12-15

    Rhythmic neuronal oscillations across a broad range of frequencies, as well as spatiotemporal phenomena such as waves and bumps, have been observed in various areas of the brain and proposed as critical to brain function. While there is a long and distinguished history of studying rhythms in nerve cells and neuronal networks in healthy organisms, the association of rhythms with disease, and their analysis, are more recent developments. Indeed, it is now thought that certain aspects of diseases of the nervous system, such as epilepsy, schizophrenia, Parkinson's, and sleep disorders, are associated with transitions or disruptions of neurological rhythms. This focus issue brings together articles presenting modeling, computational, analytical, and experimental perspectives on rhythms and the dynamic transitions between them that are associated with various diseases.

  13. Introduction to Focus Issue: Rhythms and Dynamic Transitions in Neurological Disease: Modeling, Computation, and Experiment

    Science.gov (United States)

    Kaper, Tasso J.; Kramer, Mark A.; Rotstein, Horacio G.

    2013-12-01

    Rhythmic neuronal oscillations across a broad range of frequencies, as well as spatiotemporal phenomena such as waves and bumps, have been observed in various areas of the brain and proposed as critical to brain function. While there is a long and distinguished history of studying rhythms in nerve cells and neuronal networks in healthy organisms, the association of rhythms with disease, and their analysis, are more recent developments. Indeed, it is now thought that certain aspects of diseases of the nervous system, such as epilepsy, schizophrenia, Parkinson's, and sleep disorders, are associated with transitions or disruptions of neurological rhythms. This focus issue brings together articles presenting modeling, computational, analytical, and experimental perspectives on rhythms and the dynamic transitions between them that are associated with various diseases.

  14. Operating experience with LEAP from the perspective of the computing applications analyst

    Energy Technology Data Exchange (ETDEWEB)

    Ford, W.E. III; Horwedel, J.E.; McAdoo, J.W.; Alsmiller, R.G. Jr.; Toney, B.C.

    1981-05-01

    The Long-Term Energy Analysis Program (LEAP) is discussed. LEAP was used for the energy price-quantity projections in the 1978 Annual Report to Congress (ARC '78) and in an ORNL research program to develop and demonstrate a procedure for evaluating energy-economic modeling computer codes and the important results derived from them. The LEAP system used in the ORNL research, the mechanics of executing LEAP, and the personnel skills required to execute the system are described. In addition, a LEAP sample problem, subroutine hierarchical flowcharts, and input tables for the ARC '78 energy-economic model are included. Results of a study to test the capability of the LEAP system used in the ORNL research to reproduce the ARC '78 results credited to LEAP are presented.

  15. Abdominal computed tomography during pregnancy for suspected appendicitis: a 5-year experience at a maternity hospital.

    Science.gov (United States)

    Shetty, Mahesh K; Garrett, Nan M; Carpenter, Wendy S; Shah, Yogesh P; Roberts, Candace

    2010-02-01

    The objective of this article is to evaluate the role of computed tomography (CT) in pregnant patients with right lower quadrant pain in whom there is a clinical suspicion of acute appendicitis. The clinical records of all pregnant women who underwent imaging for clinically suspected appendicitis during a 5-year period were reviewed, and the imaging findings were correlated with patient management and final outcome. Thirty-nine pregnant patients were referred for imaging: 35 underwent initial evaluation with sonography, 23 of these women subsequently underwent a computed tomographic examination, and an additional 4 patients were imaged directly with CT without earlier sonographic assessment. Surgery confirmed appendicitis in all 5 patients who were operated on on the basis of CT findings of appendicitis. Two patients underwent surgery based on an alternate diagnosis suggested preoperatively (tubal torsion = 1, ovarian torsion = 1). All patients with negative CT findings had an uneventful clinical course. Among the patients evaluated only with ultrasound, a diagnosis of appendicitis was missed in 5. The sensitivity of CT in the diagnosis of appendicitis in our study group was 100%, compared with a sensitivity of 46.1% for ultrasound. CT provides an accurate diagnosis in patients suspected to have acute appendicitis and is of value in avoiding false-negative exploratory laparotomy, with its consequent risk of maternal and fetal morbidity and mortality. Although sonography is the preferred initial imaging modality because of its lack of ionizing radiation, CT is more accurate in providing a timely diagnosis, and its use is justified to reduce maternal morbidity and mortality in patients with appendicitis.
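
    For reference, the quoted figures are instances of the standard definition sensitivity = TP / (TP + FN). A minimal sketch follows; the counts below are hypothetical placeholders, since the record does not report the full contingency table.

        # Sensitivity/specificity from 2x2 counts; numbers are made up for
        # illustration, not the study's data.
        def sensitivity(tp, fn):
            return tp / (tp + fn)

        def specificity(tn, fp):
            return tn / (tn + fp)

        # A modality that detects 6 of 13 true cases:
        print(f"sensitivity = {sensitivity(6, 7):.1%}")  # 46.2%, near the 46.1% quoted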

  16. The effects of computer assisted physics experiment simulations on students' learning

    Directory of Open Access Journals (Sweden)

    Turhan Civelek

    2013-11-01

    Full Text Available The main goal of this study is to present the significant difference between lectures supported by simulations of physics experiments and traditional physics lectures. Two groups of 115 students each were selected for the purpose of the study. The same subjects were taught to both groups: for a month, one group of 115 had their science and technology lectures supported by physics experiment simulations, while the other group of 115 had their lectures in the traditional way. The research was conducted at Izzet Unver high school in Gungoren, Istanbul. The main resource of this research is the data collected through surveys. The survey is a result of the literature and the suggestions of experts on the topic. Thirty questions were prepared under ten topics. Two different surveys were conducted during data collection. While the first survey's questions focused on the effects of traditional lecturing on students, the second survey's questions targeted the effects of lecturing supported by physics experiment simulations. The data collected from the surveys were coded into SPSS software, and statistical analyses were conducted. A t-test was used to test the significance of differences between the means; 0.05 was chosen as the significance level. As a result of the analyses, significant differences were found in students' satisfaction with class materials, their motivation, their learning speed, their interest in the class, and their contribution to the class. In findings such as the effect on students' learning, information availability, organization of information, students' integration into the class and gaining different points of view, "lectures supported by physics experiment simulations" differ significantly from traditional lecturing. As the result of the literature review and the statistical analyses, "lectures supported by physics experiment simulations" seem to
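
    The reported analysis amounts to an independent two-sample t-test at significance level 0.05. A minimal sketch of the same test follows; the scores are made-up placeholders, since the study's actual data were analysed in SPSS.

        # Independent two-sample t-test at alpha = 0.05; data are hypothetical.
        from scipy import stats

        simulation_group = [4.2, 4.5, 3.9, 4.8, 4.1]   # e.g. survey scores
        traditional_group = [3.1, 3.4, 3.0, 3.8, 3.2]

        t_stat, p_value = stats.ttest_ind(simulation_group, traditional_group)
        alpha = 0.05
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant: {p_value < alpha}")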

  17. New strategies of the LHC experiments to meet the computing requirements of the HL-LHC era

    CERN Document Server

    Adamova, Dagmar

    2017-01-01

    The performance of the Large Hadron Collider (LHC) during the ongoing Run 2 is above expectations, concerning both the delivered luminosity and the LHC live time. This has resulted in a volume of data much larger than originally anticipated. Based on current data production levels and the structure of the LHC experiments' computing models, the estimates of data production rates and resource needs were re-evaluated for the era leading into the High Luminosity LHC (HL-LHC), the Run 3 and Run 4 phases of LHC operation. It turns out that the raw data volume will grow 10 times by the HL-LHC era and the processing capacity needs will grow more than 60 times. While the growth of storage requirements might in principle be satisfied with a 20 per cent budget increase and technology advancements, there is a gap of a factor 6 to 10 between the needed and available computing resources. The threat of a lack of computing and storage resources was present already in the beginning of Run 2, but could still be mitigated, e....
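
    A back-of-envelope reading of these factors is sketched below; treating the 20 per cent as an annual compounding rate over a 10-year horizon is an assumption for illustration only.

        # Compounding ~20%/yr capacity growth over a decade gives ~6x,
        # against ~60x growth in processing needs: roughly the factor
        # 6-10 gap quoted above. Rate and horizon are illustrative.
        years = 10
        available = 1.20 ** years            # ~6.2x
        needed_storage, needed_cpu = 10, 60  # aggregate factors from the record
        print(f"available growth: ~{available:.1f}x")
        print(f"storage gap: {needed_storage / available:.1f}x")
        print(f"cpu gap:     {needed_cpu / available:.1f}x")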

  18. Effect of reader experience on variability, evaluation time and accuracy of coronary plaque detection with computed tomography coronary angiography

    Energy Technology Data Exchange (ETDEWEB)

    Saur, Stefan C.; Szekely, Gabor [ETH Zurich, Computer Vision Laboratory, Zurich (Switzerland); Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Massachusetts General Hospital and Harvard Medical School, Cardiac MR PET CT Group, Boston, MA (United States); Stolzmann, Paul; Baumueller, Stephan; Leschka, Sebastian; Scheffel, Hans; Desbiolles, Lotus [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Fuchs, Thomas J. [ETH Zurich, Department of Computer Science, Zurich (Switzerland); Cattin, Philippe C. [ETH Zurich, Computer Vision Laboratory, Zurich (Switzerland); University of Basel, Medical Image Analysis Center, Basel (Switzerland)

    2010-07-15

    To assess the effect of reader experience on variability, evaluation time and accuracy in the detection of coronary artery plaques with computed tomography coronary angiography (CTCA). Three independent, blinded readers with three different experience levels twice labelled 50 retrospectively electrocardiography (ECG)-gated contrast-enhanced dual-source CTCA data sets (15 female, age 67.3 ± 10.4 years, range 46-86 years), indicating the presence or absence of coronary plaques. The evaluation times for the readings were recorded. Intra- and interobserver variability expressed as κ statistics, and sensitivity, specificity, and negative and positive predictive values, were calculated for plaque detection, with a consensus reading of the three readers taken as the standard of reference. A bootstrap method was applied in the statistical analysis to account for clustering. Significant correlations were found between reader experience and, respectively, evaluation times (r = -0.59, p < 0.05) and intraobserver variability (r = 0.73, p < 0.05). The evaluation time differed significantly among the readers (p < 0.05). The observer variability for plaque detection, compared with the consensus, varied between κ = 0.582 and κ = 0.802. Plaque detection was significantly less variable (p < 0.05) and more accurate (p < 0.05) for the most experienced reader. Reader experience significantly correlated with observer variability, evaluation time and accuracy of coronary plaque detection at CTCA. (orig.)
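
    The agreement statistic used here is Cohen's kappa. A minimal sketch for two readings of per-segment plaque presence follows; the labels are hypothetical, not study data.

        # Cohen's kappa for intraobserver agreement; 1 = plaque present.
        from sklearn.metrics import cohen_kappa_score

        reading_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
        reading_2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

        print(f"kappa = {cohen_kappa_score(reading_1, reading_2):.3f}")  # 0.800 here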

  19. Teaching and new technologies: change of time and teaching experience by the use of computer and internet

    Directory of Open Access Journals (Sweden)

    Domenica Martinez

    2017-05-01

    Full Text Available This article aimed to answer how the use of computers and the Internet has, directly and indirectly, affected the time of teaching work, and how these tools have determined the experience of teachers. The sources for the research consisted of doctoral and master's theses, as well as official documents referring to these concepts and instruments. The survey instrument was developed as a form, using software for the collection and analysis of the data; for the assessment of the results we used the content-analysis technique. The results show the relationship between time and experience under the different cultural conventions that permeate the school, consolidating an awareness shaped by contradictions sustained by the potential of new technologies, in a condition that atrophies the experience to which the teacher's training process and work tend to be reduced. The method draws on critical theory, especially the concepts of experience indicated by Adorno and Benjamin, the ideology of technological rationality and industrial society from Marcuse, and key concepts of dialectical materialism, such as alienated labor, commodity and machinery, from Marx.

  20. Multifrequency and edge breathers in the discrete sine-Gordon system via subharmonic driving: Theory, computation and experiment

    Energy Technology Data Exchange (ETDEWEB)

    Palmero, F. [Grupo de Física No Lineal, Departamento de Física Aplicada I, ETSI Informática, Universidad de Sevilla, Avda. Reina Mercedes, s/n, 41012 Sevilla (Spain); Department of Physics and Astronomy, Dickinson College, Carlisle, PA 17013 (United States); Han, J. [Department of Physics and Astronomy, Dickinson College, Carlisle, PA 17013 (United States); English, L.Q., E-mail: englishl@dickinson.edu [Department of Physics and Astronomy, Dickinson College, Carlisle, PA 17013 (United States); Alexander, T.J. [School of Physical, Environmental and Mathematical Sciences, UNSW Canberra, 2610 (Australia); Kevrekidis, P.G. [Department of Mathematics and Statistics, University of Massachusetts, Amherst, MA 01003-4515 (United States); Center for Nonlinear Studies and Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87544 (United States)

    2016-01-28

    We consider a chain of torsionally-coupled, planar pendula shaken horizontally by an external sinusoidal driver. It has been known that in such a system, theoretically modeled by the discrete sine-Gordon equation, intrinsic localized modes, also known as discrete breathers, can exist. Recently, the existence of multifrequency breathers via subharmonic driving has been theoretically proposed and numerically illustrated by Xu et al. (2014) [21]. In this paper, we verify this prediction experimentally. Comparison of the experimental results to numerical simulations with realistic system parameters (including a Floquet stability analysis), and wherever possible to analytical results (e.g. for the subharmonic response of the single driven–damped pendulum), yields good agreement. Finally, we report the period-1 and multifrequency edge breathers which are localized at the open boundaries of the chain, for which we have again found good agreement between experiments and numerical computations. - Highlights: • We have confirmed experimentally the existence of subharmonic ILM/breather structures in a chain of coupled torsion pendula. • Experiments are in line with our theoretical analysis and numerical computations. • We have also revealed surface breather modes in this chain, both experimentally and numerically.
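
    For orientation, a standard driven-damped form of the discrete sine-Gordon model for such a chain is written below. This is a generic textbook form under assumed conventions, not necessarily the paper's exact normalization; for horizontal shaking the drive in fact couples to the pendulum angle, e.g. as F cos(ωt) cos θ_n rather than acting as a pure torque.

        \ddot{\theta}_n + \gamma\,\dot{\theta}_n + \sin\theta_n
            = C\,(\theta_{n+1} - 2\theta_n + \theta_{n-1}) + F\cos(\omega t)

    Here θ_n is the angular displacement of the n-th pendulum, γ the damping coefficient, C the torsional coupling, and F, ω the drive amplitude and frequency; a subharmonically driven breather responds at a rational fraction of ω.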