WorldWideScience

Sample records for previous computer experience

  1. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

    Full Text Available Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that, despite many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potential of cloud computing and its implementation on the market. The purpose of this research was to identify the individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify differences in opinion regarding the importance of business model factors for cloud computing adoption according to companies’ previous experiences with cloud computing services.

  2. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasing attention due to their growing complexity and usability. Software... packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs... and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...
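
    As background for what an "efficient experimental design" for an expensive computer model can look like, the sketch below builds a generic Latin hypercube design in Python. It is only an illustration of the idea, not a design taken from the thesis; the function and parameter names are our own.

    import numpy as np

    def latin_hypercube(n_runs, n_factors, rng=None):
        """Generic Latin hypercube design on [0, 1]^n_factors.

        Each factor's range is cut into n_runs equal strata and every stratum
        is sampled exactly once, so even a few runs cover the input space well.
        Illustrative only; not the specific designs used in the thesis papers.
        """
        rng = np.random.default_rng(rng)
        design = np.empty((n_runs, n_factors))
        for j in range(n_factors):
            # One random point inside each stratum, in shuffled order
            design[:, j] = (rng.permutation(n_runs) + rng.random(n_runs)) / n_runs
        return design

    # Example: 10 runs of a hypothetical 3-input simulation model
    X = latin_hypercube(10, 3, rng=42)
    print(X.shape)  # (10, 3); each column covers all ten strata of [0, 1]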

  3. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  4. Specific Previous Experience Affects Perception of Harmony and Meter

    Science.gov (United States)

    Creel, Sarah C.

    2011-01-01

    Prior knowledge shapes our experiences, but which prior knowledge shapes which experiences? This question is addressed in the domain of music perception. Three experiments were used to determine whether listeners activate specific musical memories during music listening. Each experiment provided listeners with one of two musical contexts that was…

  5. Robotic colorectal surgery: previous laparoscopic colorectal experience is not essential.

    Science.gov (United States)

    Sian, Tanvir Singh; Tierney, G M; Park, H; Lund, J N; Speake, W J; Hurst, N G; Al Chalabi, H; Smith, K J; Tou, S

    2018-06-01

    A background in minimally invasive colorectal surgery (MICS) has been thought to be essential prior to robotic-assisted colorectal surgery (RACS). Our aim was to determine whether MICS is essential prior to starting RACS training based on results from our initial experience with RACS. Two surgeons from our centre received robotic training through the European Academy of Robotic Colorectal Surgery (EARCS). One surgeon had no prior formal MICS training. We reviewed the first 30 consecutive robotic colorectal procedures from a prospectively maintained database between November 2014 and January 2016 at our institution. Fourteen patients were male. Median age was 64.5 years (range 36-82) and BMI was 27.5 (range 20-32.5). Twelve procedures (40%) were performed by the non-MICS-trained surgeon: ten high anterior resections (one conversion), one low anterior resection and one abdomino-perineal resection of rectum (APER). The MICS-trained surgeon performed nine high and four low anterior resections, one APER and in addition three right hemicolectomies and one abdominal suture rectopexy. There were no intra-operative complications and two patients required re-operation. Median post-operative stay was five days (range 1-26). There were two 30-day re-admissions. All oncological resections had clear margins and median node harvest was 18 (range 9-39). Our case series demonstrates that a background in MICS is not essential prior to starting RACS training. Not having prior MICS training should not discourage surgeons from considering applying for a robotic training programme. Safe and successful robotic colorectal services can be established after completing a formal structured robotic training programme.

  6. COMPUTER CONTROL OF BEHAVIORAL EXPERIMENTS.

    Science.gov (United States)

    SIEGEL, LOUIS

    The LINC computer provides a particular schedule of reinforcement for behavioral experiments by executing a sequence of computer operations in conjunction with a specially designed interface. The interface is the means of communication between the experimental chamber and the computer. The program and interface of an experiment involving a pigeon…

  7. Pharmacology Experiments on the Computer.

    Science.gov (United States)

    Keller, Daniel

    1990-01-01

    A computer program that replaces a set of pharmacology and physiology laboratory experiments on live animals or isolated organs is described and illustrated. Five experiments are simulated: dose-effect relationships on smooth muscle, blood pressure and catecholamines, neuromuscular signal transmission, acetylcholine and the circulation, and…

  8. Computer loss experience and predictions

    Science.gov (United States)

    Parker, Donn B.

    1996-03-01

    The types of losses organizations must anticipate have become more difficult to predict because of the eclectic nature of computers and data communications and the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized, intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause. On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code secretly inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of

  9. Computing for an SSC experiment

    International Nuclear Information System (INIS)

    Gaines, I.

    1993-01-01

    The hardware and software problems for SSC experiments are similar to those faced by present day experiments but larger in scale. In particular, the Solenoidal Detector Collaboration (SDC) anticipates the need for close to 10**6 MIPS of off-line computing and will produce several Petabytes (10**15 bytes) of data per year. Software contributions will be made by large numbers of highly geographically dispersed physicists. Hardware and software architectures to meet these needs have been designed. Providing the requisite amount of computing power and providing tools to allow cooperative software development using extensions of existing techniques look achievable. The major challenges will be to provide efficient methods of accessing and manipulating the enormous quantities of data that will be produced at the SSC, and to enforce the use of software engineering tools that will ensure the "correctness" of experiment-critical software.

  10. Influence of previous experience on resistance training on reliability of one-repetition maximum test.

    Science.gov (United States)

    Ritti-Dias, Raphael Mendes; Avelar, Ademar; Salvador, Emanuel Péricles; Cyrino, Edilson Serpeloni

    2011-05-01

    The 1-repetition maximum test (1RM) has been widely used to assess maximal strength. However, to improve accuracy in assessing maximal strength, several sessions of the 1RM test are recommended. The aim of this study was to analyze the influence of previous resistance training experience on the reliability of the 1RM test. Thirty men were assigned to the following 2 groups according to their previous resistance training experience: no previous resistance training experience (NOEXP) and more than 24 months of resistance training experience (EXP). All subjects performed the 1RM tests in bench press and squat in 4 sessions on distinct days. There was a significant session × group effect in bench press (F = 3.09). The reliability of the 1RM test is influenced by the subject's previous experience in resistance training. Subjects without experience in resistance training require more practice and familiarization and show greater increases in maximal strength between sessions than subjects with previous experience in resistance training.

  11. Analysis of previous perceptual and motor experience in breaststroke kick learning

    Directory of Open Access Journals (Sweden)

    Ried Bettina

    2015-12-01

    Full Text Available One of the variables that influence motor learning is the learner’s previous experience, which may provide perceptual and motor elements to be transferred to a novel motor skill. For swimming skills, several motor experiences may prove effective. Purpose. The aim was to analyse the influence of previous experience in playing in water, swimming lessons, and music or dance lessons on learning the breaststroke kick. Methods. The study involved 39 Physical Education students possessing basic swimming skills, but not the breaststroke, who performed 400 acquisition trials followed by 50 retention and 50 transfer trials, during which the stroke index as well as rhythmic and spatial configuration indices were mapped, and answered a yes/no questionnaire regarding previous experience. Data were analysed by ANOVA (p = 0.05) and effect size (Cohen’s d, with d ≥ 0.8 indicating a large effect). Results. The whole sample improved their stroke index and spatial configuration index, but not their rhythmic configuration index. Although differences between groups were not significant, two types of experience showed large practical effects on learning: experience of playing in water during childhood showed practically relevant positive effects, whereas having no experience in any of the three fields hampered the learning process. Conclusions. The results point towards a diverse impact on learning the breaststroke kick of previous experience with rhythmic activities, swimming lessons, and especially playing in water during childhood.

  12. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

    Full Text Available Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing, 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and the chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been studied well. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had more severe (p=0.045) and longer history (p=0.008) of dyspnea, more exacerbations in the previous year (p=0.011), and more positive cultures of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, higher severity score (p=0.028), higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  13. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    Science.gov (United States)

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

    Reports of poor nursing care have focused attention on values-based selection of candidates onto nursing programmes. Values-based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance, and is conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence, and 3) social connection scores on performance and retention in a cohort of first year nursing and midwifery students in Scotland. A longitudinal, quasi-experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish University. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-Short Form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish University, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 did not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p<0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. Lower scores on the social connection factor were associated

  14. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

    A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean were studied; 100 had antenatal CT pelvimetry for assessment of the pelvis and 119 did not and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section. Twenty-three women (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry and 28 women (28%) underwent emergency cesarean section after trial of labor. In the group who did not have CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight and Apgar scores between the two groups. There was no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)

  15. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

    Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of a significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (or delivering less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and the noise would propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a normal-dose high diagnostic CT image scanned previously may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further exploits ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties. The gain by the use
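
    As a rough illustration of the approach described above, the following Python sketch restores a low-dose 2D slice by nonlocal means whose patch comparisons and averaged intensities are drawn from a registered previous normal-dose image. It is a schematic reading of the abstract only, not the authors' ndiNLM implementation; the function name, default parameters, and weighting details are assumptions.

    import numpy as np

    def ndi_nlm_sketch(low_dose, normal_dose, patch=3, search=11, h=0.02):
        """Schematic prior-induced nonlocal means on a 2D slice.

        For every pixel of the current low-dose image, patches of a registered
        previous normal-dose image inside a search window are compared against
        the patch around that pixel in the low-dose image; the resulting
        similarity weights average normal-dose intensities into the restored
        value. Illustrative assumptions, not the published ndiNLM algorithm.
        """
        pr, sr = patch // 2, search // 2
        pad = sr + pr
        ld = np.pad(low_dose.astype(float), pad, mode="reflect")
        nd = np.pad(normal_dose.astype(float), pad, mode="reflect")
        out = np.zeros_like(low_dose, dtype=float)
        for i in range(low_dose.shape[0]):
            for j in range(low_dose.shape[1]):
                ci, cj = i + pad, j + pad
                ref = ld[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
                num = den = 0.0
                for di in range(-sr, sr + 1):
                    for dj in range(-sr, sr + 1):
                        ni, nj = ci + di, cj + dj
                        cand = nd[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                        # Patch distance measured against the low-noise prior scan
                        w = np.exp(-np.mean((ref - cand) ** 2) / (h * h))
                        num += w * nd[ni, nj]
                        den += w
                out[i, j] = num / den
        return out

    In practice the two scans would be registered first; the abstract notes that, because of how the weights are computed, the method does not depend heavily on exact registration.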

  16. Impact of Vocational Interests, Previous Academic Experience, Gender and Age on Situational Judgement Test Performance

    Science.gov (United States)

    Schripsema, Nienke R.; van Trigt, Anke M.; Borleffs, Jan C. C.; Cohen-Schotanus, Janke

    2017-01-01

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree programme in Medicine at University of Groningen, the…

  17. Impact of vocational interests, previous academic experience, gender and age on Situational Judgement Test performance

    NARCIS (Netherlands)

    Schripsema, Nienke R.; Trigt, van Anke M.; Borleffs, Jan C. C.; Cohen-Schotanus, Janke

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree

  18. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results are discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we describe the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  19. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results are discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we describe the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  20. Impact of previous pharmacy work experience on pharmacy school academic performance.

    Science.gov (United States)

    Mar, Ellena; Barnett, Mitchell J; T-L Tang, Terrill; Sasaki-Hill, Debra; Kuperberg, James R; Knapp, Katherine

    2010-04-12

    To determine whether students' previous pharmacy-related work experience was associated with their pharmacy school performance (academic and clinical). The following measures of student academic performance were examined: pharmacy grade point average (GPA), scores on cumulative high-stakes examinations, and advanced pharmacy practice experience (APPE) grades. The quantity and type of pharmacy-related work experience each student performed prior to matriculation was solicited through a student survey instrument. Survey responses were correlated with academic measures, and demographic-based stratified analyses were conducted. No significant difference in academic or clinical performance between those students with prior pharmacy experience and those without was identified. Subanalyses by work setting, position type, and substantial pharmacy work experience did not reveal any association with student performance. A relationship was found, however, between age and work experience, ie, older students tended to have more work experience than younger students. Prior pharmacy work experience did not affect students' overall academic or clinical performance in pharmacy school. The lack of significant findings may have been due to the inherent practice limitations of nonpharmacist positions, changes in pharmacy education, and the limitations of survey responses.

  1. Impact of Previous Pharmacy Work Experience on Pharmacy School Academic Performance

    Science.gov (United States)

    Mar, Ellena; T-L Tang, Terrill; Sasaki-Hill, Debra; Kuperberg, James R.; Knapp, Katherine

    2010-01-01

    Objectives To determine whether students' previous pharmacy-related work experience was associated with their pharmacy school performance (academic and clinical). Methods The following measures of student academic performance were examined: pharmacy grade point average (GPA), scores on cumulative high-stakes examinations, and advanced pharmacy practice experience (APPE) grades. The quantity and type of pharmacy-related work experience each student performed prior to matriculation was solicited through a student survey instrument. Survey responses were correlated with academic measures, and demographic-based stratified analyses were conducted. Results No significant difference in academic or clinical performance between those students with prior pharmacy experience and those without was identified. Subanalyses by work setting, position type, and substantial pharmacy work experience did not reveal any association with student performance. A relationship was found, however, between age and work experience, ie, older students tended to have more work experience than younger students. Conclusions Prior pharmacy work experience did not affect students' overall academic or clinical performance in pharmacy school. The lack of significant findings may have been due to the inherent practice limitations of nonpharmacist positions, changes in pharmacy education, and the limitations of survey responses. PMID:20498735

  2. The relationship between emotional intelligence, previous caring experience and mindfulness in student nurses and midwives: a cross sectional analysis.

    Science.gov (United States)

    Snowden, Austyn; Stenhouse, Rosie; Young, Jenny; Carver, Hannah; Carver, Fiona; Brown, Norrie

    2015-01-01

    Emotional Intelligence (EI), previous caring experience and mindfulness training may have a positive impact on nurse education. More evidence is needed to support the use of these variables in nurse recruitment and retention. To explore the relationship between EI, gender, age, programme of study, previous caring experience and mindfulness training. Cross-sectional element of a longitudinal study. 938 year-one nursing, midwifery and computing students at two Scottish Higher Education Institutes (HEIs) who entered their programme in September 2013. Participants completed a measure of 'trait' EI: the Trait Emotional Intelligence Questionnaire Short Form (TEIQue-SF); and 'ability' EI: Schutte et al.'s (1998) Emotional Intelligence Scale (SEIS). Demographics, previous caring experience and previous training in mindfulness were recorded. Relationships between variables were tested using non-parametric tests. Emotional intelligence increased with age on both measures of EI [TEIQue-SF H(5)=15.157, p=0.001; SEIS H(5)=11.388, p=0.044]. Females (n=786) scored higher than males (n=149) on both measures [TEIQue-SF, U=44,931, z=-4.509]. Mindfulness training was associated with higher 'ability' emotional intelligence. Implications for recruitment, retention and further research are explored. Copyright © 2014. Published by Elsevier Ltd.

  3. Age, training, and previous experience predict race performance in long-distance inline skaters, not anthropometry.

    Science.gov (United States)

    Knechtle, Beat; Knechtle, Patrizia; Rüst, Christoph Alexander; Rosemann, Thomas; Lepers, Romuald

    2012-02-01

    The association of characteristics of anthropometry, training, and previous experience with race time in 84 recreational, long-distance, inline skaters at the longest inline marathon in Europe (111 km), the Inline One-eleven in Switzerland, was investigated to identify predictor variables for performance. Age, duration per training unit, and personal best time were the only three variables related to race time in a multiple regression, while none of the 16 anthropometric variables were related. Anthropometric characteristics seem to be of no importance for a fast race time in a long-distance inline skating race in contrast to training volume and previous experience, when controlled with covariates. Improving performance in a long-distance inline skating race might be related to a high training volume and previous race experience. Also, doing such a race requires a parallel psychological effort, mental stamina, focus, and persistence. This may be reflected in the preparation and training for the event. Future studies should investigate what motivates these athletes to train and compete.

  4. Impact of vocational interests, previous academic experience, gender and age on Situational Judgement Test performance.

    Science.gov (United States)

    Schripsema, Nienke R; van Trigt, Anke M; Borleffs, Jan C C; Cohen-Schotanus, Janke

    2017-05-01

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree programme in Medicine at University of Groningen, the Netherlands. All applicants for the academic year 2015-2016 were included and had to choose among the learning communities Global Health (n = 126), Sustainable Care (n = 149), Intramural Care (n = 225), and Molecular Medicine (n = 116). This choice was used as a proxy for vocational interest. In addition, all graduate-entry applicants for academic year 2015-2016 (n = 213) were included to examine the effect of previous academic experience on performance. We used MANCOVA analyses with Bonferroni post hoc multiple comparisons tests for applicant performance on a six-scenario SJT. The MANCOVA analyses showed that, for all scenarios, the independent variables were significantly related to performance (Pillai's Trace: 0.02-0.47). The individual predictors were each related to performance on a subset of the scenarios, as was previous academic experience. Gender and age were related to performance on SJT scenarios in different settings. Especially the first effect might be helpful in selecting appropriate candidates for areas of health care in which more professionals are needed.

  5. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  6. Previous experience in manned space flight: A survey of human factors lessons learned

    Science.gov (United States)

    Chandlee, George O.; Woolford, Barbara

    1993-01-01

    Previous experience in manned space flight programs can be used to compile a data base of human factors lessons learned for the purpose of developing aids in the future design of inhabited spacecraft. The objectives are to gather information available from relevant sources, to develop a taxonomy of human factors data, and to produce a data base that can be used in the future for those people involved in the design of manned spacecraft operations. A study is currently underway at the Johnson Space Center with the objective of compiling, classifying, and summarizing relevant human factors data bearing on the lessons learned from previous manned space flights. The research reported defines sources of data, methods for collection, and proposes a classification for human factors data that may be a model for other human factors disciplines.

  7. "My math and me": Nursing students' previous experiences in learning mathematics.

    Science.gov (United States)

    Røykenes, Kari

    2016-01-01

    In this paper, 11 narratives about former experiences in learning mathematics written by nursing students are thematically analyzed. Most students had a positive relationship with the subject in primary school, when they found mathematics fun and were able to master the subject. For some, a change occurred in the transition to lower secondary school. The reasons for this change were found in the subject (increased difficulty), the teachers (movement of teachers, numerous substitute teachers), the class environment and size (many pupils, noise), and the student him- or herself (silent and anonymous pupil). This change was also found in the transition from lower to higher secondary school. By contrast, some students had experienced changes that were positive, and their mathematics teacher was a significant factor in this positive change. The paper emphasizes the importance of previous experiences in learning mathematics to nursing students when learning drug calculation. Copyright © 2015. Published by Elsevier Ltd.

  8. Generalized Bell-inequality experiments and computation

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Department of Computer Science, University of Oxford, Wolfson Building, Parks Road, Oxford OX1 3QD (United Kingdom); Wallman, Joel J. [School of Physics, The University of Sydney, Sydney, New South Wales 2006 (Australia); Browne, Dan E. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom)

    2011-12-15

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.
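
    To make the contrast between local hidden variable bounds and the Popescu-Rohrlich box concrete, the short Python check below (our illustration, not code from the paper) evaluates the CHSH expression for a two-party, two-setting, two-outcome PR box and shows that it reaches the algebraic maximum of 4, above the local bound of 2.

    import itertools

    def pr_box(x, y):
        """P(a, b | x, y) of a two-party Popescu-Rohrlich box: binary outputs
        satisfying a XOR b = x AND y, with uniformly random marginals."""
        return {(a, b): (0.5 if (a ^ b) == (x & y) else 0.0)
                for a, b in itertools.product((0, 1), repeat=2)}

    def correlator(box, x, y):
        # E(x, y) = P(a = b) - P(a != b), mapping outcomes {0, 1} to {+1, -1}
        return sum((-1) ** (a ^ b) * p for (a, b), p in box(x, y).items())

    # CHSH expression S = E(0,0) + E(0,1) + E(1,0) - E(1,1)
    S = (correlator(pr_box, 0, 0) + correlator(pr_box, 0, 1)
         + correlator(pr_box, 1, 0) - correlator(pr_box, 1, 1))
    print(S)  # 4.0: above the local bound 2 (quantum mechanics reaches 2*sqrt(2))

    Any deterministic local strategy assigns fixed ±1 outcomes to each setting, which caps S at 2; the box above exceeds that bound, and it is this kind of nonlocal correlation that the paper generalizes to many parties and non-binary inputs and outputs.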

  9. Generalized Bell-inequality experiments and computation

    International Nuclear Information System (INIS)

    Hoban, Matty J.; Wallman, Joel J.; Browne, Dan E.

    2011-01-01

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation each with a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.

  10. Differences between previously married and never married 'gay' men: family background, childhood experiences and current attitudes.

    Science.gov (United States)

    Higgins, Daryl J

    2004-01-01

    Despite a large body of literature on the development of sexual orientation, little is known about why some gay men have been (or remain) married to a woman. In the current study, a self-selected sample of 43 never-married gay men ('never married') and 26 gay men who were married to a woman ('previously married') completed a self-report questionnaire. Hypotheses were based on five possible explanations for gay men's marriages: (a) differences in sexual orientation (i.e., bisexuality); (b) internalized homophobia; (c) religious intolerance; (d) confusion created because of childhood/adolescent sexual experiences; and/or (e) poor psychological adjustment. Previously married men described their families' religious beliefs as more fundamentalist than never married men did. No differences were found between previously married and never married men in ratings of their sexual orientation and identity, or in levels of homophobia and self-depreciation. Family adaptability and family cohesion and the degree to which respondents reported having experienced child maltreatment did not distinguish between previously married and never married men. The results highlight how little is understood of the reasons why gay men marry, and the need to develop an adequate theoretical model.

  11. Does previous open surgical experience have any influence on robotic surgery simulation exercises?

    Science.gov (United States)

    Cumpanas, Alin Adrian; Bardan, Razvan; Ferician, Ovidiu Catalin; Latcu, Silviu Constantin; Duta, Ciprian; Lazar, Fulger Octavian

    2017-12-01

    Within the last years, there has been a trend in many hospitals to switch their surgical activity from open/laparoscopic procedures to robotic surgery. Some open surgeons have been shifting their activity to robotic surgery. It is still unclear whether there is a transfer of open surgical skills to robotic ones. To evaluate whether such transfer of skills occurs and to identify which specific skills are more significantly transferred from the operative table to the console. Twenty-five volunteers were included in the study, divided into 2 groups: group A (15 participants) - medical students (without any surgical experience in open, laparoscopic or robotic surgery); and group B (10 participants) - surgeons with exclusively open surgical experience, without any previous laparoscopic or robotic experience. Participants were asked to complete 3 robotic simulator console exercises structured from the easiest one (Peg Board) to the toughest one (Sponge Suture). Overall scores for each exercise as well as specific metrics were compared between the two groups. There were no significant differences between overall scores of the two groups for the easiest task. Overall scores were better for group B as the exercises got more complex. For the intermediate and high-difficulty level exercises, most of the specific metrics were better for group B, with the exception of the working master space item. Our results suggest that the open surgical skills transfer to robotic skills, at least for the very beginning of the training process.

  12. Reciprocity, culture and human cooperation: previous insights and a new cross-cultural experiment.

    Science.gov (United States)

    Gächter, Simon; Herrmann, Benedikt

    2009-03-27

    Understanding the proximate and ultimate sources of human cooperation is a fundamental issue in all behavioural sciences. In this paper, we review the experimental evidence on how people solve cooperation problems. Existing studies show without doubt that direct and indirect reciprocity are important determinants of successful cooperation. We also discuss the insights from a large literature on the role of peer punishment in sustaining cooperation. The experiments demonstrate that many people are 'strong reciprocators' who are willing to cooperate and punish others even if there are no gains from future cooperation or any other reputational gains. We document this in new one-shot experiments, which we conducted in four cities in Russia and Switzerland. Our cross-cultural approach furthermore allows us to investigate how the cultural background influences strong reciprocity. Our results show that culture has a strong influence on positive and, especially, on strong negative reciprocity. In particular, we find large cross-cultural differences in 'antisocial punishment' of pro-social cooperators. Further cross-cultural research and experiments involving different socio-demographic groups document that antisocial punishment is much more widespread than previously assumed. Understanding antisocial punishment is an important task for future research because antisocial punishment is a strong inhibitor of cooperation.

  13. Computing challenges of the CMS experiment

    International Nuclear Information System (INIS)

    Krammer, N.; Liko, D.

    2017-01-01

    The success of the LHC experiments is due to the magnificent performance of the detector systems and the excellent operation of the computing systems. The CMS offline software and computing system is successfully fulfilling the LHC Run 2 requirements. For the increased data rate of future LHC operation, together with high pileup interactions, improvements in the usage of the current computing facilities and new technologies have become necessary. Especially for the challenge of the future HL-LHC, a more flexible and sophisticated computing model is needed. In this presentation, I will discuss the current computing system used in LHC Run 2 and future computing facilities for the HL-LHC runs using flexible computing technologies such as commercial and academic computing clouds. The cloud resources are highly virtualized and can be deployed for a variety of computing tasks, providing the capacity for the increasing needs of large-scale scientific computing.

  14. Sharing experience and knowledge with wearable computers

    OpenAIRE

    Nilsson, Marcus; Drugge, Mikael; Parnes, Peter

    2004-01-01

    Wearable computers have mostly been studied when used in isolation. But a wearable computer with an Internet connection is a good tool for communication and for sharing knowledge and experience with other people. The unobtrusiveness of this type of equipment makes it easy to communicate in most types of locations and contexts. The wearable computer makes it easy to be a mediator of other people's knowledge and to become a knowledgeable user. This paper describes the experience gained from testing

  15. The relationship of previous training and experience of journal peer reviewers to subsequent review quality.

    Directory of Open Access Journals (Sweden)

    Michael L Callaham

    2007-01-01

    Full Text Available BACKGROUND: Peer review is considered crucial to the selection and publication of quality science, but very little is known about the previous experiences and training that might identify high-quality peer reviewers. The reviewer selection processes of most journals, and thus the qualifications of their reviewers, are ill defined. More objective selection of peer reviewers might improve the journal peer review process and thus the quality of published science. METHODS AND FINDINGS: 306 experienced reviewers (71% of all those associated with a specialty journal) completed a survey of past training and experiences postulated to improve peer review skills. Reviewers performed 2,856 reviews of 1,484 separate manuscripts during a four-year study period, all prospectively rated on a standardized quality scale by editors. Multivariable analysis revealed that most variables, including academic rank, formal training in critical appraisal or statistics, or status as principal investigator of a grant, failed to predict performance of higher-quality reviews. The only significant predictors of quality were working in a university-operated hospital versus other teaching environment and relative youth (under ten years of experience after finishing training). Being on an editorial board and doing formal grant (study section) review were each predictors for only one of our two comparisons. However, the predictive power of all variables was weak. CONCLUSIONS: Our study confirms that there are no easily identifiable types of formal training or experience that predict reviewer performance. Skill in scientific peer review may be as ill defined and hard to impart as is "common sense." Without a better understanding of those skills, it seems unlikely journals and editors will be successful in systematically improving their selection of reviewers. This inability to predict performance makes it imperative that all but the smallest journals implement routine review ratings

  16. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...
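
    As a generic illustration of such a screening funnel (the element list, band-gap window, and toy records below are hypothetical and are not the descriptors or database used in the study), a prescreening on composition followed by a property filter might look like this in Python:

    # Hypothetical screening funnel: composition filter, then a band-gap window.
    # The element set, gap window and example records are illustrative only.
    ABUNDANT_NONTOXIC = {"Si", "Fe", "Cu", "Zn", "S", "O", "N", "C", "Na", "K", "Ca", "Ti"}

    candidates = [  # toy records standing in for a materials database query
        {"formula": "CuZnS2", "elements": {"Cu", "Zn", "S"}, "gap_eV": 1.4},
        {"formula": "TiO2",   "elements": {"Ti", "O"},       "gap_eV": 3.2},
        {"formula": "PbI2",   "elements": {"Pb", "I"},       "gap_eV": 2.3},
    ]

    def passes_prescreen(entry, max_elements=3):
        # Keep compounds of up to three elements, all abundant and nontoxic
        return (len(entry["elements"]) <= max_elements
                and entry["elements"] <= ABUNDANT_NONTOXIC)

    def passes_gap_window(entry, lo=1.0, hi=2.5):
        # Rough window for a light absorber; purely illustrative numbers
        return lo <= entry["gap_eV"] <= hi

    shortlist = [c["formula"] for c in candidates
                 if passes_prescreen(c) and passes_gap_window(c)]
    print(shortlist)  # ['CuZnS2']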

  17. ATLAS Distributed Computing: Experience and Evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2013-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centers around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics program including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2014 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  18. ATLAS distributed computing: experience and evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25/fb of data. The total volume of beam and simulated data products exceeds 100~PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  19. Do previous sports experiences influence the effect of an enrichment programme in basketball skills?

    Science.gov (United States)

    Santos, Sara; Mateus, Nuno; Sampaio, Jaime; Leite, Nuno

    2017-09-01

    The aim of this study was to examine the effect of an enrichment programme on motor, technical and tactical basketball skills, when accounting for the age of youth sport specialisation. Seventy-six college students (age: M = 20.4, SD = 1.9) were allocated to three different paths: (i) non-structured (n = 14), (ii) early specialisation (n = 34), and (iii) late specialisation (n = 28), based on information previously provided by the participants about the quantity and type of sporting activities performed throughout their sporting careers. Then, the participants of each path were randomly distributed across control and experimental groups. Variables under study included agility, a technical skills circuit, as well as tactical actions performed in a 4-on-4 full-court basketball game. The results indicated improvements in the early and late specialisation paths, namely in the experimental training groups. However, the late specialisation path revealed larger benefits, in contrast with the non-structured path, which showed less sensitivity to the enrichment programme, mostly sustained in physical literacy and differential learning. Higher improvements were observed in agility, and also in reducing the number of unsuccessful actions performed during the game. Overall, this study provided evidence of how early sports experiences affect basketball skill acquisition and contribute to adapting to new contexts with motor and technical-tactical challenges. In addition, a path supported by late specialisation might present several advantages in sport performance achievement.

  20. Mental Rotation Ability and Computer Game Experience

    Science.gov (United States)

    Gecu, Zeynep; Cagiltay, Kursat

    2015-01-01

    Computer games, which are currently very popular among students, can affect different cognitive abilities. The purpose of the present study is to examine undergraduate students' experiences and preferences in playing computer games as well as their mental rotation abilities. A total of 163 undergraduate students participated. The results showed a…

  1. ATLAS distributed computing: experience and evolution

    International Nuclear Information System (INIS)

    Nairz, A

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb −1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, energies and event complexities. An essential requirement will be the efficient utilisation of current and future processor technologies as well as a broad range of computing platforms, including supercomputing and cloud resources. We will report on experience gained thus far and our progress in preparing ATLAS computing for the future

  2. TU-CD-BRD-01: Making Incident Learning Practical and Useful: Challenges and Previous Experiences

    International Nuclear Information System (INIS)

    Ezzell, G.

    2015-01-01

    It has long been standard practice in radiation oncology to report internally when a patient’s treatment has not gone as planned and to report events to regulatory agencies when legally required. Most potential errors are caught early and never affect the patient. Quality assurance steps routinely prevent errors from reaching the patient, and these “near misses” are much more frequent than treatment errors. A growing number of radiation oncology facilities have implemented incident learning systems to report and analyze both errors and near misses. Using the term “incident learning” instead of “event reporting” emphasizes the need to use these experiences to change the practice and make future errors less likely and promote an educational, non-punitive environment. There are challenges in making such a system practical and effective. Speakers from institutions of different sizes and practice environments will share their experiences on how to make such a system work and what benefits their clinics have accrued. Questions that will be addressed include: How to create a system that is easy for front line staff to access How to motivate staff to report How to promote the system as positive and educational and not punitive or demeaning How to organize the team for reviewing and responding to reports How to prioritize which reports to discuss in depth How not to dismiss the rest How to identify underlying causes How to design corrective actions and implement change How to develop useful statistics and analysis tools How to coordinate a departmental system with a larger risk management system How to do this without a dedicated quality manager Some speakers’ experience is with in-house systems and some will share experience with the AAPM/ASTRO national Radiation Oncology Incident Learning System (RO-ILS). Reports intended to be of value nationally need to be comprehensible to outsiders; examples of useful reports will be shown. There will be ample time set

  3. TU-CD-BRD-01: Making Incident Learning Practical and Useful: Challenges and Previous Experiences

    Energy Technology Data Exchange (ETDEWEB)

    Ezzell, G. [Mayo Clinic Arizona (United States)

    2015-06-15

    It has long been standard practice in radiation oncology to report internally when a patient’s treatment has not gone as planned and to report events to regulatory agencies when legally required. Most potential errors are caught early and never affect the patient. Quality assurance steps routinely prevent errors from reaching the patient, and these “near misses” are much more frequent than treatment errors. A growing number of radiation oncology facilities have implemented incident learning systems to report and analyze both errors and near misses. Using the term “incident learning” instead of “event reporting” emphasizes the need to use these experiences to change the practice and make future errors less likely and promote an educational, non-punitive environment. There are challenges in making such a system practical and effective. Speakers from institutions of different sizes and practice environments will share their experiences on how to make such a system work and what benefits their clinics have accrued. Questions that will be addressed include: How to create a system that is easy for front line staff to access How to motivate staff to report How to promote the system as positive and educational and not punitive or demeaning How to organize the team for reviewing and responding to reports How to prioritize which reports to discuss in depth How not to dismiss the rest How to identify underlying causes How to design corrective actions and implement change How to develop useful statistics and analysis tools How to coordinate a departmental system with a larger risk management system How to do this without a dedicated quality manager Some speakers’ experience is with in-house systems and some will share experience with the AAPM/ASTRO national Radiation Oncology Incident Learning System (RO-ILS). Reports intended to be of value nationally need to be comprehensible to outsiders; examples of useful reports will be shown. There will be ample time set

  4. Previous experience of family violence and intimate partner violence in pregnancy.

    Science.gov (United States)

    Ludermir, Ana Bernarda; Araújo, Thália Velho Barreto de; Valongueiro, Sandra Alves; Muniz, Maria Luísa Corrêa; Silva, Elisabete Pereira

    2017-01-01

    To estimate differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. A nested case-control study was carried out within a cohort study with 1,120 pregnant women aged 18-49 years old, who were registered in the Family Health Strategy of the city of Recife, State of Pernambuco, Brazil, between 2005 and 2006. The cases were the 233 women who reported intimate partner violence in pregnancy and the controls were the 499 women who did not report it. Partner violence in pregnancy and previous experiences of violence committed by parents or other family members were assessed with a standardized questionnaire. Multivariate logistic regression analyses were modeled to identify differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. Having seen the mother suffer intimate partner violence was associated with physical violence in childhood (OR = 2.62; 95%CI 1.89-3.63) and in adolescence (OR = 1.47; 95%CI 1.01-2.13), sexual violence in childhood (OR = 3.28; 95%CI 1.68-6.38) and intimate partner violence during pregnancy (OR = 1.47; 95%CI 1.01-2.12). The intimate partner violence during pregnancy was frequent in women who reported more episodes of physical violence in childhood (OR = 2.08; 95%CI 1.43-3.02) and adolescence (OR = 1.63; 95%CI 1.07-2.47), who suffered sexual violence in childhood (OR = 3.92; 95%CI 1.86-8.27), and who perpetrated violence against the partner (OR = 8.67; 95%CI 4.57-16.45). Experiences of violence committed by parents or other family members emerge as strong risk factors for intimate partner violence in pregnancy. Identifying and understanding protective and risk factors for the emergence of intimate partner violence in pregnancy and its maintenance may help policymakers and health service managers to develop intervention strategies.
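
    The odds ratios and 95% confidence intervals quoted above come from multivariate logistic regression. The sketch below is a rough illustration of how such figures are derived; the data are synthetic, the statsmodels package is assumed to be available, and the variable names are hypothetical, not the study's actual dataset or model.

        # Illustrative only: how an OR and its 95% CI fall out of a logit coefficient.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 700
        exposed = rng.integers(0, 2, n)                      # e.g. physical violence in childhood (0/1)
        p = 1.0 / (1.0 + np.exp(-(-1.0 + 0.9 * exposed)))    # assumed true effect for the toy data
        outcome = rng.binomial(1, p)                         # e.g. IPV victimisation in pregnancy (0/1)

        X = sm.add_constant(exposed)
        fit = sm.Logit(outcome, X).fit(disp=False)

        odds_ratio = np.exp(fit.params[1])                   # exponentiated coefficient = OR
        ci_low, ci_high = np.exp(fit.conf_int()[1])          # exponentiated CI bounds
        print(f"OR = {odds_ratio:.2f}; 95%CI {ci_low:.2f}-{ci_high:.2f}")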

  5. Previous experience of family violence and intimate partner violence in pregnancy

    Directory of Open Access Journals (Sweden)

    Ana Bernarda Ludermir

    2017-09-01

    OBJECTIVE: To estimate differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. METHODS: A nested case-control study was carried out within a cohort study with 1,120 pregnant women aged 18–49 years old, who were registered in the Family Health Strategy of the city of Recife, State of Pernambuco, Brazil, between 2005 and 2006. The cases were the 233 women who reported intimate partner violence in pregnancy and the controls were the 499 women who did not report it. Partner violence in pregnancy and previous experiences of violence committed by parents or other family members were assessed with a standardized questionnaire. Multivariate logistic regression analyses were modeled to identify differential associations between the exposure to violence in the family of origin and victimization and perpetration of intimate partner violence in pregnancy. RESULTS: Having seen the mother suffer intimate partner violence was associated with physical violence in childhood (OR = 2.62; 95%CI 1.89–3.63) and in adolescence (OR = 1.47; 95%CI 1.01–2.13), sexual violence in childhood (OR = 3.28; 95%CI 1.68–6.38) and intimate partner violence during pregnancy (OR = 1.47; 95%CI 1.01–2.12). The intimate partner violence during pregnancy was frequent in women who reported more episodes of physical violence in childhood (OR = 2.08; 95%CI 1.43–3.02) and adolescence (OR = 1.63; 95%CI 1.07–2.47), who suffered sexual violence in childhood (OR = 3.92; 95%CI 1.86–8.27), and who perpetrated violence against the partner (OR = 8.67; 95%CI 4.57–16.45). CONCLUSIONS: Experiences of violence committed by parents or other family members emerge as strong risk factors for intimate partner violence in pregnancy. Identifying and understanding protective and risk factors for the emergence of intimate partner violence in pregnancy and its maintenance may help

  6. The Computer Game as a Somatic Experience

    DEFF Research Database (Denmark)

    Nielsen, Henrik Smed

    2010-01-01

    This article describes the experience of playing computer games. With a media archaeological outset, the relation between human and machine is emphasised as the key to understanding the experience. This relation is further explored by drawing on a phenomenological philosophy of technology which…

  7. Computing and data handling recent experiences at Fermilab and SLAC

    International Nuclear Information System (INIS)

    Cooper, P.S.

    1990-01-01

    Computing has become ever more central to the doing of high energy physics. There are now major second and third generation experiments for which the largest single cost is computing. At the same time the availability of "cheap" computing has made possible experiments which were previously considered infeasible. The result of this trend has been an explosion of computing and computing needs. I will review here the magnitude of the problem, as seen at Fermilab and SLAC, and the present methods for dealing with it. I will then undertake the dangerous assignment of projecting the needs and solutions forthcoming in the next few years at both laboratories. I will concentrate on the "offline" problem: the process of turning terabytes of data tapes into pages of physics journals. 5 refs., 4 figs., 4 tabs

  8. The Impact of an International Cultural Experience on Previously Held Stereotypes by American Student Nurses.

    Science.gov (United States)

    Heuer, Loretta; Bengiamin, Marlene; Downey, Vicki Wessman

    2001-01-01

    Examined stereotypes held by U.S. student nurses before and after participating in an educational experience in Russia. The experience was intended to prepare them to be effective nurses in multicultural health care settings. Data from student interviews indicated that the experience changed students' stereotyped attitudes about Russian culture…

  9. Previous Experiences with Epilepsy and Effectiveness of Information to Change Public Perception of Epilepsy

    NARCIS (Netherlands)

    Gutteling, Jan M.; Seydel, E.R.; Wiegman, O.

    1986-01-01

    Differences with regard to the effectiveness of health information and attitude change are suggested between people with direct, behavioral experiences with a health topic and people with indirect, nonbehavioral experiences. The effects of three different methods of health education about epilepsy,

  10. Study of some physical aspects previous to design of an exponential experiment

    International Nuclear Information System (INIS)

    Caro, R.; Francisco, J. L. de

    1961-01-01

    This report presents the theoretical study of some physical aspects previous to the design of an exponential facility. They are: fast and slow flux distribution in the multiplicative medium and in the thermal column, slowing down in the thermal column, geometrical distribution and minimum needed intensity of the sources, access channels, and perturbations produced by possible variations in their position and intensity. (Author) 4 refs

  11. RC Circuits: Some Computer-Interfaced Experiments.

    Science.gov (United States)

    Jolly, Pratibha; Verma, Mallika

    1994-01-01

    Describes a simple computer-interface experiment for recording the response of an RC network to an arbitrary input excitation. The setup is used to pose a variety of open-ended investigations in network modeling by varying the initial conditions, input signal waveform, and the circuit topology. (DDR)
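
    A computer-interfaced RC measurement of this kind is usually compared against a simple numerical model of the network. Below is a minimal sketch of such a model for a step input; the component values, time step and step voltage are arbitrary choices for illustration and are not taken from the article.

        # Euler integration of the capacitor voltage in a series RC network driven by a step.
        R, C = 10e3, 100e-9          # 10 kOhm, 100 nF -> time constant tau = 1 ms (assumed values)
        dt, steps = 1e-5, 500        # 5 ms of simulated time
        v, v_in = 0.0, 5.0           # capacitor starts discharged; 5 V step input

        for n in range(1, steps + 1):
            v += dt * (v_in - v) / (R * C)   # dVc/dt = (Vin - Vc)/(RC), one Euler step
            if n % 100 == 0:
                print(f"t = {n*dt*1e3:3.0f} ms   Vc = {v:.3f} V")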

  12. Incorporating lab experience into computer security courses

    NARCIS (Netherlands)

    Ben Othmane, L.; Bhuse, V.; Lilien, L.T.

    2013-01-01

    We describe our experience with teaching computer security labs at two different universities. We report on the hardware and software lab setups, summarize lab assignments, present the challenges encountered, and discuss the lessons learned. We agree with and emphasize the viewpoint that security

  13. Volunteer computing experience with ATLAS@Home

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00068610; The ATLAS collaboration; Bianchi, Riccardo-Maria; Cameron, David; Filipčič, Andrej; Lançon, Eric; Wu, Wenjing

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  14. Volunteer Computing Experience with ATLAS@Home

    CERN Document Server

    Cameron, David; The ATLAS collaboration; Bourdarios, Claire; Lan\\c con, Eric

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers' resources make up a sizable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one job to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  15. Volunteer Computing Experience with ATLAS@Home

    Science.gov (United States)

    Adam-Bourdarios, C.; Bianchi, R.; Cameron, D.; Filipčič, A.; Isacchini, G.; Lançon, E.; Wu, W.; ATLAS Collaboration

    2017-10-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  16. Computer Based Road Accident Reconstruction Experiences

    Directory of Open Access Journals (Sweden)

    Milan Batista

    2005-03-01

    Since road accident analyses and reconstructions are increasingly based on specific computer software for simulation of vehicle driving dynamics and collision dynamics, and for simulation of a set of trial runs from which the model that best describes a real event can be selected, the paper presents an overview of some computer software and methods available to accident reconstruction experts. Besides being time-saving, when properly used such computer software can provide more authentic and more trustworthy accident reconstruction. Therefore, practical experiences obtained while using computer software tools for road accident reconstruction in the Transport Safety Laboratory at the Faculty for Maritime Studies and Transport of the University of Ljubljana are presented and discussed. This paper also addresses software technology for extracting maximum information from the accident photo-documentation to support accident reconstruction based on the simulation software, as well as the field work of reconstruction experts or police on the road accident scene defined by this technology.

  17. A Survey of Patients' Preoperative Need for Information About Postoperative Pain-Effect of Previous Surgery Experience.

    Science.gov (United States)

    Mavridou, Paraskevi; Manataki, Adamantia; Arnaoutoglou, Elena; Damigos, Dimitrios

    2017-10-01

    The aim of this study was to determine the kind of information patients need preoperatively about postoperative pain (POP) and whether this is affected by previous surgery experience. A descriptive study design using preoperative questionnaires. Questionnaires with fixed questions related to POP and its management were distributed preoperatively to consenting, consecutive surgical patients. Patients were divided into two groups: patients with previous surgery experience (group A) and patients without previous surgery experience (group B). Of the patients who participated in the study, 94.2% wanted information about POP, and 77.8% of them believed that they would feel calmer if they got the information they needed. The patients' biggest concern related to pain management issues after discharge. Next in order of preference was information about the analgesics they would need to take. The patients wanted to be informed primarily through a personal interview (59.4%). Previous surgery experience had no effect on patients' needs for information. Most of the patients wanted to be informed about the management of POP after being discharged. It is remarkable that patients who had previous surgery experience needed the same information as those who had none. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  18. Computational Experiments for Science and Engineering Education

    Science.gov (United States)

    Xie, Charles

    2011-01-01

    How to integrate simulation-based engineering and science (SBES) into the science curriculum smoothly is a challenging question. For the importance of SBES to be appreciated, the core value of simulations, namely that they help people understand natural phenomena and solve engineering problems, must be taught. A strategy to achieve this goal is to introduce computational experiments to the science curriculum to replace or supplement textbook illustrations and exercises and to complement or frame hands-on or wet lab experiments. In this way, students will have an opportunity to learn about SBES without compromising other learning goals required by the standards, and teachers will welcome these tools as they strengthen what they are already teaching. This paper demonstrates this idea using a number of examples in physics, chemistry, and engineering. These exemplary computational experiments show that it is possible to create a curriculum that is both deeper and wider.

  19. The influence of previous subject experience on interactions during peer instruction in an introductory physics course: A mixed methods analysis

    Science.gov (United States)

    Vondruska, Judy A.

    Over the past decade, peer instruction and the introduction of student response systems have provided a means of improving student engagement and achievement in large-lecture settings. While the nature of the student discourse occurring during peer instruction is less understood, existing studies have shown student ideas about the subject, extraneous cues, and confidence level appear to matter in the student-student discourse. Using a mixed methods research design, this study examined the influence of previous subject experience on peer instruction in an introductory, one-semester Survey of Physics course. Quantitative results indicated students in discussion pairs where both had previous subject experience were more likely to answer clicker questions correctly both before and after peer discussion compared to student groups where neither partner had previous subject experience. Students in mixed discussion pairs were not statistically different in correct response rates from the other pairings. There was no statistically significant difference between the experience pairs on unit exam scores or the Peer Instruction Partner Survey. Although there was a statistically significant difference between the pre-MPEX and post-MPEX scores, there was no difference between the members of the various subject experience peer discussion pairs. The qualitative study, conducted after the quantitative study, helped to inform the quantitative results by exploring the nature of the peer interactions through survey questions and a series of focus group discussions. While the majority of participants described a benefit to the use of clickers in the lecture, their experience with their discussion partners varied. Students with previous subject experience tended to describe peer instruction more positively than students who did not have previous subject experience, regardless of the experience level of their partner. They were also more likely to report favorable levels of comfort with

  20. [A brief history of resuscitation - the influence of previous experience on modern techniques and methods].

    Science.gov (United States)

    Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam

    2015-02-01

    Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, the first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging the victims upside down and compressing the chest in order to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper regarding heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time the guidelines have been modified and improved numerous times by the two leading world expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years. Currently the 2010 guidelines are the ones to be followed. In this paper the authors attempt to present the history of the development of resuscitation techniques and methods and to assess the influence of previous lifesaving methods on present-day technologies, equipment and guidelines, which help those women and men whose lives are in danger due to sudden cardiac arrest. © 2015 MEDPRESS.

  1. Bevacizumab plus chemotherapy in elderly patients with previously untreated metastatic colorectal cancer: single center experience

    International Nuclear Information System (INIS)

    Ocvirk, Janja; Moltara, Maja Ebert; Mesti, Tanja; Boc, Marko; Rebersek, Martina; Volk, Neva; Benedik, Jernej; Hlebanja, Zvezdana

    2016-01-01

    Metastatic colorectal cancer (mCRC) is mainly a disease of the elderly; however, the geriatric population is underrepresented in clinical trials. Patient registries represent a tool to assess and follow treatment outcomes in this patient population. The aim of the study was to use the patient registry to determine the safety and efficacy of bevacizumab plus chemotherapy in elderly patients with previously untreated metastatic colorectal cancer. The registry of patients with mCRC was designed to prospectively evaluate the safety and efficacy of bevacizumab-containing chemotherapy as well as the selection of patients in routine clinical practice. Patient baseline clinical characteristics, pre-specified bevacizumab-related adverse events, and efficacy data were collected, evaluated and compared according to the age categories. Between January 2008 and December 2010, 210 patients with mCRC (median age 63, male 61.4%) started bevacizumab-containing therapy in the first-line setting. The majority of the 210 patients received irinotecan-based chemotherapy (68%) as first-line treatment, and 105 patients (50%) received bevacizumab maintenance therapy. Elderly (≥ 70 years) patients represented 22.9% of all patients and had worse performance status (PS 1/2, 62.4%) than patients in the < 70 years group (PS 1/2, 35.8%). The difference in disease control rate was mainly due to the inability to assess response in the elderly group (64.6% in the elderly and 77.8% in the < 70 years group, p = 0.066). The median progression-free survival was 10.2 (95% CI, 6.7–16.2) and 11.3 (95% CI, 10.2–12.6) months in the elderly and < 70 years group, respectively (p = 0.58). The median overall survival was 18.5 (95% CI, 12.4–28.9) and 27.4 (95% CI, 22.7–31.9) months for the elderly and < 70 years group, respectively (p = 0.03). The three-year survival rate was 26% and 37.6% in the elderly vs. < 70 years group (p = 0.03). Overall rates of bevacizumab-related adverse events were similar in both groups: proteinuria 21
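
    The median progression-free and overall survival times and the three-year survival rates reported above are the kind of quantities read off Kaplan-Meier estimates. A minimal sketch follows, assuming the lifelines package is available; the follow-up times are synthetic and are not the registry data.

        # Kaplan-Meier estimate of median OS and a 3-year survival rate on made-up data.
        import numpy as np
        from lifelines import KaplanMeierFitter

        rng = np.random.default_rng(1)
        months = rng.exponential(scale=27.0, size=200)   # hypothetical survival times (months)
        observed = rng.random(200) < 0.7                 # True = death observed, False = censored

        kmf = KaplanMeierFitter()
        kmf.fit(months, event_observed=observed)

        print("median OS (months):", kmf.median_survival_time_)
        print("3-year survival rate:", float(kmf.survival_function_at_times(36).iloc[0]))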

  2. Reliability and smallest worthwhile difference in 1RM tests according to previous resistance training experience in young women

    Directory of Open Access Journals (Sweden)

    Matheus Amarante do Nascimento

    2017-10-01

    The objective of this study was to determine the familiarization and smallest worthwhile difference (SWD) of one-repetition maximum (1RM) tests in detrained women according to their previous resistance training experience. Three groups of women with varying amounts of previous resistance training experience were recruited: Novice (n = 27, 1 to 6 months), Intermediate (n = 13, from 7 to 12 months), and Advanced (n = 20, 13 to 24 months). All participants performed four 1RM test sessions in the bench press (BP), squat (SQ), and arm curl (AC). A significant (p < 0.05) group vs. time interaction was observed in SQ, suggesting that more experienced participants needed fewer 1RM test sessions to reach a stable load compared to the less experienced groups. Strength changes in BP and AC did not differ between groups (p > 0.05), suggesting that experience had no impact on familiarization for these lifts. SWDs suggest that strength gains greater than 2-4% in these lifts would indicate a meaningful improvement in strength beyond random variation from trial to trial, no matter the experience of the subject. Women with limited previous resistance training experience do not require more trials to reach load stabilization than those with more experience. Stability of 1RM loads for BP and AC may require only two sessions, while SQ may require at least three trials.
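
    Trial-to-trial reliability and the smallest worthwhile difference are often expressed through the coefficient of variation and a fraction of the between-subject standard deviation. The sketch below uses made-up 1RM loads and assumes SWD = 0.2 times the between-subject SD, which is one common convention and may differ from the authors' exact formula.

        # Illustrative CV% and SWD from four hypothetical 1RM sessions.
        import numpy as np

        loads = np.array([           # rows = participants, columns = sessions 1-4 (kg, invented)
            [40.0, 42.5, 42.5, 42.5],
            [55.0, 57.5, 57.5, 60.0],
            [35.0, 35.0, 37.5, 37.5],
            [47.5, 50.0, 50.0, 50.0],
        ])

        within_cv = (loads.std(axis=1, ddof=1) / loads.mean(axis=1)).mean() * 100
        between_sd = loads.mean(axis=1).std(ddof=1)
        swd = 0.2 * between_sd                          # assumed convention, not the paper's
        swd_pct = swd / loads.mean() * 100

        print(f"mean within-subject CV: {within_cv:.1f}%")
        print(f"SWD: {swd:.1f} kg ({swd_pct:.1f}% of the grand mean)")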

  3. Previous International Experience, Cross-Cultural Training, and Expatriates' Cross-Cultural Adjustment: Effects of Cultural Intelligence and Goal Orientation

    Science.gov (United States)

    Koo Moon, Hyoung; Kwon Choi, Byoung; Shik Jung, Jae

    2012-01-01

    Although various antecedents of expatriates' cross-cultural adjustment have been addressed, previous international experience, predeparture cross-cultural training, and cultural intelligence (CQ) have been most frequently examined. However, there are few attempts that explore the effects of these antecedents simultaneously or consider the possible…

  4. The Cat Is out of the Bag: The Joint Influence of Previous Experience and Looking Behavior on Infant Categorization

    Science.gov (United States)

    Kovack-Lesh, Kristine A.; Horst, Jessica S.; Oakes, Lisa M.

    2008-01-01

    We examined the effect of 4-month-old infants' previous experience with dogs, cats, or both and their online looking behavior on their learning of the adult-defined category of "cat" in a visual familiarization task. Four-month-old infants' (N = 123) learning in the laboratory was jointly determined by whether or not they had experience…

  5. Performing quantum computing experiments in the cloud

    Science.gov (United States)

    Devitt, Simon J.

    2016-09-01

    Quantum computing technology has reached a second renaissance in the past five years. Increased interest from both the private and public sector, combined with extraordinary theoretical and experimental progress, has solidified this technology as a major advancement in the 21st century. As anticipated by many, some of the first realizations of quantum computing technology have occurred over the cloud, with users logging onto dedicated hardware over the classical internet. Recently, IBM has released the Quantum Experience, which allows users to access a five-qubit quantum processor. In this paper we take advantage of this online availability of actual quantum hardware and present four quantum information experiments. We utilize the IBM chip to realize protocols in quantum error correction, quantum arithmetic, quantum graph theory, and fault-tolerant quantum computation by accessing the device remotely through the cloud. While the results are subject to significant noise, the correct results are returned from the chip. This demonstrates the power of experimental groups opening up their technology to a wider audience and will hopefully allow for the next stage of development in quantum information technology.
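
    The kind of two-qubit entanglement check that such a cloud-accessible chip can run may be mimicked offline with a toy statevector simulation. The sketch below uses plain NumPy and is not the IBM Quantum Experience API; it only illustrates the expected measurement statistics of a Bell state.

        # Prepare (|00> + |11>)/sqrt(2) and sample 1024 "shots" from the ideal distribution.
        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])                   # control = first qubit
        I2 = np.eye(2)

        state = np.zeros(4); state[0] = 1.0               # start in |00>
        state = CNOT @ np.kron(H, I2) @ state             # H on qubit 0, then CNOT

        probs = np.abs(state) ** 2
        rng = np.random.default_rng(7)
        counts = np.bincount(rng.choice(4, size=1024, p=probs), minlength=4)
        for basis, c in zip(["00", "01", "10", "11"], counts):
            print(basis, c)                               # ideally only 00 and 11 appear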

  6. The Effect of Previous Co-Worker Experience on the Survival of Knowledge Intensive Start-Ups

    DEFF Research Database (Denmark)

    Timmermans, Bram

    The aim of the paper is to investigate the effect of previous co-worker experience on the survival of knowledge intensive start-ups. For the empirical analysis I use the Danish Integrated Database of Labor Market Research (IDA). This longitudinal employer-employee database allows me to identify co-worker experience among all members of the firm. In addition, I will make a distinction between ordinary start-ups and entrepreneurial spin-offs. The results show that previous co-worker experience has a positive effect on new firm survival. This effect appears to be valid predominantly for ordinary start-ups than

  7. A randomised clinical trial of intrapartum fetal monitoring with computer analysis and alerts versus previously available monitoring

    Directory of Open Access Journals (Sweden)

    Santos Cristina

    2010-10-01

    Background: Intrapartum fetal hypoxia remains an important cause of death and permanent handicap, and in a significant proportion of cases there is evidence of suboptimal care related to fetal surveillance. Cardiotocographic (CTG) monitoring remains the basis of intrapartum surveillance, but its interpretation by healthcare professionals lacks reproducibility and the technology has not been shown to improve clinically important outcomes. The addition of fetal electrocardiogram analysis has increased the potential to avoid adverse outcomes, but CTG interpretation remains its main weakness. A program for computerised analysis of intrapartum fetal signals, incorporating real-time alerts for healthcare professionals, has recently been developed. There is a need to determine whether this technology can result in better perinatal outcomes. Methods/design: This is a multicentre randomised clinical trial. Inclusion criteria are: women aged ≥ 16 years, able to provide written informed consent, singleton pregnancies ≥ 36 weeks, cephalic presentation, no known major fetal malformations, in labour but excluding active second stage, planned for continuous CTG monitoring, and no known contra-indication for vaginal delivery. Eligible women will be randomised using a computer-generated randomisation sequence to one of the two arms: continuous computer analysis of fetal monitoring signals with real-time alerts (intervention arm) or continuous CTG monitoring as previously performed (control arm). Electrocardiographic monitoring and fetal scalp blood sampling will be available in both arms. The primary outcome measure is the incidence of fetal metabolic acidosis (umbilical artery pH ecf > 12 mmol/L). Secondary outcome measures are: caesarean section and instrumental vaginal delivery rates, use of fetal blood sampling, 5-minute Apgar score. Discussion: This study will provide evidence of the impact of intrapartum monitoring with computer analysis and real
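
    The protocol's computer-generated randomisation sequence can be illustrated with a small permuted-block allocator. The block size, arm labels, seed and function name below are assumptions made for the sketch; the trial's actual procedure may differ.

        # Hypothetical permuted-block allocation for a two-arm trial.
        import random

        def permuted_block_sequence(n_participants, block_size=4, seed=42):
            random.seed(seed)
            arms = []
            while len(arms) < n_participants:
                block = ["computer analysis + alerts", "conventional CTG"] * (block_size // 2)
                random.shuffle(block)        # balanced within each block of 4
                arms.extend(block)
            return arms[:n_participants]

        for i, arm in enumerate(permuted_block_sequence(8), start=1):
            print(f"participant {i:02d}: {arm}")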

  8. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

    Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies, and more significantly, higher ASP values in the heads of wing bones than the shaft. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than those found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far, supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
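
    The conventional diameter-based estimate mentioned in the abstract reduces, for an idealised circular cross-section, to the ratio of inner (medullary) to outer cross-sectional areas, i.e. (inner diameter / outer diameter) squared. A worked example with invented diameters:

        # ASP from bone wall geometry, assuming a circular cross-section.
        def asp_from_diameters(inner_diameter_mm: float, outer_diameter_mm: float) -> float:
            return (inner_diameter_mm / outer_diameter_mm) ** 2

        print(asp_from_diameters(8.6, 9.4))   # ~0.84: mostly air, a thin bone wall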

  9. Previous experiences and emotional baggage as barriers to lifestyle change - a qualitative study of Norwegian Healthy Life Centre participants.

    Science.gov (United States)

    Følling, Ingrid S; Solbjør, Marit; Helvik, Anne-S

    2015-06-23

    Changing lifestyle is challenging and difficult. The Norwegian Directorate of Health recommends that all municipalities establish Healthy Life Centres targeted to people with lifestyle issues. Little is known about the background, experiences and reflections of participants. More information is needed about participants to shape effective lifestyle interventions with lasting effect. This study explores how participants in a lifestyle intervention programme describe previous life experiences in relation to changing lifestyle. Semi-structured qualitative in-depth interviews were performed with 23 participants (16 women and 7 men) aged 18-70 years. The data were analysed using systematic text condensation, searching for issues describing participants' responses and looking for the essence, aiming to share the basis of life-world experiences as valid knowledge. Participants identified two main themes: being stuck in old habits, and being burdened with emotional baggage from their previous negative experiences. Participants expressed a wish to change their lifestyles, but were unable to act in accordance with the health knowledge they possessed. Previous experiences with lifestyle change kept them from initiating attempts without professional assistance. Participants also described being burdened by emotional baggage, with problems from childhood and/or with family, work and social life issues. Respondents said that they felt that emotional baggage was an important explanation for why they were stuck in old habits and that, conversely, being stuck in old habits added load to their existing emotional baggage and made it heavier. Behavioural change can be hard to perform as psychological distress from life baggage can influence the ability to change. The study participants' experience of being stuck in old habits and having substantial emotional baggage raises questions as to whether or not Healthy Life Centres are able to help participants who need to make a lifestyle

  10. Distributed computing grid experiences in CMS

    CERN Document Server

    Andreeva, Julia; Barrass, T; Bonacorsi, D; Bunn, Julian; Capiluppi, P; Corvo, M; Darmenov, N; De Filippis, N; Donno, F; Donvito, G; Eulisse, G; Fanfani, A; Fanzago, F; Filine, A; Grandi, C; Hernández, J M; Innocente, V; Jan, A; Lacaprara, S; Legrand, I; Metson, S; Newbold, D; Newman, H; Pierro, A; Silvestris, L; Steenberg, C; Stockinger, H; Taylor, Lucas; Thomas, M; Tuura, L; Van Lingen, F; Wildish, Tony

    2005-01-01

    The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at 25 Hz input rate; to distribute the data to several regional centers; and enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure ...

  11. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
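
    The sensitivities and false-positive rates above are simple ratios. The short check below re-derives a few of them from the counts given in the abstract; the absolute count of 27 false-positive marks is inferred from the 0.71-per-patient figure and is therefore an assumption.

        # Arithmetic behind the case-based and lesion-based sensitivity figures.
        detected_cases, total_cases = 24, 38
        detected_masses, total_masses = 16, 27
        fp_marks, patients = 27, 38               # 27 marks inferred from 0.71 marks/patient

        print(f"case-based sensitivity:  {detected_cases / total_cases:.1%}")    # 63.2%
        print(f"mass lesion sensitivity: {detected_masses / total_masses:.1%}")  # 59.3%
        print(f"false positives/patient: {fp_marks / patients:.2f}")             # ~0.71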

  12. [Effect of previous experience in reacting to a danger signal on "open field" behavior in the rat].

    Science.gov (United States)

    Poltyreva, T E; Petrov, E S

    1983-01-01

    Modification of rat behaviour in an "open field" test was investigated, induced by an acoustic stimulus previously subjected to conditioning in a shuttle chamber, in experiments where avoidance of the electric shock was either possible or impossible. It was established that presentation of a stimulus having the meaning of a danger signal in a new situation significantly suppresses the investigative behaviour of rats, whereas a stimulus which had not been subjected to conditioning exerts no marked effect on behaviour. The greatest suppression was observed in rats with "learned helplessness". This fact suggests that the degree of suppression of behaviour in the open field in response to a danger signal depends on the animal's previous experience in reacting to this signal.

  13. Amorphous nanoparticles — Experiments and computer simulations

    International Nuclear Information System (INIS)

    Hoang, Vo Van; Ganguli, Dibyendu

    2012-01-01

    The data obtained over recent decades by both experiments and computer simulations on amorphous nanoparticles, including methods of synthesis, characterization, structural properties, the atomic mechanism of glass formation in nanoparticles, crystallization of the amorphous nanoparticles, physico-chemical properties (i.e. catalytic, optical, thermodynamic, magnetic, bioactivity and other properties) and various applications in science and technology, are reviewed. Amorphous nanoparticles coated with different surfactants are also reviewed as an extension in this direction. Much attention is paid to the pressure-induced polyamorphism of the amorphous nanoparticles or amorphization of the nanocrystalline counterparts. We also introduce here nanocomposites and nanofluids containing amorphous nanoparticles. Overall, amorphous nanoparticles exhibit a disordered structure different from that of the corresponding bulks or from that of the nanocrystalline counterparts. Therefore, amorphous nanoparticles can have unique physico-chemical properties that differ from those of the crystalline counterparts, leading to their potential applications in science and technology.

  14. Computer controls for the WITCH experiment

    CERN Document Server

    Tandecki, M; Van Gorp, S; Friedag, P; De Leebeeck, V; Beck, D; Brand, H; Weinheimer, C; Breitenfeldt, M; Traykov, E; Mader, J; Roccia, S; Severijns, N; Herlert, A; Wauters, F; Zakoucky, D; Kozlov, V; Soti, G

    2011-01-01

    The WITCH experiment is a medium-scale experimental set-up located at ISOLDE/CERN. It combines a double Penning trap system with a retardation spectrometer for energy measurements of recoil ions from beta decay. For the correct operation of such a set-up a whole range of different devices is required. Along with the installation and optimization of the set-up a computer control system was developed to control these devices. The CS-Framework that is developed and maintained at GSI was chosen as a basis for this control system, as it is perfectly suited to handle the distributed nature of a control system. We report here on the required hardware for WITCH, along with the basis of this CS-Framework and the add-ons that were implemented for WITCH. (C) 2010 Elsevier B.V. All rights reserved.

  15. Relationship between premature loss of primary teeth with oral hygiene, consumption of soft drinks, dental care, and previous caries experience.

    Science.gov (United States)

    López-Gómez, Sandra Aremy; Villalobos-Rodelo, Juan José; Ávila-Burgos, Leticia; Casanova-Rosado, Juan Fernando; Vallejos-Sánchez, Ana Alicia; Lucas-Rincón, Salvador Eduardo; Patiño-Marín, Nuria; Medina-Solís, Carlo Eduardo

    2016-02-26

    We determine the relationship between premature loss of primary teeth and oral hygiene, consumption of soft drinks, dental care, and previous caries experience. This study focused on 833 Mexican schoolchildren aged 6-7. We performed an oral examination to determine caries experience and the simplified oral hygiene index. The dependent variable was the prevalence of at least one missing tooth (or tooth indicated for extraction) in the primary dentition; this variable was coded as 0 = no loss of teeth and 1 = at least one lost primary tooth. The prevalence of at least one missing tooth was 24.7% (n = 206) (95% CI = 21.8-27.7). The variables associated with the prevalence of tooth loss (p < 0.05) were oral hygiene (OR = 3.24), a lower frequency of brushing (OR = 1.60), an increased consumption of soda (OR = 1.89) and use of dental care (curative: OR = 2.83, preventive: OR = 1.93). This study suggests that the premature loss of teeth in the primary dentition is associated with oral hygiene, consumption of soft drinks, dental care and previous caries experience in Mexican schoolchildren. These data provide relevant information for the design of preventive dentistry programs.

  16. Intention to breastfeed in low-income pregnant women: the role of social support and previous experience.

    Science.gov (United States)

    Humphreys, A S; Thompson, N J; Miner, K R

    1998-09-01

    The purpose of this study was to describe the relationship between breastfeeding intention among socioeconomically disadvantaged pregnant women and maternal demographics, previous breastfeeding experience, and social support. A cross-sectional, convenience sampling strategy was employed for data collection. Low-income women (n = 1001) in a public hospital completed a six-page questionnaire about their infant feeding plans, demographics, and social support. Simple regression analyses were conducted to compare maternal breastfeeding intention with the hypothesized correlates. Breastfeeding intention was positively correlated with older maternal age, higher education, more breastfeeding experience, Hispanic ethnicity, and hearing about breastfeeding benefits from family members, the baby's father, and lactation consultants, but not from other health professionals. Health professionals' attitudes were less influential on women's infant feeding decisions than the attitudes and beliefs of members of women's social support networks. When controlling for breastfeeding experience (none vs any), some findings varied, indicating a need for breastfeeding interventions tailored to women's level of experience. Use of peer counselors and lactation consultants, inclusion of a woman's family members in breastfeeding educational contacts, and creation of breastfeeding classes tailored to influential members of women's social support networks may improve breastfeeding rates among low-income women, especially those with no breastfeeding experience, more effectively than breastfeeding education to pregnant women that is solely conducted by health professionals.

  17. Does Previous Experience of Floods Stimulate the Adoption of Coping Strategies? Evidence from Cross Sectional Surveys in Nigeria and Tanzania

    Directory of Open Access Journals (Sweden)

    Sheila A. Boamah

    2015-11-01

    In sub-Saharan Africa, hydro-meteorological related disasters, such as floods, account for the majority of the total number of natural disasters. Over the past century, floods have affected 38 million people, claimed several lives and caused substantial economic losses in the region. The goal of this paper is to examine how personality disposition, social network, and socio-demographic factors mitigate the complex relationship between stressful life experiences of floods and ocean surges and the adoption of coping strategies among coastal communities in Nigeria and Tanzania. Generalized linear models (GLM) were fitted to cross-sectional survey data on 1003 and 1253 individuals in three contiguous coastal areas in Nigeria and Tanzania, respectively. Marked differences in the type of coping strategies were observed across the two countries. In Tanzania, the zero-order relationships between adoption of coping strategies and age, employment and income disappeared at the multivariate level. Only experience of floods in the past year and social network resources were significant predictors of participants’ adoption of coping strategies, unlike in Nigeria, where a plethora of factors such as experience of ocean surges in the past one year, personality disposition, age, education, experience of flood in the past one year, ethnicity, income, housing quality and employment status were still statistically significant at the multivariate level. Our findings suggest that the influence of previous experience on adoption of coping strategies is spatially ubiquitous. Consequently, context-specific policies aimed at encouraging the adoption of flood-related coping strategies in vulnerable locations should be designed based on local needs and orientation.

  18. What Is the Correct Answer about The Dress' Colors? Investigating the Relation between Optimism, Previous Experience, and Answerability.

    Science.gov (United States)

    Karlsson, Bodil S A; Allwood, Carl Martin

    2016-01-01

    The Dress photograph, first displayed on the internet in 2015, revealed stunning individual differences in color perception. The aim of this study was to investigate if lay-persons believed that the question about The Dress colors was answerable. Past research has found that optimism is related to judgments of how answerable knowledge questions with controversial answers are (Karlsson et al., 2016). Furthermore, familiarity with a question can create a feeling of knowing the answer (Reder and Ritter, 1992). Building on these findings, 186 participants saw the photo of The Dress and were asked about the correct answer to the question about The Dress' colors ("blue and black," "white and gold," "other, namely…," or "there is no correct answer"). Choice of the alternative "there is no correct answer" was interpreted as believing the question was not answerable. This answer was chosen more often by optimists and by people who reported they had not seen The Dress before. We also found that among participants who had seen The Dress photo before, 19% perceived The Dress as "white and gold" but believed that the correct answer was "blue and black." This, in analogy to previous findings about non-believed memories (Scoboria and Pascal, 2016), shows that people sometimes do not believe the colors they have perceived are correct. Our results suggest that individual differences related to optimism and previous experience may contribute to whether the judgment of the individual perception of a photograph is enough to serve as a decision basis for valid conclusions about colors. Further research about color judgments under ambiguous circumstances could benefit from separating individual perceptual experience from beliefs about the correct answer to the color question. Including the option "there is no correct answer" may also be beneficial.

  19. The Impact of Previous Action on Bargaining—An Experiment on the Emergence of Preferences for Fairness Norms

    Directory of Open Access Journals (Sweden)

    Thomas Neumann

    2017-08-01

    Communication between participants seeking to identify an acceptable bargaining outcome in the Nash bargaining game is all about fairness norms. Participants introduce fairness norms which yield a better outcome for themselves in order to convince the other participant of their bargaining proposal. Typically, these fairness norms are in line with theoretical predictions, which support a wide variety of different but fair outcomes that the participants can choose from. In this experiment, we play two treatments of the Nash bargaining game: in one treatment, the participants play a dictator game prior to bargaining, and in the other treatment they do not. We find that participants who have not played the dictator game intensively discuss the outcome of the game and come to solutions closer to the equal split of the pie the longer they chat. This effect vanishes as soon as the participants have previous experience from a dictator game: instead of chatting, they establish the fairness norm introduced in the dictator game. Remarkably, if the dictator is unfair in the dictator game, he also gets a higher share of the pie in the Nash bargaining game.

  20. Computation for LHC experiments: a worldwide computing grid

    International Nuclear Information System (INIS)

    Fairouz, Malek

    2010-01-01

    In normal operating conditions the LHC detectors are expected to record about 10¹⁰ collisions each year. The processing of all the consequent experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10⁹ octets per second and a recording capacity of a few tens of 10¹⁵ octets each year. In order to meet this challenge a computing network implying the dispatch and sharing of tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of 4 tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and dispatching them to the 11 Tier-1 centres. A Tier-1 centre is typically a national centre; it is responsible for making a copy of the raw data and for processing them in order to recover relevant data with a physical meaning and to transfer the results to the 150 Tier-2 centres. A Tier-2 centre is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of the simulations. Tier-3 centres, at the level of the laboratories, provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)
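
    A quick order-of-magnitude check of the figures quoted: a sustained flow of a few gigabytes per second over a year of effective running indeed lands in the tens-of-petabytes range. The duty cycle used below is an assumption for illustration only.

        # Back-of-the-envelope yearly data volume from a sustained rate.
        bytes_per_second = 2e9              # "a few 10^9 octets per second"
        seconds_of_running = 1e7            # ~4 months of effective beam time (assumed)

        petabytes = bytes_per_second * seconds_of_running / 1e15
        print(f"~{petabytes:.0f} PB recorded per year")   # ~20 PB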

  1. Influence of Previous Crop on Durum Wheat Yield and Yield Stability in a Long-term Experiment

    Directory of Open Access Journals (Sweden)

    Anna Maria Stellacci

    2011-02-01

    Long-term experiments are leading indicators of sustainability and serve as an early warning system to detect problems that may compromise future productivity. The stability of yield is therefore an important parameter to be considered when judging the value of a cropping system relative to others. In a long-term rotation experiment set up in 1972, the influence of different crop sequences on the yields and on the yield stability of durum wheat (Triticum durum Desf.) was studied. The complete field experiment is a split-split plot in a randomized complete block design with two replications; the whole experiment considers three crop sequences: 1) three-year crop rotation: sugar-beet, wheat + catch crop, wheat; 2) one-year crop rotation: wheat + catch crop; 3) continuous wheat crop. The split treatments are two different crop residue managements; the split-split plot treatments are 18 different fertilization formulas. Each phase of every crop rotation occurred every year. In this paper only one crop residue management and only one fertilization treatment have been analysed. Wheat crops in different rotations are coded as follows: F1: wheat after sugar-beet in the three-year crop rotation; F2: wheat after wheat in the three-year crop rotation; Fc+i: wheat in the wheat + catch crop rotation; Fc: continuous wheat. The following two variables were analysed: grain yield and hectolitre weight. Repeated measures analyses of variance and stability analyses have been performed for the two variables. The stability analysis was conducted using three variance methods, namely the coefficient of variability of Francis and Kannenberg, the ecovalence index of Wricke and the stability variance index of Shukla; the regression method of Eberhart and Russell; and a method, proposed by Piepho, that computes the probability of one system outperforming another system. Each of the stability methods used enriched the information provided by the simple analysis of variance. The Piepho
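
    Two of the stability measures named, the Francis and Kannenberg coefficient of variability and the Eberhart and Russell regression coefficient, can be sketched on invented yields; the numbers below are purely illustrative and are not the long-term experiment's data.

        # Toy stability analysis for three cropping systems over six years.
        import numpy as np

        yields = {                                  # rows: systems, cols: years (t/ha, invented)
            "F1 (after sugar-beet)":   [4.8, 5.4, 3.9, 5.9, 4.6, 5.1],
            "Fc+i (wheat+catch crop)": [4.1, 4.9, 3.2, 5.2, 4.0, 4.5],
            "Fc (continuous wheat)":   [3.5, 4.6, 2.6, 5.0, 3.4, 4.1],
        }
        data = np.array(list(yields.values()))
        env_index = data.mean(axis=0)               # environmental index = year mean over systems

        for name, y in zip(yields, data):
            cv = y.std(ddof=1) / y.mean() * 100                 # Francis & Kannenberg CV%
            slope = np.polyfit(env_index, y, 1)[0]              # Eberhart & Russell b_i
            print(f"{name:26s} CV = {cv:4.1f}%  b = {slope:.2f}")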

  2. A Questionnaire Study on the Attitudes and Previous Experience of Croatian Family Physicians toward their Preparedness for Disaster Management.

    Science.gov (United States)

    Pekez-Pavliško, Tanja; Račić, Maja; Jurišić, Dinka

    2018-04-01

    To explore family physicians' attitudes, previous experience and self-assessed preparedness to respond or to assist in mass casualty incidents in Croatia. The cross-sectional survey was carried out during January 2017. Study participants were recruited through a Facebook group that brings together family physicians from Croatia. They were asked to complete the questionnaire, which was distributed via google.docs. Knowledge and attitudes toward disaster preparedness were evaluated by 18 questions. Analysis of variance, Student's t test and the Kruskal-Wallis test were used for statistical analysis. Risk awareness of disasters was high among respondents (M = 4.89, SD = 0.450). Only 16.4% of respondents had participated in the management of a disaster at the scene. The majority (73.8%) of physicians had not participated in any educational activity dealing with disasters over the past two years. Family physicians believed they were not well prepared to participate in the national (M = 3.02, SD = 0.856) or the local community (M = 3.16, SD = 1.119) emergency response system for disaster. Male physicians reported higher preparedness to participate in the national emergency response system for disaster (p = 0.012), to carry out accepted triage principles used in a disaster situation (p = 0.003) and to recognize differences in health assessments indicating potential exposure to specific agents (p = 0.001) compared to their female colleagues. The Croatian primary healthcare system attracts many young physicians, who can be an important part of disaster and emergency management. However, the lack of experience despite high motivation indicates a need for the inclusion of disaster medicine training during undergraduate studies and in annual educational activities.

  3. Using Computer Games for Instruction: The Student Experience

    Science.gov (United States)

    Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David; Tomes, Russell

    2011-01-01

    Computer games are fun, exciting and motivational when used as leisure pursuits. But do they have similar attributes when utilized for educational purposes? This article investigates whether learning by computer game can improve student experiences compared with a more formal lecture approach and whether computer games have potential for improving…

  4. One Head Start Classroom's Experience: Computers and Young Children's Development.

    Science.gov (United States)

    Fischer, Melissa Anne; Gillespie, Catherine Wilson

    2003-01-01

    Contends that early childhood educators need to understand how exposure to computers and constructive computer programs affects the development of children. Specifically examines: (1) research on children's technology experiences; (2) determining best practices; and (3) addressing educators' concerns about computers replacing other developmentally…

  5. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  6. The Affective Experience of Novice Computer Programmers

    Science.gov (United States)

    Bosch, Nigel; D'Mello, Sidney

    2017-01-01

    Novice students (N = 99) participated in a lab study in which they learned the fundamentals of computer programming in Python using a self-paced computerized learning environment involving a 25-min scaffolded learning phase and a 10-min unscaffolded fadeout phase. Students provided affect judgments at approximately 100 points (every 15 s) over the…

  7. Electromagnetic Induction: A Computer-Assisted Experiment

    Science.gov (United States)

    Fredrickson, J. E.; Moreland, L.

    1972-01-01

    By using minimal equipment it is possible to demonstrate Faraday's Law. An electronic desk calculator enables sophomore students to solve a difficult mathematical expression for the induced EMF. Polaroid pictures of the plot of induced EMF, together with the computer facility, enable students to make comparisons. (PS)
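
    The "difficult mathematical expression" solved by the students is not reproduced in the abstract; the relation being demonstrated is Faraday's law of induction, which for a coil of N turns reads (standard physics, not a formula quoted from the article)

    ```latex
    \mathcal{E} = -N\,\frac{d\Phi_B}{dt},
    ```

    where Φ_B is the magnetic flux through a single turn and the minus sign expresses Lenz's law.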

  8. Computing in support of experiments at LAMPF

    International Nuclear Information System (INIS)

    Thomas, R.F.; Amann, J.F.; Butler, H.S.

    1976-10-01

    This report documents the discussions and conclusions of a study, conducted in August 1976, of the requirements for computer support of the experimental program in medium-energy physics at the Clinton P. Anderson Meson Physics Facility. 1 figure, 1 table

  9. Experiment Dashboard for Monitoring of the LHC Distributed Computing Systems

    International Nuclear Information System (INIS)

    Andreeva, J; Campos, M Devesas; Cros, J Tarragon; Gaidioz, B; Karavakis, E; Kokoszkiewicz, L; Lanciotti, E; Maier, G; Ollivier, W; Nowotka, M; Rocha, R; Sadykov, T; Saiz, P; Sargsyan, L; Sidorova, I; Tuckett, D

    2011-01-01

    LHC experiments are currently taking collision data. The distributed computing model chosen by the four main LHC experiments allows physicists to benefit from resources spread all over the world. The distributed model and the scale of LHC computing activities increase the level of complexity of the middleware, and also the chances of possible failures or inefficiencies in the components involved. In order to ensure the required performance and functionality of the LHC computing system, monitoring the status of the distributed sites and services, as well as monitoring LHC computing activities, are among the key factors. Over the last years, the Experiment Dashboard team has been working on a number of applications that facilitate the monitoring of different activities, including the follow-up of jobs and transfers as well as site and service availability. This presentation describes the Experiment Dashboard applications used by the LHC experiments and the experience gained during the first months of data taking.

  10. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    This presentation described the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  11. Using sobol sequences for planning computer experiments

    Science.gov (United States)

    Statnikov, I. N.; Firsov, G. I.

    2017-12-01

    Discusses the use of the Planning LP-search (PLP-search) method for problems of the multicriteria synthesis of dynamic systems. On the basis of simulation-model experiments, the method not only allows the parameter space to be surveyed within the specified ranges of variation but, thanks to the specially randomized planning of these experiments, also permits a quantitative statistical evaluation of the influence of the varied parameters and of their pairwise combinations when analyzing the properties of the dynamic system.
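
    PLP-search itself is not, to my knowledge, available as a public software package. Purely as an illustration of planning computer experiments with low-discrepancy points, the sketch below draws a scrambled Sobol design over a user-defined parameter box using SciPy's quasi-Monte Carlo module; the parameter names and ranges are invented for the example.

    ```python
    # Illustrative sketch: a space-filling design for simulation experiments
    # using a scrambled Sobol sequence (requires SciPy >= 1.7).
    from scipy.stats import qmc

    # Hypothetical ranges for two varied parameters of a dynamic system.
    lower = [0.1, 10.0]    # e.g. damping coefficient, stiffness
    upper = [1.0, 100.0]

    sampler = qmc.Sobol(d=2, scramble=True, seed=42)
    unit_points = sampler.random_base2(m=7)        # 2**7 = 128 points in [0, 1)^2
    design = qmc.scale(unit_points, lower, upper)  # map onto the parameter box

    print("discrepancy of the unit design:", qmc.discrepancy(unit_points))
    print("first design points:", design[:3])
    ```

    Each row of `design` is one parameter combination at which the simulation model would be run; the statistical evaluation of parameter influences described in the abstract would then be carried out on the collected responses.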

  12. Model and Computing Experiment for Research and Aerosols Usage Management

    Directory of Open Access Journals (Sweden)

    Daler K. Sharipov

    2012-09-01

    The article deals with a mathematical model for the study and management of aerosols released into the atmosphere, as well as with the numerical algorithm, implemented as a hardware and software system, used for conducting the computing experiment.

  13. Computing for Lattice QCD: new developments from the APE experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R [INFN, Sezione di Roma Tor Vergata, Roma (Italy); Biagioni, A; De Luca, S [INFN, Sezione di Roma, Roma (Italy)

    2008-06-15

    As Lattice QCD develops improved techniques to shed light on new physics, it demands increasing computing power. The aim of the current APE (Array Processor Experiment) project is to provide the reference computing platform to the Lattice QCD community for the period 2009-2011. We present the project proposal for a petaflops-range supercomputing centre with high performance and low maintenance costs, to be delivered starting from 2010.

  14. Computing for Lattice QCD: new developments from the APE experiment

    International Nuclear Information System (INIS)

    Ammendola, R.; Biagioni, A.; De Luca, S.

    2008-01-01

    As Lattice QCD develops improved techniques to shed light on new physics, it demands increasing computing power. The aim of the current APE (Array Processor Experiment) project is to provide the reference computing platform to the Lattice QCD community for the period 2009-2011. We present the project proposal for a petaflops-range supercomputing centre with high performance and low maintenance costs, to be delivered starting from 2010.

  15. A Computational Experiment on Single-Walled Carbon Nanotubes

    Science.gov (United States)

    Simpson, Scott; Lonie, David C.; Chen, Jiechen; Zurek, Eva

    2013-01-01

    A computational experiment that investigates single-walled carbon nanotubes (SWNTs) has been developed and employed in an upper-level undergraduate physical chemistry laboratory course. Computations were carried out to determine the electronic structure, radial breathing modes, and the influence of the nanotube's diameter on the…

  16. Previous Gardening Experience and Gardening Enjoyment Is Related to Vegetable Preferences and Consumption Among Low-Income Elementary School Children.

    Science.gov (United States)

    Evans, Alexandra; Ranjit, Nalini; Fair, Cori N; Jennings, Rose; Warren, Judith L

    2016-10-01

    To examine whether gardening experience and enjoyment are associated with vegetable exposure, preferences, and consumption among low-income third-grade children. Cross-sectional study design, using baseline data from the Texas! Grow! Eat! Go! study. Twenty-eight Title I elementary schools located in different counties in Texas. Third-grade students (n = 1,326, 42% Hispanic). Main outcome measures: gardening experience, gardening enjoyment, vegetable exposure, preference, and consumption. Random-effects regression models, adjusted for age, sex, ethnicity, and body mass index percentile of the child, estimated means and standard errors of vegetable consumption, exposure, and preference by levels of gardening experience and enjoyment. Wald χ² tests evaluated the significance of differences in means of outcomes across levels of gardening experience and enjoyment. Children with more gardening experience had greater vegetable exposure and higher vegetable preference and consumed more vegetables compared with children who reported less gardening experience. Those who reported that they enjoyed gardening had the highest levels of vegetable exposure, preference, and consumption. Garden-based interventions can have an important and positive effect on children's vegetable consumption by increasing exposure to fun gardening experiences. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  17. Homomorphic encryption experiments on IBM's cloud quantum computing platform

    Science.gov (United States)

    Huang, He-Liang; Zhao, You-Wei; Li, Tan; Li, Feng-Guang; Du, Yu-Tao; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-02-01

    Quantum computing has undergone rapid development in recent years. Owing to limitations on scalability, personal quantum computers still seem slightly unrealistic in the near future. The first practical quantum computer for ordinary users is likely to be on the cloud. However, the adoption of cloud computing is possible only if security is ensured. Homomorphic encryption is a cryptographic protocol that allows computation to be performed on encrypted data without decrypting them, so it is well suited to cloud computing. Here, we first applied homomorphic encryption on IBM's cloud quantum computer platform. In our experiments, we successfully implemented a quantum algorithm for linear equations while protecting our privacy. This demonstration opens a feasible path to the next stage of development of cloud quantum information technology.

  18. The Information Science Experiment System - The computer for science experiments in space

    Science.gov (United States)

    Foudriat, Edwin C.; Husson, Charles

    1989-01-01

    The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.

  19. An experiment for determining the Euler load by direct computation

    Science.gov (United States)

    Thurston, Gaylen A.; Stein, Peter A.

    1986-01-01

    A direct algorithm is presented for computing the Euler load of a column from experimental data. The method is based on exact inextensional theory for imperfect columns, which predicts two distinct deflected shapes at loads near the Euler load. The bending stiffness of the column appears in the expression for the Euler load along with the column length; therefore, the experimental data allow a direct computation of the bending stiffness. Experiments on graphite-epoxy columns of rectangular cross-section are reported in the paper. The bending stiffness of each composite column computed from experiment is compared with predictions from laminated plate theory.
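
    For a pinned-pinned (simply supported) column, the relation alluded to above is the classical Euler formula (a standard result, not an equation quoted from the paper), so a measured Euler load immediately yields the bending stiffness:

    ```latex
    P_E = \frac{\pi^{2} EI}{L^{2}} \quad\Longrightarrow\quad EI = \frac{P_E\,L^{2}}{\pi^{2}}.
    ```

    Other end conditions replace L by an effective length; the exact inextensional theory used in the paper refines this picture for imperfect columns.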

  20. Methodological Potential of Computer Experiment in Teaching Mathematics at University

    Science.gov (United States)

    Lin, Kequan; Sokolova, Anna Nikolaevna; Vlasova, Vera K.

    2017-01-01

    The study is relevant due to the opportunity of increasing the efficiency of teaching mathematics at university by integrating into this process computer experiments conducted by students with the use of IT. The problem of the research is defined by a contradiction between the great potential of the mathematics experiment for motivating and…

  1. Remote Viewing and Computer Communications--An Experiment.

    Science.gov (United States)

    Vallee, Jacques

    1988-01-01

    A series of remote viewing experiments were run with 12 participants who communicated through a computer conferencing network. The correct target sample was identified in 8 out of 33 cases. This represented more than double the pure chance expectation. Appendices present protocol, instructions, and results of the experiments. (Author/YP)

  2. Computer simulation of Wheeler's delayed-choice experiment with photons

    NARCIS (Netherlands)

    Zhao, S.; Yuan, S.; De Raedt, H.; Michielsen, K.

    We present a computer simulation model of Wheeler's delayed-choice experiment that is a one-to-one copy of an experiment reported recently (Jacques V. et al., Science, 315 (2007) 966). The model is solely based on experimental facts, satisfies Einstein's criterion of local causality and does not

  3. Brookhaven Reactor Experiment Control Facility, a distributed function computer network

    International Nuclear Information System (INIS)

    Dimmler, D.G.; Greenlaw, N.; Kelley, M.A.; Potter, D.W.; Rankowitz, S.; Stubblefield, F.W.

    1975-11-01

    A computer network for real-time data acquisition, monitoring and control of a series of experiments at the Brookhaven High Flux Beam Reactor has been developed and has been set into routine operation. This reactor experiment control facility presently services nine neutron spectrometers and one x-ray diffractometer. Several additional experiment connections are in progress. The architecture of the facility is based on a distributed function network concept. A statement of implementation and results is presented

  4. Ontological and Epistemological Issues Regarding Climate Models and Computer Experiments

    Science.gov (United States)

    Vezer, M. A.

    2010-12-01

    Recent philosophical discussions (Parker 2009; Frigg and Reiss 2009; Winsberg 2009; Morgan 2002, 2003, 2005; Guala 2002) about the ontology of computer simulation experiments and the epistemology of inferences drawn from them are of particular relevance to climate science, as computer modeling and analysis are instrumental in understanding climatic systems. How do computer simulation experiments compare with traditional experiments? Is there an ontological difference between these two methods of inquiry? Are there epistemological considerations that result in one type of inference being more reliable than the other? What are the implications of these questions with respect to climate studies that rely on computer simulation analysis? In this paper, I examine these philosophical questions within the context of climate science, instantiating concerns in the philosophical literature with examples found in the analysis of global climate change. I concentrate on Wendy Parker’s (2009) account of computer simulation studies, which offers a treatment of these and other questions relevant to investigations of climate change involving such modelling. Two theses at the center of Parker’s account will be the focus of this paper. The first is that computer simulation experiments ought to be regarded as straightforward material experiments; which is to say, there is no significant ontological difference between computer and traditional experimentation. Parker’s second thesis is that some of the emphasis on the epistemological importance of materiality has been misplaced. I examine both of these claims. First, I inquire as to whether viewing computer and traditional experiments as ontologically similar in the way she does implies that there is no proper distinction between abstract experiments (such as ‘thought experiments’ as well as computer experiments) and traditional ‘concrete’ ones. Second, I examine the notion of materiality (i.e., the material commonality between

  5. Locative media and data-driven computing experiments

    Directory of Open Access Journals (Sweden)

    Sung-Yueh Perng

    2016-06-01

    Over the past two decades urban social life has undergone a rapid and pervasive geocoding, becoming mediated, augmented and anticipated by location-sensitive technologies and services that generate and utilise big, personal, locative data. The production of these data has prompted the development of exploratory data-driven computing experiments that seek to find ways to extract value and insight from them. These projects often start from the data, rather than from a question or theory, and try to imagine and identify their potential utility. In this paper, we explore the desires and mechanics of data-driven computing experiments. We demonstrate how both locative media data and computing experiments are ‘staged’ to create new values and computing techniques, which in turn are used to try and derive possible futures that are ridden with unintended consequences. We argue that using computing experiments to imagine potential urban futures produces effects that often have little to do with creating new urban practices. Instead, these experiments promote Big Data science and the prospect that data produced for one purpose can be recast for another and act as alternative mechanisms of envisioning urban futures.

  6. Computer-Aided Experiment Planning toward Causal Discovery in Neuroscience.

    Science.gov (United States)

    Matiasz, Nicholas J; Wood, Justin; Wang, Wei; Silva, Alcino J; Hsu, William

    2017-01-01

    Computers help neuroscientists to analyze experimental results by automating the application of statistics; however, computer-aided experiment planning is far less common, due to a lack of similar quantitative formalisms for systematically assessing evidence and uncertainty. While ontologies and other Semantic Web resources help neuroscientists to assimilate required domain knowledge, experiment planning requires not only ontological but also epistemological (e.g., methodological) information regarding how knowledge was obtained. Here, we outline how epistemological principles and graphical representations of causality can be used to formalize experiment planning toward causal discovery. We outline two complementary approaches to experiment planning: one that quantifies evidence per the principles of convergence and consistency, and another that quantifies uncertainty using logical representations of constraints on causal structure. These approaches operationalize experiment planning as the search for an experiment that either maximizes evidence or minimizes uncertainty. Despite work in laboratory automation, humans must still plan experiments and will likely continue to do so for some time. There is thus a great need for experiment-planning frameworks that are not only amenable to machine computation but also useful as aids in human reasoning.
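
    As a toy illustration of the second approach described above (minimizing uncertainty over candidate causal structures), the sketch below greedily picks the experiment whose worst-case outcome leaves the fewest structures consistent with the data. The structures, experiments and the consistency predicate are hypothetical stand-ins, not part of the cited framework.

    ```python
    # Toy sketch: choose the experiment that minimises the worst-case number of
    # causal structures remaining consistent with its possible outcomes.
    from itertools import product

    # Hypothetical candidate structures: does A->B exist, does B->C exist?
    structures = [{"A->B": a, "B->C": b} for a, b in product([True, False], repeat=2)]

    # Hypothetical experiments: each probes one edge and reports whether an
    # effect is observed.
    experiments = ["A->B", "B->C"]

    def consistent(structure, experiment, outcome):
        # A structure is consistent with an outcome when the observed effect
        # matches the presence of the probed edge.
        return structure[experiment] == outcome

    def worst_case_remaining(experiment, candidates):
        return max(
            sum(consistent(s, experiment, outcome) for s in candidates)
            for outcome in (True, False)
        )

    best = min(experiments, key=lambda e: worst_case_remaining(e, structures))
    print("next experiment to run:", best)
    ```

    A real planner would replace the toy consistency predicate with constraints derived from the experimental ontology and prior findings, but the search pattern (score every admissible experiment, pick the most informative one) is the same.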

  7. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    Science.gov (United States)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    A quantitative cost for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on the utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The ground rules and assumptions are analysed, and the options, along with their cost considerations, are discussed. It is concluded that the Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  8. Cogema experience on retrieving and conditioning solid radwaste previously stored in pits. The La Hague north west pit case

    International Nuclear Information System (INIS)

    Bodin, F.; Alexandre, D.; Fournier, Ph.

    2000-01-01

    Short-lived, low and medium level waste called 'technological waste' produced by the La Hague Reprocessing Plant had been stored in the La Hague North-West concrete-lined pits until the implementation of ANDRA's Centre de Stockage de la Manche (CSM). COGEMA decided to retrieve and condition 11,000 m³ of humid solid radwaste stored in bulk in pits. This report describes the experience gained from February 1990 to December 1998 in conditioning such waste, taking into account the radwaste and integrated dose rate results. The procedures and means used and improved by COGEMA to comply with ANDRA's storage standards, together with the ever-decreasing financial costs generated by the workers, made it possible to retrieve and condition 11,000 m³ of old solid radwaste at competitive cost, in complete safety and with protection of the environment. (authors)

  9. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  10. PCNL - a comparative study in nonoperated and in previously operated (open nephrolithotomy/pyelolithotomy patients - a single-surgeon experience

    Directory of Open Access Journals (Sweden)

    Rahul Gupta

    2011-12-01

    PURPOSE: A repeat procedure in patients with a history of open stone surgery is usually challenging due to the alteration of the retroperitoneal anatomy. The aim of this study was to determine the possible impact of open renal surgery on the efficacy and morbidity of subsequent percutaneous nephrolithotomy (PCNL). MATERIALS AND METHODS: From March 2009 until September 2010, 120 patients underwent PCNL. Of these, 20 patients were excluded (tubeless or bilateral simultaneous PCNL). Of the remaining 100, 55 primary patients were categorized as Group 1 and the remaining 45 (previous open nephrolithotomy) as Group 2. Standard preoperative evaluation was carried out prior to intervention. Statistical analysis was performed using SPSS v. 11 with the chi-square test, independent samples t-test, and Mann-Whitney U test. A p-value < 0.05 was taken as statistically significant. RESULTS: Both groups were similar in demographic profile and stone burden. The number of attempts needed to access the PCS was lower in Group 1 than in Group 2 (1.2 ± 1.2 vs 3 ± 1.3, respectively), and this difference was statistically significant (p < 0.04). However, the difference in mean operative time between the two groups was not statistically significant (p = 0.44). The blood transfusion rate was comparable in the two groups (p = 0.24). One patient in Group 2 developed hemothorax following a supra-11th puncture. The remaining complications were comparable in both groups. CONCLUSION: Patients with a past history of renal stone surgery may need more attempts to access the pelvicaliceal system and may have difficulty in tract dilation secondary to retroperitoneal scarring, but overall morbidity and efficacy are the same in both groups.

  11. COGEMA experience on retrieving and conditioning solid radwaste previously stored in pits. The La Hague North-West pit case

    International Nuclear Information System (INIS)

    Bodin, F.; Alexandre, D.; Fournier, P.

    1999-01-01

    Short-lived, low and medium level waste called 'technological waste' produced by the La Hague Reprocessing Plant had been stored in the La Hague North-West concrete-lined pits until the implementation of ANDRA's Centre de Stockage de la Manche (CSM). COGEMA decided to retrieve and condition 11,000 m³ of humid solid radwaste stored in bulk in pits. On account of the variety of radwaste kinds, the retrieving and conditioning operations represented a real challenge. One goal of these operations was to ensure that the work was performed in complete safety towards the environment, with optimum containment and with the best radiation protection for the personnel involved. COGEMA decided to split the work into two phases; the feedback from the first phase was very helpful to the second. This report describes the experience gained from February 1990 to December 1998 in conditioning such waste, taking into account the radwaste and integrated dose rate results. The procedures and means used and improved by COGEMA to comply with ANDRA's storage standards, together with the ever-decreasing financial costs generated by the workers, made it possible to retrieve and condition 11,000 m³ of old solid radwaste at competitive cost, in complete safety and with protection of the environment. (author)

  12. The effect of farrowing environment and previous experience on the maternal behaviour of sows in indoor pens and outdoor huts.

    Science.gov (United States)

    Wülbers-Mindermann, M; Berg, C; Illmann, G; Baulain, U; Algers, B

    2015-04-01

    Outdoor farrowing huts facilitate less restricted maternal behaviour in sows compared with sows kept indoors in farrowing pens. The aim of our study was to investigate whether there are behavioural differences between primiparous sows kept outdoors in farrowing huts and indoors in pens, and whether the maternal behaviour during the second parity, when all sows were kept outdoors in farrowing huts, would differ between sows that had experienced the indoor or the outdoor environment, respectively, during their first parturition. A total of 26 Yorkshire×Swedish Landrace sows were studied. Of these, 11 sows were housed outdoors in farrowing huts during both parturitions (group = OUTOUT). The other 15 sows were kept indoors in a barn with single farrowing pens during their first parturition. During their second parturition, these sows were kept outdoors in farrowing huts (group = INOUT). The behaviour was video recorded from 2 h prepartum to 48 h postpartum. The sows' responsiveness to playbacks of a piglet's screams was tested on days 2 to 3 postpartum. Parity 1: during the last 2 h prepartum, OUTOUT sows had a higher proportion of observations in the sternal lying position (P < 0.05). In the second parity there were no behavioural differences between INOUT and OUTOUT sows. In conclusion, it is not problematic for a second-parity sow with initial maternal experience from an indoor farrowing pen to be kept outdoors in farrowing huts during its following farrowing.

  13. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth of the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry over classical computations.

  14. The Design and Evaluation of Teaching Experiments in Computer Science.

    Science.gov (United States)

    Forcheri, Paola; Molfino, Maria Teresa

    1992-01-01

    Describes a relational model that was developed to provide a framework for the design and evaluation of teaching experiments for the introduction of computer science in secondary schools in Italy. Teacher training is discussed, instructional materials are considered, and use of the model for the evaluation process is described. (eight references)…

  15. Instructional Styles, Attitudes and Experiences of Seniors in Computer Workshops

    Science.gov (United States)

    Wood, Eileen; Lanuza, Catherine; Baciu, Iuliana; MacKenzie, Meagan; Nosko, Amanda

    2010-01-01

    Sixty-four seniors were introduced to computers through a series of five weekly workshops. Participants were given instruction followed by hands-on experience for topics related to social communication, information seeking, games, and word processing and were observed to determine their preferences for instructional support. Observations of…

  16. The Experiment Method for Manufacturing Grid Development on Single Computer

    Institute of Scientific and Technical Information of China (English)

    XIAO Youan; ZHOU Zude

    2006-01-01

    In this paper, an experiment method for Manufacturing Grid application system development in a single-personal-computer environment is proposed. The characteristic of the proposed method is that it constructs a full prototype Manufacturing Grid application system hosted on a single personal computer using virtual machine technology. Firstly, all the Manufacturing Grid physical resource nodes are built on an abstraction layer of a single personal computer with virtual machine technology. Secondly, all the virtual Manufacturing Grid resource nodes are connected with a virtual network and the application software is deployed on each Manufacturing Grid node. A prototype Manufacturing Grid application system running on a single personal computer is thereby obtained, and experiments can be carried out on this foundation. Compared with the known experiment methods for Manufacturing Grid application system development, the proposed method retains their advantages, such as low cost, simple operation, and the ability to obtain trustworthy experimental results easily. The Manufacturing Grid application system constructed with the proposed method has high scalability, stability and reliability. It can be migrated to the real application environment rapidly.

  17. Doctors' experience with handheld computers in clinical practice: qualitative study.

    Science.gov (United States)

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-05-15

    To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. 54 doctors who did or did not use handheld computers. Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care.

  18. Implementation of an electronic medical record system in previously computer-naïve primary care centres: a pilot study from Cyprus.

    Science.gov (United States)

    Samoutis, George; Soteriades, Elpidoforos S; Kounalakis, Dimitris K; Zachariadou, Theodora; Philalithis, Anastasios; Lionis, Christos

    2007-01-01

    The computer-based electronic medical record (EMR) is an essential new technology in health care, contributing to high-quality patient care and efficient patient management. The majority of southern European countries, however, have not yet implemented universal EMR systems, and many efforts are still ongoing. We describe the development of an EMR system and its pilot implementation and evaluation in two previously computer-naïve public primary care centres in Cyprus. One urban and one rural primary care centre, along with their personnel (physicians and nurses), were selected to participate. Both qualitative and quantitative evaluation tools were used during the implementation phase. Qualitative data analysis was based on the framework approach, whereas quantitative assessment was based on a nine-item questionnaire and EMR usage parameters. Two public primary care centres participated, and a total of ten health professionals served as EMR system evaluators. Physicians and nurses rated the EMR relatively highly, while patients were the most enthusiastic supporters of the new information system. Major implementation impediments were the physicians' perception that EMR usage negatively affected their workflow, physicians' legal concerns, lack of incentives, system breakdowns, software design problems, transition difficulties and lack of familiarity with electronic equipment. The importance of combining qualitative and quantitative evaluation tools is highlighted. More efforts are needed for the universal adoption and routine use of EMR in the primary care system of Cyprus, as several barriers to adoption exist; however, none is insurmountable. Computerised systems could improve the efficiency and quality of care in Cyprus, benefiting the entire population.

  19. IL4 gene polymorphism and previous malaria experiences manipulate anti-Plasmodium falciparum antibody isotype profiles in complicated and uncomplicated malaria

    Directory of Open Access Journals (Sweden)

    Kalambaheti Thareerat

    2009-12-01

    Background: The IL4-590 gene polymorphism has been shown to be associated with elevated levels of anti-Plasmodium falciparum IgG antibodies and parasite intensity in the malaria-protected Fulani of West Africa. This study aimed to investigate the possible impact of the IL4-590C/T polymorphism on anti-P. falciparum IgG subclass and IgE antibody levels and the alteration of malaria severity in complicated and uncomplicated malaria patients with or without previous malaria experiences. Methods: Anti-P. falciparum IgG subclasses and IgE antibodies in the plasma of complicated and uncomplicated malaria patients with or without previous malaria experiences were analysed using ELISA. IL4-590 polymorphisms were genotyped using RFLP-PCR. Statistical analyses of the IgG subclass levels were done by one-way ANOVA. Genotype differences were tested by the chi-squared test. Results: The IL4-590T allele was significantly associated with anti-P. falciparum IgG3 antibody levels in patients with complicated (P = 0.031), but not with uncomplicated malaria (P = 0.622). Complicated malaria patients with previous malaria experiences carrying the IL4-590TT genotype had significantly lower levels of anti-P. falciparum IgG3 (P = 0.0156), while uncomplicated malaria patients with previous malaria experiences carrying the same genotype had significantly higher levels (P = 0.0206) compared to their IL4-590 counterparts. Different anti-P. falciparum IgG1 and IgG3 levels among IL4 genotypes were observed. Complicated malaria patients with previous malaria experiences tended to have lower IgG3 levels in individuals carrying the TT compared to the CT genotype (P = 0.075). In contrast, complicated malaria patients without previous malaria experiences carrying the CC genotype had significantly higher anti-P. falciparum IgG1 than those carrying either CT or TT genotypes (P = 0.004 and P = 0.002, respectively). Conclusion: The results suggest that the IL4-590 C and T alleles participated differently in the

  20. Framework for emotional mobile computation for creating entertainment experience

    Science.gov (United States)

    Lugmayr, Artur R.

    2007-02-01

    Ambient media are media which manifest in the natural environment of the consumer. The perceivable borders between the media and the context where the media are used are becoming more and more blurred. The consumer moves through a digital space of services throughout his daily life. As we are developing towards an experience society, the central point in the development of services is the creation of a consumer experience. This paper reviews the possibilities and potential of creating entertainment experiences on mobile phone platforms. It reviews sensor networks capable of acquiring consumer behaviour data, interactivity strategies, and psychological models for emotional computation on mobile phones, and lays the foundations of a nomadic experience society. The paper rounds off with a presentation of several possible service scenarios in the field of entertainment and leisure computation on mobiles. The goal of this paper is to present a framework for, and an evaluation of, the possibilities of applying sensor technology on mobile platforms to create an increasing consumer entertainment experience.

  1. SAMGrid experiences with the Condor technology in Run II computing

    International Nuclear Information System (INIS)

    Baranovski, A.; Loebel-Carpenter, L.; Garzoglio, G.; Herber, R.; Illingworth, R.; Kennedy, R.; Kreymer, A.; Kumar, A.; Lueking, L.; Lyon, A.; Merritt, W.; Terekhov, I.; Trumbo, J.; Veseli, S.; White, S.; St. Denis, R.; Jain, S.; Nishandar, A.

    2004-01-01

    SAMGrid is a globally distributed system for data handling and job management, developed at Fermilab for the D0 and CDF experiments in Run II. The Condor system is being developed at the University of Wisconsin for the management of distributed resources, computational and otherwise. We briefly review the SAMGrid architecture and its interaction with Condor, which was presented earlier. We then present our experiences using the system in production, which have two distinct aspects. At the global level, we deployed Condor-G, the Grid-extended Condor, for the resource brokering and global scheduling of our jobs. At the heart of the system is Condor's Matchmaking Service. As more recent work at the computing element level, we have been benefiting from the large computing cluster at the University of Wisconsin campus. The architecture of the computing facility and the philosophy of Condor's resource management have prompted us to improve the application infrastructure for D0 and CDF, in aspects such as removing the reliance on a shared file system and on dedicated resources. As a result, we have increased productivity and made our applications more portable and Grid-ready. Our fruitful collaboration with the Condor team has been made possible by the Particle Physics Data Grid
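
    Condor's Matchmaking Service pairs job requirements with the resource attributes that machines advertise. The fragment below is only a schematic re-implementation of that idea; the attribute names are invented and this is not actual ClassAd syntax.

    ```python
    # Schematic matchmaking: assign each job to the first unclaimed machine
    # whose advertised attributes satisfy the job's requirements.
    jobs = [
        {"name": "d0-reco", "needs": {"os": "linux", "memory_mb": 2048}},
        {"name": "cdf-sim", "needs": {"os": "linux", "memory_mb": 8192}},
    ]
    machines = [
        {"name": "wisc-001", "os": "linux", "memory_mb": 4096, "claimed": False},
        {"name": "wisc-002", "os": "linux", "memory_mb": 16384, "claimed": False},
    ]

    def matches(job, machine):
        return (not machine["claimed"]
                and machine["os"] == job["needs"]["os"]
                and machine["memory_mb"] >= job["needs"]["memory_mb"])

    for job in jobs:
        machine = next((m for m in machines if matches(job, m)), None)
        if machine:
            machine["claimed"] = True
            print(f"{job['name']} -> {machine['name']}")
        else:
            print(f"{job['name']} stays queued (no matching resource)")
    ```

    The production system additionally handles priorities, ranking expressions and claim negotiation, none of which is modelled here.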

  2. Multilink manipulator computer control: experience in development and commissioning

    International Nuclear Information System (INIS)

    Holt, J.E.

    1988-11-01

    This report describes development which has been carried out on the multilink manipulator computer control system. The system allows the manipulator to be driven using only two joysticks. The leading link is controlled and the other links follow its path into the reactor, thus avoiding any potential obstacles. The system has been fully commissioned and used with the Sizewell 'A' reactor 2 Multilink T.V. manipulator. Experience of the use of the system is presented, together with recommendations for future improvements. (author)

  3. Unsteady Thick Airfoil Aerodynamics: Experiments, Computation, and Theory

    Science.gov (United States)

    Strangfeld, C.; Rumsey, C. L.; Mueller-Vahl, H.; Greenblatt, D.; Nayeri, C. N.; Paschereit, C. O.

    2015-01-01

    An experimental, computational and theoretical investigation was carried out to study the aerodynamic loads acting on a relatively thick NACA 0018 airfoil when subjected to pitching and surging, individually and synchronously. Both pre-stall and post-stall angles of attack were considered. Experiments were carried out in a dedicated unsteady wind tunnel, with large surge amplitudes, and airfoil loads were estimated by means of unsteady surface-mounted pressure measurements. Theoretical predictions were based on Theodorsen's and Isaacs' results as well as on the relatively recent generalizations of van der Wall. Both two- and three-dimensional computations were performed on structured grids employing unsteady Reynolds-averaged Navier-Stokes (URANS). For pure surging at pre-stall angles of attack, the correspondence between experiments and theory was satisfactory; this served as a validation of Isaacs' theory. Discrepancies were traced to dynamic trailing-edge separation, even at low angles of attack. Excellent correspondence was found between experiments and theory for airfoil pitching as well as combined pitching and surging; the latter appears to be the first clear validation of van der Wall's theoretical results. Although qualitatively similar to experiment at low angles of attack, two-dimensional URANS computations yielded notable errors in the unsteady load effects of pitching, surging and their synchronous combination. The main reason is believed to be that the URANS equations do not resolve wake vorticity (explicitly modeled in the theory) or the resulting rolled-up unsteady flow structures because high values of eddy viscosity tend to "smear" the wake. At post-stall angles, three-dimensional computations illustrated the importance of modeling the tunnel side walls.

  4. Assessment of the Relationship between Recurrent High-risk Pregnancy and Mothers’ Previous Experience of Having an Infant Admitted to a Neonatal Intensive Care Unit

    Directory of Open Access Journals (Sweden)

    Sedigheh Hantoosh Zadeh

    2015-01-01

    Background & aim: High-risk pregnancies increase the risk of Intensive Care Unit (ICU) and Neonatal Intensive Care Unit (NICU) admission for mothers and their newborns. In this study, we aimed to identify the association between the recurrence of high-risk pregnancy and mothers' previous experience of having an infant admitted to a NICU. Methods: We performed a retrospective cohort study to compare subsequent pregnancy outcomes among 232 control subjects and 200 female cases with a previous experience of having a newborn requiring NICU admission due to intrauterine growth retardation, preeclampsia, preterm birth, premature rupture of membranes, or asphyxia. The information about the prevalence of subsequent high-risk pregnancies was gathered via phone calls. Results: As the results indicated, heparin, progesterone, and aspirin were more frequently administered in the case group during subsequent pregnancies, compared to the control group (P

  5. Expertik: Experience with Artificial Intelligence and Mobile Computing

    Directory of Open Access Journals (Sweden)

    José Edward Beltrán Lozano

    2013-06-01

    This article presents experience in the development of services based on Artificial Intelligence, Service-Oriented Architecture and mobile computing. It aims to combine the technology offered by mobile computing with artificial intelligence techniques, through a service, to provide diagnostic solutions to problems in industrial maintenance. For service creation, the elements of an expert system are identified: the knowledge base, the inference engine, and the interfaces for knowledge acquisition and consultation. The applications were developed in ASP.NET under a three-layer architecture. The data layer was developed in SQL Server in conjunction with data management classes; the business layer in VB.NET; and the presentation layer in ASP.NET with XHTML. The interfaces for knowledge acquisition and querying were developed for both Web and Mobile Web. The inference engine was implemented as a web service built around a fuzzy logic model (initially an exact rule-based logic in this experience) to resolve requests from the knowledge-consulting applications. This experience seeks to strengthen a technology-based company to offer AI-based services to service companies in Colombia.

  6. Experiments and computation of onshore breaking solitary waves

    DEFF Research Database (Denmark)

    Jensen, A.; Mayer, Stefan; Pedersen, G.K.

    2005-01-01

    This is a combined experimental and computational study of solitary waves that break on-shore. Velocities and accelerations are measured by a two-camera PIV technique and compared to theoretical values from an Euler model with a VOF method for the free surface. In particular, the dynamics of a so-called collapsing breaker is scrutinized and the closure between the breaker and the beach is found to be akin to slamming. To the knowledge of the authors, no velocity measurements for this kind of breaker have been previously reported.

  7. On the computer simulation of the EPR-Bohm experiment

    International Nuclear Information System (INIS)

    McGoveran, D.O.; Noyes, H.P.; Manthey, M.J.

    1988-12-01

    We argue that supraluminal correlation without supraluminal signaling is a necessary consequence of any finite and discrete model for physics. Every day, the commercial and military practice of using encrypted communication based on correlated, pseudo-random signals illustrates this possibility. All that is needed are two levels of computational complexity which preclude using a smaller system to detect departures from "randomness" in the larger system. Hence the experimental realizations of the EPR-Bohm experiment leave open the question of whether the world of experience is "random" or pseudo-random. The latter possibility could be demonstrated experimentally if a complexity parameter related to the arm length and switching time in an Aspect-type realization of the EPR-Bohm experiment is sufficiently small compared to the number of reliable total counts which can be obtained in practice. 6 refs

  8. Topographic evolution of sandbars: Flume experiment and computational modeling

    Science.gov (United States)

    Kinzel, Paul J.; Nelson, Jonathan M.; McDonald, Richard R.; Logan, Brandy L.

    2010-01-01

    Measurements of sandbar formation and evolution were carried out in a laboratory flume and the topographic characteristics of these barforms were compared to predictions from a computational flow and sediment transport model with bed evolution. The flume experiment produced sandbars with approximate mode 2, whereas numerical simulations produced a bed morphology better approximated as alternate bars, mode 1. In addition, bar formation occurred more rapidly in the laboratory channel than for the model channel. This paper focuses on a steady-flow laboratory experiment without upstream sediment supply. Future experiments will examine the effects of unsteady flow and sediment supply and the use of numerical models to simulate the response of barform topography to these influences.

  9. Patient's anxiety and fear of anesthesia: effect of gender, age, education, and previous experience of anesthesia. A survey of 400 patients.

    Science.gov (United States)

    Mavridou, Paraskevi; Dimitriou, Varvara; Manataki, Adamantia; Arnaoutoglou, Elena; Papadopoulos, Georgios

    2013-02-01

    Patients express high anxiety preoperatively because of fears related to anesthesia and its implications. The purpose of this survey was to gain insight into these fears and to study whether they are affected by patients' sex, age, education, or previous experience of anesthesia. Questionnaires with fixed questions were distributed to consenting, consecutive surgical patients before the pre-anesthetic visit. The questionnaires included patients' demographics and questions related to their fears about anesthesia. Four hundred questionnaires were collected and analyzed. Eighty-one percent of patients experience preoperative anxiety. The main sources of their anxiety were fear of postoperative pain (84%), of not waking up after surgery (64.8%), of being nauseous or vomiting (60.2%), and of drains and needles (59.5%). Patients are less concerned about being paralyzed because of anesthesia (33.5%) or about revealing personal issues (18.8%). Gender seems to affect patients' fears, with women being more afraid (85.3% vs. 75.6% of men, p = 0.014). The effects of patients' age, level of education, and previous experience of anesthesia are minor, except for individual questions. Sixty-three percent of our patients (mostly women: 67.4% vs. 57.4% of men, p = 0.039) talk about these fears with their relatives, although a vast majority of 95.5% would prefer to talk with the anesthesiologist and be reassured by him. All patients, mostly women, express fears about anesthesia; this fear leads to preoperative anxiety. Slight differences are observed for some individual questions among patients of different sex, education level, and previous experience of anesthesia.

  10. Distributing the computation in combinatorial optimization experiments over the cloud

    Directory of Open Access Journals (Sweden)

    Mario Brcic

    2017-12-01

    Combinatorial optimization is an area of great importance since many real-world problems have discrete parameters which are part of the objective function to be optimized. The development of combinatorial optimization algorithms is guided by empirical study of candidate ideas and their performance over a wide range of settings or scenarios, from which general conclusions are inferred. The number of scenarios can be overwhelming, especially when modeling uncertainty in some of the problem’s parameters. Since the process is also iterative and many ideas and hypotheses may be tested, the execution time of each experiment plays an important role in efficiency and success. The structure of such experiments allows for significant execution-time improvement by distributing the computation. We focus on cloud computing as a cost-efficient solution in these circumstances. In this paper we present a system for validating and comparing stochastic combinatorial optimization algorithms. The system also deals with the selection of optimal settings for the computational nodes, and of the number of nodes, in terms of the performance-cost tradeoff. We present applications of the system to a new class of project scheduling problem. We show that the selection over cloud service providers can be optimized as one of the settings and, according to the model, this resulted in substantial cost savings while meeting the deadline.
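
    A minimal sketch of the distribution pattern described above, assuming each scenario evaluation is an independent, pure function. The scenario generator and objective are placeholders; in a cloud deployment the same map pattern would be dispatched to remote worker nodes rather than local processes.

    ```python
    # Minimal sketch: evaluate many stochastic optimisation scenarios in parallel.
    from concurrent.futures import ProcessPoolExecutor
    import random

    def evaluate_scenario(seed):
        """Placeholder for running one stochastic scheduling scenario."""
        rng = random.Random(seed)
        makespan = sum(rng.uniform(1.0, 3.0) for _ in range(50))  # dummy objective
        return seed, makespan

    if __name__ == "__main__":
        scenarios = range(200)
        with ProcessPoolExecutor(max_workers=8) as pool:
            results = list(pool.map(evaluate_scenario, scenarios))
        best_seed, best_makespan = min(results, key=lambda r: r[1])
        print(f"best scenario {best_seed}: makespan {best_makespan:.2f}")
    ```

    Because the scenarios are independent, wall-clock time scales roughly with the number of workers, which is exactly the performance-cost tradeoff the system in this record optimises when choosing node types and counts.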

  11. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research

  12. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  13. Experience building and operating the CMS Tier-1 computing centres

    Science.gov (United States)

    Albert, M.; Bakken, J.; Bonacorsi, D.; Brew, C.; Charlot, C.; Huang, Chih-Hao; Colling, D.; Dumitrescu, C.; Fagan, D.; Fassi, F.; Fisk, I.; Flix, J.; Giacchetti, L.; Gomez-Ceballos, G.; Gowdy, S.; Grandi, C.; Gutsche, O.; Hahn, K.; Holzman, B.; Jackson, J.; Kreuzer, P.; Kuo, C. M.; Mason, D.; Pukhaeva, N.; Qin, G.; Quast, G.; Rossman, P.; Sartirana, A.; Scheurer, A.; Schott, G.; Shih, J.; Tader, P.; Thompson, R.; Tiradani, A.; Trunov, A.

    2010-04-01

    The CMS Collaboration relies on 7 globally distributed Tier-1 computing centres located at large universities and national laboratories for a second custodial copy of the CMS RAW data and primary copy of the simulated data, data serving capacity to Tier-2 centres for analysis, and the bulk of the reprocessing and event selection capacity in the experiment. The Tier-1 sites have a challenging role in CMS because they are expected to ingest and archive data from both CERN and regional Tier-2 centres, while they export data to a global mesh of Tier-2s at rates comparable to the raw export data rate from CERN. The combined capacity of the Tier-1 centres is more than twice the resources located at CERN, and efficiently utilizing these large distributed resources represents a challenge. In this article we will discuss the experience building, operating, and utilizing the CMS Tier-1 computing centres. We will summarize the facility challenges at the Tier-1s including the stable operations of CMS services, the ability to scale to large numbers of processing requests and large volumes of data, and the ability to provide custodial storage and high performance data serving. We will also present the operations experience utilizing the distributed Tier-1 centres from a distance: transferring data, submitting data serving requests, and submitting batch processing requests.

  14. Experience building and operating the CMS Tier-1 computing centres

    International Nuclear Information System (INIS)

    Albert, M; Bakken, J; Huang, Chih-Hao; Dumitrescu, C; Fagan, D; Fisk, I; Giacchetti, L; Gutsche, O; Holzman, B; Bonacorsi, D; Grandi, C; Brew, C; Jackson, J; Charlot, C; Colling, D; Fassi, F; Flix, J; Gomez-Ceballos, G; Hahn, K; Gowdy, S

    2010-01-01

    The CMS Collaboration relies on 7 globally distributed Tier-1 computing centres located at large universities and national laboratories for a second custodial copy of the CMS RAW data and primary copy of the simulated data, data serving capacity to Tier-2 centres for analysis, and the bulk of the reprocessing and event selection capacity in the experiment. The Tier-1 sites have a challenging role in CMS because they are expected to ingest and archive data from both CERN and regional Tier-2 centres, while they export data to a global mesh of Tier-2s at rates comparable to the raw export data rate from CERN. The combined capacity of the Tier-1 centres is more than twice the resources located at CERN, and efficiently utilizing these large distributed resources represents a challenge. In this article we will discuss the experience building, operating, and utilizing the CMS Tier-1 computing centres. We will summarize the facility challenges at the Tier-1s including the stable operations of CMS services, the ability to scale to large numbers of processing requests and large volumes of data, and the ability to provide custodial storage and high performance data serving. We will also present the operations experience utilizing the distributed Tier-1 centres from a distance: transferring data, submitting data serving requests, and submitting batch processing requests.

  15. The BaBar experiment's distributed computing model

    International Nuclear Information System (INIS)

    Boutigny, D.

    2001-01-01

    In order to face the expected increase in statistics between now and 2005, the BaBar experiment at SLAC is evolving its computing model toward a distributed multitier system. It is foreseen that data will be spread among Tier-A centers and deleted from the SLAC center. A uniform computing environment is being deployed in the centers, the network bandwidth is being continuously increased, and data distribution tools have been designed in order to reach a transfer rate of ∼100 TB of data per year. In parallel, smaller Tier-B and C sites receive subsets of data, presently in Kanga-ROOT format and later in Objectivity format. GRID tools will be used for remote job submission.

  16. The BaBar Experiment's Distributed Computing Model

    International Nuclear Information System (INIS)

    Gowdy, Stephen J.

    2002-01-01

    In order to face the expected increase in statistics between now and 2005, the BaBar experiment at SLAC is evolving its computing model toward a distributed multi-tier system. It is foreseen that data will be spread among Tier-A centers and deleted from the SLAC center. A uniform computing environment is being deployed in the centers, the network bandwidth is being continuously increased, and data distribution tools have been designed in order to reach a transfer rate of ∼100 TB of data per year. In parallel, smaller Tier-B and C sites receive subsets of data, presently in Kanga-ROOT[1] format and later in Objectivity[2] format. GRID tools will be used for remote job submission.

  17. Computer modeling of active experiments in space plasmas

    International Nuclear Information System (INIS)

    Bollens, R.J.

    1993-01-01

    The understanding of space plasmas is expanding rapidly. This is, in large part, due to the ambitious efforts of scientists from around the world who are performing large scale active experiments in the space plasma surrounding the earth. One such effort was designated the Active Magnetospheric Particle Tracer Explorers (AMPTE) and consisted of a series of plasma releases that were completed during 1984 and 1985. What makes the AMPTE experiments particularly interesting is the occurrence of a dramatic anomaly that was completely unpredicted. During the AMPTE experiment, three satellites traced the solar-wind flow into the earth's magnetosphere. One satellite, built by West Germany, released a series of barium and lithium canisters that were detonated and subsequently photo-ionized via solar radiation, thereby creating an artificial comet. Another satellite, built by Great Britain and in the vicinity during detonation, carried, as did the first satellite, a comprehensive set of magnetic field, particle and wave instruments. Upon detonation, what was observed by the satellites, as well as by aircraft and ground-based observers, was quite unexpected. The initial deflection of the ion clouds was not in the ambient solar wind's flow direction (V) but rather in the direction transverse to the solar wind and the background magnetic field (V × B). This result was not predicted by any existing theories or simulation models; it is the main subject discussed in this dissertation. A large three dimensional computer simulation was produced to demonstrate that this transverse motion can be explained in terms of a rocket effect. Due to the extreme computer resources utilized in producing this work, the computer methods used to complete the calculation and the visualization techniques used to view the results are also discussed.
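    A compact way to state the geometry of the anomaly, in standard plasma-physics notation rather than anything taken from the dissertation itself: the streaming solar wind carries a motional (convection) electric field, and the transverse direction quoted above is simply the direction opposite to that field.

```latex
\[
  \mathbf{E}_{\mathrm{sw}} = -\,\mathbf{V}_{\mathrm{sw}} \times \mathbf{B},
  \qquad
  \mathbf{V}_{\mathrm{sw}} \times \mathbf{B} = -\,\mathbf{E}_{\mathrm{sw}},
\]
```

    so an initial cloud motion along V × B is a recoil against the convection electric field rather than motion along the bulk flow, consistent with the rocket-effect interpretation described above.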

  18. Explaining infant feeding: The role of previous personal and vicarious experience on attitudes, subjective norms, self-efficacy, and breastfeeding outcomes.

    Science.gov (United States)

    Bartle, Naomi C; Harvey, Kate

    2017-11-01

    Breastfeeding confers important health benefits to both infants and their mothers, but rates are low in the United Kingdom and other developed countries despite widespread promotion. This study examined the relationships between personal and vicarious experience of infant feeding, self-efficacy, the theory of planned behaviour variables of attitudes and subjective norm, and the likelihood of breastfeeding at 6-8 weeks post-natally. A prospective questionnaire study of both first-time mothers (n = 77) and experienced breastfeeders (n = 72) recruited at an antenatal clinic in South East England. Participants completed a questionnaire at 32 weeks pregnant assessing personal and vicarious experience of infant feeding (breastfeeding, formula-feeding, and maternal grandmother's experience of breastfeeding), perceived control, self-efficacy, intentions, attitudes (to breastfeeding and formula-feeding), and subjective norm. Infant feeding behaviour was recorded at 6-8 weeks post-natally. Multiple linear regression modelled the influence of vicarious experience on attitudes, subjective norm, and self-efficacy (but not perceived control) and modelled the influence of attitude, subjective norm, self-efficacy, and past experience on intentions to breastfeed. Logistic regression modelled the likelihood of breastfeeding at 6-8 weeks. Previous experience (particularly personal experience of breastfeeding) explained a significant amount of variance in attitudes, subjective norm, and self-efficacy. Intentions to breastfeed were predicted by subjective norm and attitude to formula-feeding and, in experienced mothers, self-efficacy. Breastfeeding at 6 weeks was predicted by intentions and vicarious experience of formula-feeding. Vicarious experience, particularly of formula-feeding, has been shown to influence the behaviour of first-time and experienced mothers both directly and indirectly via attitudes and subjective norm. Interventions that reduce exposure to formula

  19. Fisher information in the design of computer simulation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Stehlík, Milan; Mueller, Werner G [Department of Applied Statistics, Johannes-Kepler-University Linz, Freistaedter Strasse 315, A-4040 Linz (Austria)], E-mail: Milan.Stehlik@jku.at, E-mail: Werner.Mueller@jku.at

    2008-11-01

    The concept of Fisher information is conveniently used as a basis for designing efficient experiments. However, if the output stems from computer simulations they are often approximated as realizations of correlated random fields. Consequently, the conditions under which Fisher information may be suitable must be restated. In the paper we intend to give some simple but illuminating examples for these cases. 'Random phenomena have increasing importance in Engineering and Physics, therefore theoretical results are strongly needed. But there is a gap between the probability theory used by mathematicians and practitioners. Two very different languages have been generated in this way...' (Paul Kree, Paris 1995)
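    For reference, when the simulator output is modeled as a correlated Gaussian random field, y ~ N(mu(theta), C(theta)), the Fisher information takes the standard textbook form below (a generic expression, not one of the paper's specific examples):

```latex
\[
  \bigl[\mathcal{I}(\theta)\bigr]_{ij}
  = \frac{\partial \mu^{\top}}{\partial \theta_i}\, C^{-1}\,
    \frac{\partial \mu}{\partial \theta_j}
  + \frac{1}{2}\,\operatorname{tr}\!\left(
      C^{-1}\frac{\partial C}{\partial \theta_i}\,
      C^{-1}\frac{\partial C}{\partial \theta_j}\right),
  \qquad y \sim \mathcal{N}\bigl(\mu(\theta),\, C(\theta)\bigr),
\]
```

    which makes explicit why the usual design conditions must be restated once the covariance C itself depends on the parameters and encodes correlation between simulation outputs.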

  20. Fisher information in the design of computer simulation experiments

    International Nuclear Information System (INIS)

    Stehlík, Milan; Mueller, Werner G

    2008-01-01

    The concept of Fisher information is conveniently used as a basis for designing efficient experiments. However, if the output stems from computer simulations they are often approximated as realizations of correlated random fields. Consequently, the conditions under which Fisher information may be suitable must be restated. In the paper we intend to give some simple but illuminating examples for these cases. 'Random phenomena have increasing importance in Engineering and Physics, therefore theoretical results are strongly needed. But there is a gap between the probability theory used by mathematicians and practitioners. Two very different languages have been generated in this way...' (Paul Kree, Paris 1995)

  1. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    Science.gov (United States)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  2. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    International Nuclear Information System (INIS)

    Grieger, Khara D.; Laurent, Alexis; Miseljic, Mirko; Christensen, Frans; Baun, Anders; Olsen, Stig I.

    2012-01-01

    While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research, and specific guidance on how to practically apply these methods are still very much under development. This paper evaluates how research efforts have applied LCA and RA together for NM, particularly reflecting on previous experiences with applying these methods to chemicals. Through a literature review and a separate analysis of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key “lessons learned” from previous experience with chemicals while many key challenges remain for practically applying these methods to NM. We identified two main approaches for using these methods together for NM: “LC-based RA” (traditional RA applied in a life-cycle perspective) and “RA-complemented LCA” (conventional LCA supplemented by RA in specific life-cycle steps). Hence, the latter is the only identified approach which genuinely combines LC- and RA-based methods for NM-risk research efforts to date as the former is rather a continuation of normal RA according to standard assessment procedures (e.g., REACH). Both these approaches along with recommendations for using LCA and RA together for NM are similar to those made previously for chemicals, and thus, there does not appear to be much progress made specific for NM. We have identified one issue in particular that may be specific for NM when applying LCA and RA at this time: the need to establish proper dose metrics within both methods.

  3. Tactile Radar: experimenting a computer game with visually disabled.

    Science.gov (United States)

    Kastrup, Virgínia; Cassinelli, Alvaro; Quérette, Paulo; Bergstrom, Niklas; Sampaio, Eliana

    2017-09-18

    Visually disabled people increasingly use computers in everyday life, thanks to novel assistive technologies better tailored to their cognitive functioning. Like sighted people, many are interested in computer games - videogames and audio-games. Tactile-games are beginning to emerge. The Tactile Radar is a device through which a visually disabled person is able to detect distal obstacles. In this study, it is connected to a computer running a tactile-game. The game consists of finding and collecting randomly arranged coins in a virtual room. The study was conducted with nine congenitally blind people of both sexes, aged 20-64 years. Complementary methods of first and third person were used: the debriefing interview and the quasi-experimental design. The results indicate that the Tactile Radar is suitable for the creation of computer games specifically tailored for visually disabled people. Furthermore, the device seems capable of eliciting a powerful immersive experience. Methodologically speaking, this research contributes to the consolidation and development of complementary first- and third-person methods, particularly useful in the disabled people research field, including users' evaluation of the Tactile Radar's effectiveness in a virtual reality context. Implications for rehabilitation: Despite the growing interest in virtual games for visually disabled people, they still find barriers to accessing such games. Through the development of assistive technologies such as the Tactile Radar, applied in virtual games, we can create new opportunities for leisure, socialization and education for visually disabled people. The results of our study indicate that the Tactile Radar is suited to the creation of video games for visually disabled people, providing a playful interaction with the players.

  4. A benchmark on computational simulation of a CT fracture experiment

    International Nuclear Information System (INIS)

    Franco, C.; Brochard, J.; Ignaccolo, S.; Eripret, C.

    1992-01-01

    For a better understanding of the fracture behavior of cracked welds in piping, FRAMATOME, EDF and CEA have launched an important analytical research program. This program is mainly based on the analysis of the effects of the geometrical parameters (the crack size and the welded joint dimensions) and the yield strength ratio on the fracture behavior of several cracked configurations. Two approaches have been selected for the fracture analyses: on the one hand, the global approach based on the concept of the crack driving force J, and on the other hand, a local approach of ductile fracture. In this approach the crack initiation and growth are modeled by the nucleation, growth and coalescence of cavities in front of the crack tip. The model selected in this study estimates only the growth of the cavities, using the Rice and Tracey relationship. The present study deals with a benchmark on computational simulation of CT fracture experiments using three computer codes: ALIBABA, developed by EDF; the CEA code CASTEM 2000; and the FRAMATOME code SYSTUS. The paper is split into three parts. First, the authors present the experimental procedure for high temperature toughness testing of two CT specimens taken from a welded pipe, characteristic of pressurized water reactor primary piping. Secondly, considerations are outlined about the finite element analysis and the application procedure. A detailed description is given of the boundary and loading conditions, the mesh characteristics, the numerical scheme involved and the void growth computation. Finally, the comparisons between numerical and experimental results are presented up to crack initiation, the tearing process not being taken into account in the present study. The variations of J and of the local variables used to estimate the damage around the crack tip (triaxiality and hydrostatic stresses, plastic deformations, void growth ...) are computed as a function of the increasing load.
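    The Rice and Tracey relationship referred to above is commonly quoted in the form below (the standard literature form; the exact variant implemented in the three codes may differ): the growth of a spherical void of radius R is driven by the stress triaxiality, i.e. the ratio of the hydrostatic stress to the von Mises equivalent stress, integrated over the equivalent plastic strain.

```latex
\[
  \ln\!\left(\frac{R}{R_0}\right)
  = \alpha \int_0^{\bar{\varepsilon}^{\,p}}
      \exp\!\left(\frac{3\,\sigma_m}{2\,\sigma_{eq}}\right)
      \mathrm{d}\bar{\varepsilon}^{\,p},
  \qquad \alpha \approx 0.283 .
\]
```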

  5. Sexual behavior induction of c-Fos in the nucleus accumbens and amphetamine-stimulated locomotor activity are sensitized by previous sexual experience in female Syrian hamsters.

    Science.gov (United States)

    Bradley, K C; Meisel, R L

    2001-03-15

    Dopamine transmission in the nucleus accumbens can be activated by drugs, stress, or motivated behaviors, and repeated exposure to these stimuli can sensitize this dopamine response. The objectives of this study were to determine whether female sexual behavior activates nucleus accumbens neurons and whether past sexual experience cross-sensitizes neuronal responses in the nucleus accumbens to amphetamine. Using immunocytochemical labeling, c-Fos expression in different subregions (shell vs core at the rostral, middle, and caudal levels) of the nucleus accumbens was examined in female hamsters that had varying amounts of sexual experience. Female hamsters, given either 6 weeks of sexual experience or remaining sexually naive, were tested for sexual behavior by exposure to adult male hamsters. Previous sexual experience increased c-Fos labeling in the rostral and caudal levels but not in the middle levels of the nucleus accumbens. Testing for sexual behavior increased labeling in the core, but not the shell, of the nucleus accumbens. To validate that female sexual behavior can sensitize neurons in the mesolimbic dopamine pathway, the locomotor responses of sexually experienced and sexually naive females to an amphetamine injection were then compared. Amphetamine increased general locomotor activity in all females. However, sexually experienced animals responded sooner to amphetamine than did sexually naive animals. These data indicate that female sexual behavior can activate neurons in the nucleus accumbens and that sexual experience can cross-sensitize neuronal responses to amphetamine. In addition, these results provide additional evidence for functional differences between the shell and core of the nucleus accumbens and across its anteroposterior axis.

  6. Developments of multibody system dynamics: computer simulations and experiments

    International Nuclear Information System (INIS)

    Yoo, Wan-Suk; Kim, Kee-Nam; Kim, Hyun-Woo; Sohn, Jeong-Hyun

    2007-01-01

    It is an exceptional success that multibody dynamics researchers have made the Multibody System Dynamics journal one of the most highly ranked journals in the last 10 years. In the inaugural issue, Professor Schiehlen wrote an interesting article explaining the roots and perspectives of multibody system dynamics. Professor Shabana also wrote an interesting article to review developments in flexible multibody dynamics. The application possibilities of multibody system dynamics have grown wider and deeper, with many application examples being introduced with multibody techniques in the past 10 years. In this paper, the development of multibody dynamics is briefly reviewed and several applications of multibody dynamics are described according to the author's research results. Simulation examples are compared to physical experiments, which shows the reasonableness and accuracy of the multibody formulation applied to real problems. Computer simulations using the absolute nodal coordinate formulation (ANCF) were also compared to physical experiments; therefore, the validity of ANCF for large-displacement and large-deformation problems was shown. Physical experiments for large deformation problems include beam, plate, chain, and strip. Other research topics currently being carried out in the author's laboratory are also briefly explained.

  7. Computer-generated ovaries to assist follicle counting experiments.

    Directory of Open Access Journals (Sweden)

    Angelos Skodras

    Full Text Available Precise estimation of the number of follicles in ovaries is of key importance in the field of reproductive biology, both from a developmental point of view, where follicle numbers are determined at specific time points, as well as from a therapeutic perspective, determining the adverse effects of environmental toxins and cancer chemotherapeutics on the reproductive system. The two main factors affecting follicle number estimates are the sampling method and the variation in follicle numbers within animals of the same strain, due to biological variability. This study aims at assessing the effect of these two factors when estimating ovarian follicle numbers of neonatal mice. We developed computer algorithms which generate models of neonatal mouse ovaries (simulated ovaries) with characteristics derived from experimental measurements already available in the published literature. The simulated ovaries are used to reproduce in-silico counting experiments based on unbiased stereological techniques; the proposed approach provides the necessary number of ovaries and sampling frequency to be used in the experiments given a specific biological variability and a desirable degree of accuracy. The simulated ovary is a novel, versatile tool which can be used in the planning phase of experiments to estimate the expected number of animals and workload, ensuring appropriate statistical power of the resulting measurements. Moreover, the idea of the simulated ovary can be applied to other organs made up of large numbers of individual functional units.
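    The planning idea described above can be sketched as a small Monte Carlo exercise: draw true follicle counts with a chosen biological coefficient of variation, count only a systematic fraction of sections per ovary, scale the sampled counts back up, and watch how the precision of the group estimate improves with the number of animals. All numbers below (mean count, CV, sampling fraction, section count) are hypothetical placeholders, not values from the study.

```python
# Hypothetical in-silico counting experiment: how does the precision of the
# estimated mean follicle count improve with the number of ovaries sampled?
import numpy as np

rng = np.random.default_rng(0)

def simulate_estimates(n_ovaries, true_mean=3000, biological_cv=0.25,
                       sampling_fraction=0.1, n_sections=100):
    """Fractionator-style estimates of the follicle count, one per ovary."""
    estimates = []
    for _ in range(n_ovaries):
        true_count = max(int(rng.normal(true_mean, biological_cv * true_mean)), 0)
        # distribute follicles over sections, then count a systematic sample
        per_section = rng.multinomial(true_count, np.ones(n_sections) / n_sections)
        step = int(1 / sampling_fraction)
        estimates.append(per_section[::step].sum() / sampling_fraction)
    return np.array(estimates)

for n in (3, 5, 10, 20):
    est = simulate_estimates(n)
    print(f"{n:2d} ovaries: mean {est.mean():7.0f}, SEM {est.std(ddof=1) / np.sqrt(n):5.0f}")
```

    Repeating the loop over many replicates of each group size gives the distribution of the estimation error, which is what is needed to choose the number of animals for a target precision.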

  8. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Laurent, Alexis; Miseljic, Mirko

    2012-01-01

    of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key ‘‘lessons learned’’ from previous experience with chemicals while many key challenges remain for practically applying these methods to NM. We identified two main approaches...... for using these methods together for NM: ‘‘LC-based RA’’ (traditional RA applied in a life-cycle perspective) and ‘‘RA-complemented LCA’’ (conventional LCA supplemented by RA in specific life-cycle steps). Hence, the latter is the only identified approach which genuinely combines LC- and RA-based methods......While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research, and specific guidance...

  9. Experience with the WIMS computer code at Skoda Plzen

    International Nuclear Information System (INIS)

    Vacek, J.; Mikolas, P.

    1991-01-01

    Validation of the program for neutronics analysis is described. Computational results are compared with results of experiments on critical assemblies and with results of other codes for different types of lattices. Included are the results for lattices containing Gd as burnable absorber. With minor exceptions, the results of benchmarking were quite satisfactory and justified the inclusion of WIMS in the production system of codes for WWER analysis. The first practical application was the adjustment of the WWER-440 few-group diffusion constants library of the three-dimensional diffusion code MOBY-DICK, which led to a remarkable improvement of results for operational states. Then a new library for the analysis of WWER-440 start-up was generated and tested and at present a new library for the analysis of WWER-440 operational states is being tested. Preparation of the library for WWER-1000 is in progress. (author). 19 refs

  10. Test experience on an ultrareliable computer communication network

    Science.gov (United States)

    Abbott, L. W.

    1984-01-01

    The dispersed sensor processing mesh (DSPM) is an experimental, ultra-reliable, fault-tolerant computer communications network that exhibits an organic-like ability to regenerate itself after suffering damage. The regeneration is accomplished by two routines - grow and repair. This paper discusses the DSPM concept for achieving fault tolerance and provides a brief description of the mechanization of both the experiment and the six-node experimental network. The main topic of this paper is the system performance of the growth algorithm contained in the grow routine. The characteristics imbued to DSPM by the growth algorithm are also discussed. Data from an experimental DSPM network and software simulation of larger DSPM-type networks are used to examine the inherent limitation on growth time by the growth algorithm and the relationship of growth time to network size and topology.
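    As a purely illustrative analogue of the growth-time question (not the DSPM grow routine itself), one can model regeneration as a wave that spreads one hop per round from a root node and compare how many rounds different topologies of the same size require.

```python
# Illustrative analogue only (not the DSPM algorithm): count the rounds a
# breadth-first "growth" wave needs to reach every node, for two topologies.
from collections import deque

def growth_rounds(adjacency, root=0):
    """Number of rounds needed for a wave from `root` to cover the network."""
    seen, frontier, rounds = {root}, deque([root]), 0
    while frontier:
        nxt = deque()
        for node in frontier:
            for nbr in adjacency[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    nxt.append(nbr)
        frontier = nxt
        rounds += 1 if frontier else 0
    return rounds

def ring(n):
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def mesh(side):
    adj = {}
    for r in range(side):
        for c in range(side):
            adj[r * side + c] = [rr * side + cc
                                 for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                                 if 0 <= rr < side and 0 <= cc < side]
    return adj

for side in (4, 8, 16):
    n = side * side
    print(f"{n:3d} nodes: ring {growth_rounds(ring(n)):2d} rounds, "
          f"mesh {growth_rounds(mesh(side)):2d} rounds")
```

    Even this toy model shows the qualitative point made above: for a fixed number of nodes, growth time depends strongly on topology (roughly n/2 rounds for a ring versus the lattice diameter for a mesh).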

  11. A Rural South African Experience of an ESL Computer Program

    Directory of Open Access Journals (Sweden)

    Marius Dieperink

    2008-12-01

    Full Text Available This article reports on a case study that explored the effect of an English-as-a-Second-Language (ESL) computer program at Tshwane University of Technology (TUT), South Africa. The case study explored participants' perceptions, attitudes and beliefs regarding the ESL reading enhancement program, Reading Excellence™. The study found that participants experienced the program in a positive light. They experienced improved ESL reading as well as listening and writing proficiency. In addition, they experienced improved affective well-being in the sense that they generally felt more comfortable using ESL. This included feeling more self-confident in their experience of their academic environment. Interviews as well as document review resulted in dissonance, however: data pointed towards poor class attendance as well as a perturbing lack of progress in terms of reading comprehension and speed.

  12. Experiences using DAKOTA stochastic expansion methods in computational simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan; Ruthruff, Joseph R.

    2012-01-01

    Uncertainty quantification (UQ) methods bring rigorous statistical connections to the analysis of computational and experimental data, and provide a basis for probabilistically assessing margins associated with safety and reliability. The DAKOTA toolkit developed at Sandia National Laboratories implements a number of UQ methods, which are being increasingly adopted by modeling and simulation teams to facilitate these analyses. This report disseminates results on the performance of DAKOTA's stochastic expansion methods for UQ on a representative application. Our results provide a number of insights that may be of interest to future users of these methods, including the behavior of the methods in estimating responses at varying probability levels, and the expansion levels for the methodologies that may be needed to achieve convergence.
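    Stochastic expansion methods of the kind exercised here approximate the response as a series in polynomials that are orthogonal with respect to the input distributions; the generic polynomial chaos form is (a textbook expression, not a DAKOTA-specific equation):

```latex
\[
  u(\boldsymbol{\xi}) \;\approx\; \sum_{k=0}^{P} u_k\, \Psi_k(\boldsymbol{\xi}),
  \qquad
  u_k = \frac{\langle u,\, \Psi_k \rangle}{\langle \Psi_k,\, \Psi_k \rangle},
\]
```

    where the Psi_k are, for example, Hermite polynomials for Gaussian inputs, and the truncation order P corresponds to the expansion level whose effect on convergence the report examines.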

  13. Studies on defect evolution in steels: experiments and computer simulations

    International Nuclear Information System (INIS)

    Sundar, C.S.

    2011-01-01

    In this paper, we present the results of our on-going studies on steels that are being carried out with a view to developing radiation-resistant steels. The focus is on the use of nano-dispersoids in alloys towards the suppression of void formation and eventual swelling under irradiation. Results on the nucleation and growth of TiC precipitates in Ti-modified austenitic steels and investigations of nano-yttria particles in Fe - a model oxide dispersion ferritic steel - will be presented. The experimental methods of ion beam irradiation and positron annihilation spectroscopy have been used to elucidate the role of minor alloying elements on swelling behaviour. Computer simulations of defect processes have been carried out using ab-initio methods, molecular dynamics and Monte Carlo simulations. Our perspectives on addressing the multi-scale phenomena of defect processes leading to radiation damage, through a judicious combination of experiments and simulations, will be presented. (author)

  14. Alkali Rydberg states in electromagnetic fields: computational physics meets experiment

    International Nuclear Information System (INIS)

    Krug, A.

    2001-11-01

    We study highly excited hydrogen and alkali atoms ('Rydberg states') under the influence of a strong microwave field. As the external frequency is comparable to the highly excited electron's classical Kepler frequency, the external field induces a strong coupling of many different quantum mechanical energy levels and finally leads to the ionization of the outer electron. While periodically driven atomic hydrogen can be seen as a paradigm of quantum chaotic motion in an open (decaying) quantum system, the presence of the non-hydrogenic atomic core - which unavoidably has to be treated quantum mechanically - entails some complications. Indeed, laboratory experiments show clear differences in the ionization dynamics of microwave driven hydrogen and non-hydrogenic Rydberg states. In the first part of this thesis, a machinery is developed that allows for numerical experiments on alkali and hydrogen atoms under precisely identical laboratory conditions. Due to the high density of states in the parameter regime typically explored in laboratory experiments, such simulations are only possible with the most advanced parallel computing facilities, in combination with an efficient parallel implementation of the numerical approach. The second part of the thesis is devoted to the results of the numerical experiment. We identify and describe significant differences and surprising similarities in the ionization dynamics of atomic hydrogen as compared to alkali atoms, and give account of the relevant frequency scales that distinguish hydrogenic from non-hydrogenic ionization behavior. Our results necessitate a reinterpretation of the experimental results so far available, and solve the puzzle of a distinct ionization behavior of periodically driven hydrogen and non-hydrogenic Rydberg atoms - an unresolved question for about one decade. Finally, microwave-driven Rydberg states will be considered as prototypes of open, complex quantum systems that exhibit a complicated temporal decay

  15. Interdisciplinary Team-Teaching Experience for a Computer and Nuclear Energy Course for Electrical and Computer Engineering Students

    Science.gov (United States)

    Kim, Charles; Jackson, Deborah; Keiller, Peter

    2016-01-01

    A new, interdisciplinary, team-taught course has been designed to educate students in Electrical and Computer Engineering (ECE) so that they can respond to global and urgent issues concerning computer control systems in nuclear power plants. This paper discusses our experience and assessment of the interdisciplinary computer and nuclear energy…

  16. Is previous disaster experience a good predictor for disaster preparedness in extreme poverty households in remote Muslim minority based community in China?

    Science.gov (United States)

    Chan, Emily Y Y; Kim, Jean H; Lin, Cherry; Cheung, Eliza Y L; Lee, Polly P Y

    2014-06-01

    Disaster preparedness is an important preventive strategy for protecting health and mitigating the adverse health effects of unforeseen disasters. A multi-site based ethnic minority project (2009-2015) was set up to examine health and disaster preparedness related issues in remote, rural, disaster prone communities in China. The primary objective of this reported study is to examine whether previous disaster experience significantly increases household disaster preparedness levels in remote villages in China. A cross-sectional household survey was conducted in January 2011 in Gansu Province, in a predominantly Hui minority village. Factors related to disaster preparedness were explored using quantitative methods. Two focus groups were also conducted to provide additional contextual explanations to the quantitative findings of this study. The village household response rate was 62.4% (n = 133). Although previous disaster exposure was significantly associated with the perception of living in a high disaster risk area (OR = 6.16), only 10.7% of households possessed a disaster emergency kit. Of note, for households with members who had non-communicable diseases, 9.6% had prepared extra medications to sustain clinical management of their chronic conditions. This is the first study to examine disaster preparedness in an ethnic minority population in remote communities in rural China. Our results indicate the need for disaster mitigation education to promote preparedness in remote, resource-poor communities.

  17. EXPERIMENTS AND COMPUTATIONAL MODELING OF PULVERIZED-COAL IGNITION; FINAL

    International Nuclear Information System (INIS)

    Samuel Owusu-Ofori; John C. Chen

    1999-01-01

    Under typical conditions of pulverized-coal combustion, which is characterized by fine particles heated at very high rates, there is currently a lack of certainty regarding the ignition mechanism of bituminous and lower rank coals, as well as the ignition rate of reaction. Furthermore, there have been no previous studies aimed at examining these factors under various experimental conditions, such as particle size, oxygen concentration, and heating rate. Finally, there is a need to improve current mathematical models of ignition to realistically and accurately depict the particle-to-particle variations that exist within a coal sample. Such a model is needed to extract useful reaction parameters from ignition studies, and to interpret ignition data in a more meaningful way. The authors propose to examine fundamental aspects of coal ignition through (1) experiments to determine the ignition temperature of various coals by direct measurement, and (2) modeling of the ignition process to derive rate constants and to provide a more insightful interpretation of data from ignition experiments. The authors propose to use a novel laser-based ignition experiment to achieve their first objective. Laser-ignition experiments offer the distinct advantage of easy optical access to the particles because of the absence of a furnace or radiating walls, and thus permit direct observation and particle temperature measurement. The ignition temperature of different coals under various experimental conditions can therefore be easily determined by direct measurement using two-color pyrometry. The ignition rate constants, when ignition occurs heterogeneously, and the particle heating rates will both be determined from analyses based on these measurements.

  18. Caring for women wanting a vaginal birth after previous caesarean section: A qualitative study of the experiences of midwives and obstetricians.

    Science.gov (United States)

    Foureur, Maralyn; Turkmani, Sabera; Clack, Danielle C; Davis, Deborah L; Mollart, Lyndall; Leiser, Bernadette; Homer, Caroline S E

    2017-02-01

    One of the greatest contributors to the overall caesarean section rate is elective repeat caesarean section. Decisions around mode of birth are often complex for women and influenced by the views of the doctors and midwives who care for and counsel women. Women may be more likely to choose a repeat elective caesarean section (CS) if their health care providers lack skills and confidence in supporting vaginal birth after caesarean section (VBAC). To explore the views and experiences of providers in caring for women considering VBAC, in particular the decision-making processes and the communication of risk and safety to women. A descriptive interpretive method was utilised. Four focus groups with doctors and midwives were conducted. The central themes were: 'developing trust', 'navigating the system' and 'optimising support'. The impact of past professional experiences; the critical importance of continuity of carer and positive relationships; the ability to weigh up risks versus benefits; and the language used were all important elements. The role of policy and guidelines on providing standardised care for women who had a previous CS was also highlighted. Midwives and doctors in this study were positively oriented towards assisting and supporting women to attempt a VBAC. Care providers considered that women who have experienced a prior CS need access to midwifery continuity of care with a focus on support, information-sharing and effective communication. Copyright © 2016 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  19. Gravitational Acceleration Effects on Macrosegregation: Experiment and Computational Modeling

    Science.gov (United States)

    Leon-Torres, J.; Curreri, P. A.; Stefanescu, D. M.; Sen, S.

    1999-01-01

    Experiments were performed under terrestrial gravity (1 g) and during parabolic flights (10^-2 g) to study the solidification and macrosegregation patterns of Al-Cu alloys. Alloys having 2% and 5% Cu were solidified against a chill at two different cooling rates. Microscopic and electron microprobe characterization was used to produce microstructural and macrosegregation maps. In all cases positive segregation occurred next to the chill because of shrinkage flow, as expected. This positive segregation was higher in the low-g samples, apparently because of the higher heat transfer coefficient. A 2-D computational model was used to explain the experimental results. The continuum formulation was employed to describe the macroscopic transport of mass, energy, and momentum associated with the solidification phenomena for a two-phase system. The model considers that liquid flow is driven by thermal and solutal buoyancy, and by solidification shrinkage. The solidification event was divided into two stages. In the first one, the liquid containing freely moving equiaxed grains was described through the relative viscosity concept. In the second stage, when a fixed dendritic network was formed after dendritic coherency, the mushy zone was treated as a porous medium. The macrosegregation maps and the cooling curves obtained during the experiments were used for validation of the solidification and segregation model. The model can explain the solidification and macrosegregation patterns and the differences between low- and high-gravity results.

  20. Computationally mediated experiments: the next frontier in microscopy

    International Nuclear Information System (INIS)

    Zaluzec, N.J.

    2002-01-01

    Full text: It's reasonably safe to say that most of the simple experimental techniques that can be employed in microscopy have been well documented and exploited over the last 20 years. Thus, if we are interested in extending the range and diversity of problems that we will be dealing with in the next decade, then we will have to take up challenges which heretofore were considered beyond the realm of routine work. Given the ever-growing tendency to add computational resources to our instruments, it is clear that the next breakthrough will be directly tied to how well we can effectively tie these two realms together. In the past we have used computers to simply speed up our experiments, but in the upcoming decade the key will be to realize that once an effective interface of instrumentation and computational tools is developed, we must change the way in which we design our experiments. This means re-examining how we do experiments so that measurements are done not just quickly, but precisely, and to maximize the information measured so that the data therein can be 'mined' for content which might have been missed in the past. As an example of this, consider the experimental technique of Position Resolved Diffraction which is currently being developed for the study of nanoscale magnetic structures using ANL's Advanced Analytical Electron Microscope. Here a focused electron probe is sequentially scanned across a two-dimensional field of view of a thin specimen and at each point on the specimen a two-dimensional electron diffraction pattern is acquired and stored. Analysis of the spatial variation in the electron diffraction pattern allows a researcher to study the subtle changes resulting from microstructural differences such as ferro- and electro-magnetic domain formation and motion. There is, however, a severe limitation in this technique - namely, its need to store and dynamically process large data sets, preferably in near real time. A minimal scoping measurement would involve
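    The storage pressure mentioned at the end of the abstract is easy to quantify with back-of-the-envelope arithmetic; the scan and detector dimensions below are hypothetical, chosen only to show the scaling.

```python
# Hypothetical numbers: raw data volume of one position-resolved diffraction
# scan, where every scan point stores a full 2-D diffraction pattern.
scan_points = 256 * 256          # probe positions (assumed)
detector_pixels = 512 * 512      # pixels per diffraction pattern (assumed)
bytes_per_pixel = 2              # 16-bit counts (assumed)

total_bytes = scan_points * detector_pixels * bytes_per_pixel
print(f"{total_bytes / 1e9:.1f} GB per scan")  # about 34.4 GB for these numbers
```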

  1. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    Science.gov (United States)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  2. Multi-fidelity Gaussian process regression for computer experiments

    International Nuclear Information System (INIS)

    Le-Gratiet, Loic

    2013-01-01

    This work is on Gaussian-process based approximation of a code which can be run at different levels of accuracy. The goal is to improve the predictions of a surrogate model of a complex computer code using fast approximations of it. A new formulation of a co-kriging based method has been proposed. In particular, this formulation allows for fast implementation and for closed-form expressions for the predictive mean and variance for universal co-kriging in the multi-fidelity framework, which is a breakthrough as it really allows for the practical application of such a method in real cases. Furthermore, fast cross validation, sequential experimental design and sensitivity analysis methods have been extended to the multi-fidelity co-kriging framework. This thesis also deals with a conjecture about the dependence of the learning curve (i.e. the decay rate of the mean square error) on the smoothness of the underlying function. A proof in a fairly general situation (which includes the classical models of Gaussian-process based meta-models with stationary covariance functions) has been obtained, while the previous proofs hold only for degenerate kernels (i.e. when the process is in fact finite-dimensional). This result allows for rigorously addressing practical questions such as the optimal allocation of the budget between different levels of codes in the multi-fidelity framework. (author) [fr
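    The multi-fidelity co-kriging construction referred to here is usually written as an autoregressive coupling between fidelity levels; the form below is the standard one from the literature (the thesis proposes a reformulation of it, which is not reproduced here):

```latex
\[
  Z_t(x) \;=\; \rho_{t-1}\, Z_{t-1}(x) \;+\; \delta_t(x),
  \qquad t = 2, \dots, s,
\]
```

    where Z_t is the Gaussian process modelling the code at fidelity level t, rho_{t-1} is a scale factor, and delta_t is a Gaussian process, independent of the lower levels, capturing the discrepancy between levels t-1 and t.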

  3. Interactive Quantum Mechanics Quantum Experiments on the Computer

    CERN Document Server

    Brandt, S; Dahmen, H.D

    2011-01-01

    Extra Materials available on extras.springer.com. INTERACTIVE QUANTUM MECHANICS allows students to perform their own quantum-physics experiments on their computer, in vivid 3D color graphics. Topics covered include: harmonic waves and wave packets; free particles as well as bound states and scattering in various potentials in one and three dimensions (both stationary and time dependent); two-particle systems and coupled harmonic oscillators; distinguishable and indistinguishable particles; coherent and squeezed states in time-dependent motion; quantized angular momentum; spin and magnetic resonance; and hybridization. For the present edition the physics scope has been widened appreciably. Moreover, INTERQUANTA can now produce user-defined movies of quantum-mechanical situations. Movies can be viewed directly and also be saved to be shown later in any browser. Sections on spec...

  4. Computer-simulated experiments and computer games: a method of design analysis

    Directory of Open Access Journals (Sweden)

    Jerome J. Leary

    1995-12-01

    Full Text Available Through the new modularization of the undergraduate science degree at the University of Brighton, larger numbers of students are choosing to take some science modules which include an amount of laboratory practical work. Indeed, within energy studies, the fuels and combustion module, for which the computer simulations were written, has seen a fourfold increase in student numbers from twelve to around fifty. Fitting out additional laboratories with new equipment to accommodate this increase presented problems: the laboratory space did not exist; fitting out the laboratories with new equipment would involve a relatively large capital spend per student for equipment that would be used infrequently; and, because some of the experiments use inflammable liquids and gases, additional staff would be needed for laboratory supervision.

  5. Computing for ongoing experiments on high energy physics in LPP, JINR

    International Nuclear Information System (INIS)

    Belosludtsev, D.A.; Zhil'tsov, V.E.; Zinchenko, A.I.; Kekelidze, V.D.; Madigozhin, D.T.; Potrebenikov, Yu.K.; Khabarov, S.V.; Shkarovskij, S.N.; Shchinov, B.G.

    2004-01-01

    The computer infrastructure created at the Laboratory of Particle Physics, JINR, to support the active participation of JINR experts in ongoing experiments on particle and nuclear physics is presented. The principles of design and construction of the personal computer farm are given, and the computer and information services used for effective exploitation of distributed computer resources are described.

  6. On-Line Digital Computer Applications in Gas Chromatography, An Undergraduate Analytical Experiment

    Science.gov (United States)

    Perone, S. P.; Eagleston, J. F.

    1971-01-01

    Presented are some descriptive background materials and the directions for an experiment which provides an introduction to on-line computer instrumentation. Assumes students are familiar with the Purdue Real-Time Basic (PRTB) laboratory computer system. (PR)

  7. Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments.

    NARCIS (Netherlands)

    Dewiyanti, Silvia; Brand-Gruwel, Saskia; Jochems, Wim; Broers, Nick

    2008-01-01

    Dewiyanti, S., Brand-Gruwel, S., Jochems, W., & Broers, N. (2007). Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments. Computers in Human Behavior, 23, 496-514.

  8. An Experiment Support Computer for Externally-Based ISS Payloads

    Science.gov (United States)

    Sell, S. W.; Chen, S. E.

    2002-01-01

    The Experiment Support Facility - External (ESF-X) is a computer designed for general experiment use aboard the International Space Station (ISS) Truss Site locations. The ESF-X design is highly modular and uses commercial off-the-shelf (COTS) components wherever possible to allow for maximum reconfigurability to meet the needs of almost any payload. The ESF-X design has been developed with the EXPRESS Pallet as the target location and the University of Colorado's Micron Accuracy Deployment Experiment (MADE) as the anticipated first payload and capability driver. Thus the design presented here is configured for structural dynamics and control as well as optics experiments. The ESF-X is a small (58.4 x 48.3 x 17.8") steel and copper enclosure which houses a 14 slot VME card chassis and power supply. All power and data connections are made through a single panel on the enclosure so that only one side of the enclosure must be accessed for nominal operation and servicing activities. This feature also allows convenient access during integration and checkout activities. Because it utilizes a standard VME backplane, ESF-X can make use of the many commercial boards already in production for this standard. Since the VME standard is also heavily used in industrial and military applications, many ruggedized components are readily available. The baseline design includes commercial processors, Ethernet, MIL-STD-1553, and mass storage devices. The main processor board contains four TI 6701 DSPs with a PowerPC based controller. Other standard functions, such as analog-to-digital, digital-to-analog, motor driver, temperature readings, etc., are handled on industry-standard IP modules. Carrier cards, which hold 4 IP modules each, are placed in slots in the VME backplane. A unique, custom IP carrier board with radiation event detectors allows non RAD-hard components to be used in an extended exposure environment. Thermal control is maintained by conductive cooling through the copper

  9. Experience of computed tomographic myelography and discography in cervical problem

    Energy Technology Data Exchange (ETDEWEB)

    Nakatani, Shigeru; Yamamoto, Masayuki; Uratsuji, Masaaki; Suzuki, Kunio; Matsui, Eigo [Hyogo Prefectural Awaji Hospital, Sumoto, Hyogo (Japan); Kurihara, Akira

    1983-06-01

    CTM (computed tomographic myelography) was performed on 15 cases of cervical lesions, and on 5 of them, CTD (computed tomographic discography) was also performed. CTM revealed the intervertebral state and, in combination with CTD, provided more accurate information. The combined method of CTM and CTD was useful for soft disc herniation.

  10. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general-purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  11. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

    New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapeutical treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results about using Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS) along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research) has been used to perform a fully complete MC simulation to compute dose distribution on phantoms irradiated with a radiotherapy accelerator. Using BEAMnrc and GEANT4 MC-based codes we calculated dose distributions on a plain water phantom and on an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (percentage depth dose) and in transverse sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.
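
    The ±2% agreement check described above can be illustrated with a short sketch. The depths and dose values below are hypothetical placeholders, not data from the ISS/CASPUR study; the snippet only shows how calculated and measured percentage depth dose (PDD) curves would be compared against such a tolerance.

    ```python
    # Illustrative comparison of measured vs. calculated PDD curves; all values
    # are made-up placeholders standing in for real measurement and MC data.
    import numpy as np

    depths_mm = np.array([5, 15, 50, 100, 130])                 # measurement depths
    pdd_measured = np.array([99.0, 100.0, 91.5, 67.2, 55.4])    # hypothetical data
    pdd_calculated = np.array([98.2, 99.6, 92.0, 66.9, 56.1])   # hypothetical MC result

    # Relative difference in percent, as used for the +/-2% agreement criterion
    rel_diff = 100.0 * (pdd_calculated - pdd_measured) / pdd_measured
    within_tolerance = bool(np.all(np.abs(rel_diff) <= 2.0))
    print(rel_diff, within_tolerance)
    ```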

  12. Measures of agreement between computation and experiment: validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
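
    As an illustration of the confidence-interval idea behind such a metric, the following sketch compares a single computational prediction with repeated measurements and reports the estimated error together with a confidence half-width. The numbers, the 90% level and the use of a t-distribution are illustrative assumptions, not details taken from the report.

    ```python
    # Minimal confidence-interval-based comparison of a computed value with
    # repeated experimental measurements; all values are illustrative.
    import numpy as np
    from scipy import stats

    y_model = 12.0                                     # computational prediction
    y_exp = np.array([11.2, 12.9, 12.4, 11.8, 12.6])   # repeated measurements (assumed)

    n = y_exp.size
    mean_exp = y_exp.mean()
    sem = y_exp.std(ddof=1) / np.sqrt(n)               # standard error of the mean
    t_crit = stats.t.ppf(0.95, df=n - 1)               # two-sided 90% interval

    error = y_model - mean_exp                         # estimated model error
    half_width = t_crit * sem                          # confidence half-width
    print(f"estimated error = {error:.3f} +/- {half_width:.3f}")
    ```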

  13. Large scale statistics for computational verification of grain growth simulations with experiments

    International Nuclear Information System (INIS)

    Demirel, Melik C.; Kuprat, Andrew P.; George, Denise C.; Straub, G.K.; Misra, Amit; Alexander, Kathleen B.; Rollett, Anthony D.

    2002-01-01

    It is known that by controlling microstructural development, desirable properties of materials can be achieved. The main objective of our research is to understand and control interface dominated material properties, and finally, to verify experimental results with computer simulations. We have previously shown a strong similarity between small-scale grain growth experiments and anisotropic three-dimensional simulations obtained from the Electron Backscattered Diffraction (EBSD) measurements. Using the same technique, we obtained 5170-grain data from an Aluminum film (120 µm thick) with a columnar grain structure. Experimentally obtained starting microstructure and grain boundary properties are input for the three-dimensional grain growth simulation. In the computational model, minimization of the interface energy is the driving force for the grain boundary motion. The computed evolved microstructure is compared with the final experimental microstructure, after annealing at 550 C. Characterization of the structures and properties of grain boundary networks (GBN) to produce desirable microstructures is one of the fundamental problems in interface science. There is ongoing research on the development of new experimental and analytical techniques in order to obtain and synthesize information related to GBN. The grain boundary energy and mobility data were characterized by the Electron Backscattered Diffraction (EBSD) technique and Atomic Force Microscopy (AFM) observations (i.e., for ceramic MgO and for the metal Al). Grain boundary energies are extracted from triple junction (TJ) geometry considering the local equilibrium condition at TJs. Relative boundary mobilities were also extracted from TJs through a statistical/multiscale analysis. Additionally, there are recent theoretical developments of grain boundary evolution in microstructures. In this paper, a new technique for three-dimensional grain growth simulations was used to simulate interface migration

  14. Computer simulations of laser hot spots and implosion symmetry kiniform phase plate experiments on Nova

    International Nuclear Information System (INIS)

    Peterson, R. R.; Lindman, E. L.; Delamater, N. D.; Magelssen, G. R.

    2000-01-01

    LASNEX computer code simulations have been performed for radiation symmetry experiments on the Nova laser with vacuum and gas-filled hohlraum targets [R. L. Kauffman et al., Phys. Plasmas 5, 1927 (1998)]. In previous experiments with unsmoothed laser beams, the symmetry was substantially shifted by deflection of the laser beams. In these experiments, laser beams have been smoothed with Kiniform Phase Plates in an attempt to remove deflection of the beams. The experiments have shown that this smoothing significantly improves the agreement with LASNEX calculations of implosion symmetry. The images of laser produced hot spots on the inside of the hohlraum case have been found to differ from LASNEX calculations, suggesting that some beam deflection or self-focusing may still be present or that emission from interpenetrating plasmas is an important component of the images. The measured neutron yields are in good agreement with simulations for vacuum hohlraums but are far different for gas-filled hohlraums. (c) 2000 American Institute of Physics

  15. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  16. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

    Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
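
    A minimal sketch of the kind of comparison this methodology prescribes is given below: several regression models are scored with repeated cross-validation and the apparent winner is only accepted if the difference is statistically significant. The synthetic dataset, the two models and the Wilcoxon signed-rank test are illustrative assumptions; the published work uses the RRegrs package in R with a broader set of models and tests.

    ```python
    # Hedged sketch: compare two regression models with repeated cross-validation
    # and test whether their score difference is statistically significant.
    import numpy as np
    from scipy.stats import wilcoxon
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LinearRegression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import RepeatedKFold, cross_val_score

    X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)
    cv = RepeatedKFold(n_splits=5, n_repeats=10, random_state=0)

    scores = {
        name: cross_val_score(model, X, y, cv=cv, scoring="neg_root_mean_squared_error")
        for name, model in [("linear", LinearRegression()),
                            ("forest", RandomForestRegressor(random_state=0))]
    }
    stat, p_value = wilcoxon(scores["linear"], scores["forest"])
    print({name: -s.mean() for name, s in scores.items()}, "p =", p_value)
    ```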

  17. A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)

    Science.gov (United States)

    Rhew, Ray D.; Parker, Peter A.

    2007-01-01

    Design of Experiments (DOE) was applied to the LAS geometric parameter study to efficiently identify and rank primary contributors to integrated drag over the vehicle's ascent trajectory in an order of magnitude fewer CFD configurations, thereby reducing computational resources and solution time. SMEs were able to gain a better understanding of the underlying flow physics of different geometric parameter configurations through the identification of interaction effects. An interaction effect, which describes how the effect of one factor changes with respect to the levels of other factors, is often the key to product optimization. A DOE approach emphasizes a sequential approach to learning through successive experimentation to continuously build on previous knowledge. These studies represent a starting point for expanded experimental activities that will eventually cover the entire design space of the vehicle and flight trajectory.
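
    The main-effect ranking such a DOE study produces can be sketched as follows. The factor names, the two-level full factorial design and the drag responses are hypothetical and only stand in for the actual LAS geometric parameters and CFD results.

    ```python
    # Sketch: rank factors by main effect on a response from a 2^3 factorial design.
    import itertools
    import numpy as np

    factors = ["nose_length", "tower_diameter", "fairing_angle"]          # assumed names
    runs = list(itertools.product([-1, 1], repeat=len(factors)))          # 2^3 design
    drag = np.array([1.02, 1.10, 0.98, 1.09, 1.00, 1.12, 0.97, 1.08])     # made-up responses

    main_effects = {}
    for j, name in enumerate(factors):
        levels = np.array([run[j] for run in runs])
        main_effects[name] = drag[levels == 1].mean() - drag[levels == -1].mean()

    # Largest absolute effect first, i.e. the primary contributors to drag
    for name, effect in sorted(main_effects.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name}: {effect:+.3f}")
    ```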

  18. Assessing Pre-Service Teachers' Computer Phobia Levels in Terms of Gender and Experience, Turkish Sample

    Science.gov (United States)

    Ursavas, Omer Faruk; Karal, Hasan

    2009-01-01

    In this study it is aimed to determine the level of pre-service teachers' computer phobia. Whether or not computer phobia meaningfully varies statistically according to gender and computer experience has been tested in the study. The study was performed on 430 pre-service teachers at the Education Faculty in Rize/Turkey. Data in the study were…

  19. Educational Computer Use in Leisure Contexts: A Phenomenological Study of Adolescents' Experiences at Internet Cafes

    Science.gov (United States)

    Cilesiz, Sebnem

    2009-01-01

    Computer use is a widespread leisure activity for adolescents. Leisure contexts, such as Internet cafes, constitute specific social environments for computer use and may hold significant educational potential. This article reports a phenomenological study of adolescents' experiences of educational computer use at Internet cafes in Turkey. The…

  20. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the real-time, reliability and safety requirements of aerospace experiments, a single-center cloud computing technology application verification platform is constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: a cloud computing platform can be applied to computing-intensive aerospace experiment workloads, whereas for I/O-intensive workloads the traditional physical machine is recommended.

  1. Comparing Computer Game and Traditional Lecture Using Experience Ratings from High and Low Achieving Students

    Science.gov (United States)

    Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David

    2012-01-01

    Computer games are purported to be effective instructional tools that enhance motivation and improve engagement. The aim of this study was to investigate how tertiary student experiences change when instruction was computer game based compared to lecture based, and whether experiences differed between high and low achieving students. Participants…

  2. Perspectives on distributed computing : thirty people, four user types, and the distributed computing user experience.

    Energy Technology Data Exchange (ETDEWEB)

    Childers, L.; Liming, L.; Foster, I.; Mathematics and Computer Science; Univ. of Chicago

    2008-10-15

    This report summarizes the methodology and results of a user perspectives study conducted by the Community Driven Improvement of Globus Software (CDIGS) project. The purpose of the study was to document the work-related goals and challenges facing today's scientific technology users, to record their perspectives on Globus software and the distributed-computing ecosystem, and to provide recommendations to the Globus community based on the observations. Globus is a set of open source software components intended to provide a framework for collaborative computational science activities. Rather than attempting to characterize all users or potential users of Globus software, our strategy has been to speak in detail with a small group of individuals in the scientific community whose work appears to be the kind that could benefit from Globus software, learn as much as possible about their work goals and the challenges they face, and describe what we found. The result is a set of statements about specific individuals' experiences. We do not claim that these are representative of a potential user community, but we do claim to have found commonalities and differences among the interviewees that may be reflected in the user community as a whole. We present these as a series of hypotheses that can be tested by subsequent studies, and we offer recommendations to Globus developers based on the assumption that these hypotheses are representative. Specifically, we conducted interviews with thirty technology users in the scientific community. We included both people who have used Globus software and those who have not. We made a point of including individuals who represent a variety of roles in scientific projects, for example, scientists, software developers, engineers, and infrastructure providers. The following material is included in this report: (1) A summary of the reported work-related goals, significant issues, and points of satisfaction with the use of Globus software

  3. Computations, Complexity, Experiments, and the World Outside Physics

    International Nuclear Information System (INIS)

    Kadanoff, L.P

    2009-01-01

    Computer Models in the Sciences and Social Sciences. 1. Simulation and Prediction in Complex Systems: the Good, the Bad and the Awful. This lecture deals with the history of large-scale computer modeling, mostly in the context of the U.S. Department of Energy's sponsorship of modeling for weapons development and innovation in energy sources. 2. Complexity: Making a Splash-Breaking a Neck - The Making of Complexity in Physical Systems. For ages thinkers have been asking how complexity arises. The laws of physics are very simple. How come we are so complex? This lecture tries to approach this question by asking how complexity arises in physical fluids. 3. Forrester et al.: Social and Biological Model-Making. The partial collapse of the world's economy has raised the question of whether we could improve the performance of economic and social systems by a major effort on creating understanding via large-scale computer models. (author)

  4. Investigation of the computer experiences and attitudes of pre-service mathematics teachers: new evidence from Turkey.

    Science.gov (United States)

    Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat

    2010-10-01

    This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally intermediate and upper level. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.

  5. Status of the Grid Computing for the ALICE Experiment in the Czech Republic

    International Nuclear Information System (INIS)

    Adamova, D; Hampl, J; Chudoba, J; Kouba, T; Svec, J; Mendez, Lorenzo P; Saiz, P

    2010-01-01

    The Czech Republic (CR) has been participating in the LHC Computing Grid project (LCG) since 2003 and, gradually, a middle-sized Tier-2 center has been built in Prague, delivering computing services for national HEP experiment groups including the ALICE project at the LHC. We present a brief overview of the computing activities and services being performed in the CR for the ALICE experiment.

  6. Computing Activities for the PANDA Experiment at FAIR

    NARCIS (Netherlands)

    Messchendorp, Johan; Gruntorad, J; Lokajicek, M

    2010-01-01

    The PANDA experiment at the future facility FAIR will provide valuable data for our present understanding of the strong interaction. In preparation for the experiments, large-scale simulations for design and feasibility studies are performed exploiting a new software framework, PandaROOT, which is

  7. Computer Simulation of Einstein-Podolsky-Rosen-Bohm Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.

    We review an event-based simulation approach which reproduces the statistical distributions of quantum physics experiments by generating detection events one-by-one according to an unknown distribution and without solving a wave equation. Einstein-Podolsky-Rosen-Bohm laboratory experiments are used

  8. Computer control and monitoring of neutral beam injectors on the 2XIIB CTR experiment at LLL

    International Nuclear Information System (INIS)

    Pollock, G.G.

    1975-01-01

    The original manual control system for the 12 neutral beam injectors on the 2XIIB Machine is being integrated with a computer control system. This, in turn, is a part of a multiple computer network composed of the three computers which are involved in the operation and instrumentation of the 2XIIB experiment. The computer control system simplifies neutral beam operation and centralizes it to a single operating position. A special purpose console utilizes computer generated graphics and interactive function entry buttons to optimize the human/machine interface. Through the facilities of the computer network, a high level control function will be implemented for the use of the experimenter in a remotely located experiment diagnostics area. In addition to controlling the injectors in normal operation, the computer system provides automatic conditioning of the injectors, bringing rebuilt units back to full energy output with minimum loss of useful life. The computer system also provides detailed archive data recording

  9. Assessing the impact of previous experience, and attitudes towards technology, on levels of engagement in a virtual reality based occupational therapy intervention for spinal cord injury rehabilitation

    LENUS (Irish Health Repository)

    McCaughey, Manus Dr.

    2007-01-01

    The aim of the current research project was to determine if there were significant differences between patients with higher or lower levels of experience with technology in terms of their level of engagement with virtual reality (VR) in occupational therapy, their future uptake of VR technology in therapy, and their attitudes towards technology. Patients' experience of technology was also examined in relation to demographic characteristics such as age and education level.

  10. Study of some physical aspects previous to design of an exponential experiment; Estudio de algunos aspectos fisicos previos al diseno de una experiencia exponencial

    Energy Technology Data Exchange (ETDEWEB)

    Caro, R; Francisco, J L. de

    1961-07-01

    This report presents the theoretical study of some physical aspects previous to the design of an exponential facility. These are: fast and slow flux distribution in the multiplicative medium and in the thermal column, slowing down in the thermal column, geometrical distribution and minimum required intensity of the sources, access channels, and perturbations produced by possible variations in their position and intensity. (Author) 4 refs.

  11. Use of Intracervical Foley Catheter for Induction of Labour in Cases of Previous Caesarean Section; Experience of a single tertiary centre in Oman

    Directory of Open Access Journals (Sweden)

    Hazel Gonsalves

    2016-11-01

    Full Text Available Objectives: This study aimed to evaluate rates of success and perinatal complications of labour induction using an intracervical Foley catheter among women with a previous Caesarean delivery at a tertiary centre in Oman. Methods: This retrospective cohort study included 68 pregnant women with a history of a previous Caesarean section who were admitted for induction via Foley catheter between January 2011 and December 2013 to the Sultan Qaboos University Hospital, Muscat, Oman. Patient data were collected from electronic and delivery ward records. Results: Most women were 25–35 years old (76.5%) and 20 women had had one previous vaginal delivery (29.4%). The most common indication for induction of labour was intrauterine growth restriction with oligohydramnios (27.9%). Most women delivered after 40 gestational weeks (48.5%) and there were no neonatal admissions or complications. The majority experienced no complications during the induction period (85.3%), although a few had vaginal bleeding (5.9%), intrapartum fever (4.4%), rupture of the membranes (2.9%) and cord prolapse shortly after insertion of the Foley catheter (1.5%). However, no cases of uterine rupture or scar dehiscence were noted. Overall, the success rate of vaginal birth after a previous Caesarean delivery was 69.1%, with the remaining patients undergoing an emergency Caesarean section (30.9%). Conclusion: The use of a Foley catheter in the induction of labour in women with a previous Caesarean delivery appears a safe option with a good success rate and few maternal and fetal complications.

  12. Analysis of Computer Experiments with Multiple Noise Sources

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae

    2010-01-01

    In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled...

  13. Power-Efficient Computing: Experiences from the COSA Project

    Directory of Open Access Journals (Sweden)

    Daniele Cesini

    2017-01-01

    Full Text Available Energy consumption is today one of the most relevant issues in operating HPC systems for scientific applications. The use of unconventional computing systems is therefore of great interest for several scientific communities looking for a better tradeoff between time-to-solution and energy-to-solution. In this context, the performance assessment of processors with a high ratio of performance per watt is necessary to understand how to realize energy-efficient computing systems for scientific applications, using this class of processors. Computing On SOC Architecture (COSA) is a three-year project (2015–2017) funded by the Scientific Commission V of the Italian Institute for Nuclear Physics (INFN), which aims to investigate the performance and the total cost of ownership offered by computing systems based on commodity low-power Systems on Chip (SoCs) and high energy-efficient systems based on GP-GPUs. In this work, we present the results of the project analyzing the performance of several scientific applications on several GPU- and SoC-based systems. We also describe the methodology we have used to measure energy performance and the tools we have implemented to monitor the power drained by applications while running.
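
    The two figures of merit discussed above, time-to-solution and energy-to-solution, can be derived from sampled power draw as in the following sketch. The power samples, sampling interval and work estimate are placeholders rather than COSA measurements.

    ```python
    # Sketch: time-to-solution, energy-to-solution and performance per watt from
    # periodically sampled power draw; all numbers are illustrative placeholders.
    import numpy as np

    power_watts = np.array([41.8, 43.2, 42.5, 44.0, 42.1])   # sampled power draw (W)
    sample_interval_s = 10.0                                 # time between samples (s)

    time_to_solution_s = power_watts.size * sample_interval_s
    energy_to_solution_j = float(power_watts.sum() * sample_interval_s)   # ~ integral of P dt

    work_units = 1.0e12                                      # hypothetical work done (e.g. flop)
    performance_per_watt = (work_units / time_to_solution_s) / power_watts.mean()

    print(time_to_solution_s, energy_to_solution_j, performance_per_watt)
    ```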

  14. Trainee Teachers' e-Learning Experiences of Computer Play

    Science.gov (United States)

    Wright, Pam

    2009-01-01

    Pam Wright highlights the role of technology in providing situated learning opportunities for preservice teachers to explore the role commercial computer games may have in primary education. In a study designed to assess the effectiveness of an online unit on gaming incorporated into a course on learning technologies, Wright found that thoughtful…

  15. COMPUTER-AIDED DATA ACQUISITION FOR COMBUSTION EXPERIMENTS

    Science.gov (United States)

    The article describes the use of computer-aided data acquisition techniques to aid the research program of the Combustion Research Branch (CRB) of the U.S. EPA's Air and Energy Engineering Research Laboratory (AEERL) in Research Triangle Park, NC, in particular on CRB's bench-sca...

  16. Music Teachers' Experiences in One-to-One Computing Environments

    Science.gov (United States)

    Dorfman, Jay

    2016-01-01

    Ubiquitous computing scenarios such as the one-to-one model, in which every student is issued a device that is to be used across all subjects, have increased in popularity and have shown both positive and negative influences on education. Music teachers in schools that adopt one-to-one models may be inadequately equipped to integrate this kind of…

  17. Manganese Catalyzed Regioselective C–H Alkylation: Experiment and Computation

    KAUST Repository

    Wang, Chengming

    2018-05-08

    A new efficient manganese-catalyzed selective C2-alkylation of indoles via carbenoid insertion has been achieved. The newly developed C-H functionalization protocol provides access to diverse products and shows good functional group tolerance. Mechanistic and computational studies support the formation of a Mn(CO)3 acetate complex as the catalytically active species.

  18. Manganese Catalyzed Regioselective C–H Alkylation: Experiment and Computation

    KAUST Repository

    Wang, Chengming; Maity, Bholanath; Cavallo, Luigi; Rueping, Magnus

    2018-01-01

    A new efficient manganese-catalyzed selective C2-alkylation of indoles via carbenoid insertion has been achieved. The newly developed C-H functionalization protocol provides access to diverse products and shows good functional group tolerance. Mechanistic and computational studies support the formation of a Mn(CO)3 acetate complex as the catalytically active species.

  19. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…

  20. First experience with a mobile computed tomograph in the USSR

    International Nuclear Information System (INIS)

    Portnoj, L.M.

    1989-01-01

    Experience with the use of a mobile computerized tomograph mounted in a bus is presented. Problems concerning staff, selection of medical base institutes, etc. are considered. The efficiency of mobile computerized tomographs in revealing different diseases is pointed out.

  1. Experience of public procurement of Open Compute servers

    Science.gov (United States)

    Bärring, Olof; Guerri, Marco; Bonfillou, Eric; Valsan, Liviu; Grigore, Alexandru; Dore, Vincent; Gentit, Alain; Clement, Benoît; Grossir, Anthony

    2015-12-01

    The Open Compute Project, OCP (http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal to develop servers and data centres following the model traditionally associated with open source software projects. In 2013 CERN acquired a few OCP servers in order to compare performance and power consumption with standard hardware. The conclusions were that there are sufficient savings to motivate an attempt to procure a large scale installation. One objective is to evaluate if the OCP market is sufficiently mature and broad enough to meet the constraints of a public procurement. This paper summarizes this procurement, which started in September 2014 and involved the Request for Information (RFI) to qualify bidders and the Request for Tender (RFT).

  2. Ioversol 350: clinical experience in cranial computed tomography

    International Nuclear Information System (INIS)

    Theron, J.; Paugam, J.P.; Courtheoux, P.

    1991-01-01

    A single, open trial was conducted in 40 patients to evaluate the diagnostic efficacy and safety, in cranial computed tomography, of ioversol (350 mg I/ml), a new nonionic, monomeric, low-osmolality contrast medium. Ioversol is characterized by a hydrophilicity which is not only the highest of all nonionic agents available to date, but also evenly distributed among the various sides of the benzene ring. Diagnosis was possible in 100% of cases with a mean degree of certainty of 90.8%. Six minor adverse reactions requiring no treatment were recorded, of which two were observed by the investigator and four reported by the patients. No pain sensation was found and heat sensations were of minor intensity. Ioversol 350, which showed good diagnostic efficacy and proved to be well tolerated, is therefore suitable for cranial computed tomography at a mean dose of 1 ml/kg

  3. Assessing computer skills in Tanzanian medical students: an elective experience

    Directory of Open Access Journals (Sweden)

    Melvin Rob

    2004-08-01

    Full Text Available Abstract Background One estimate suggests that by 2010 more than 30% of a physician's time will be spent using information technology tools. The aim of this study is to assess the information and communication technologies (ICT) skills of medical students in Tanzania. We also report a pilot intervention of peer mentoring training in ICT by medical students from the UK tutoring students in Tanzania. Methods Design: Cross sectional study and pilot intervention study. Participants: Fourth year medical students (n = 92) attending Muhimbili University College of Health Sciences, Dar es Salaam, Tanzania. Main outcome measures: Self-reported assessment of competence on ICT-related topics and ability to perform specific ICT tasks. Further information related to frequency of computer use (hours per week), years of computer use, reasons for use and access to computers. Skills at specific tasks were reassessed for 12 students following 4 to 6 hours of peer mentoring training. Results The highest levels of competence in generic ICT areas were for email, Internet and file management. For other skills such as word processing most respondents reported low levels of competence. The abilities to perform specific ICT skills were low – less than 60% of the participants were able to perform the core specific skills assessed. A period of approximately 5 hours of peer mentoring training produced an approximate doubling of competence scores for these skills. Conclusion Our study has found a low level of ability to use ICT facilities among medical students in a leading university in sub-Saharan Africa. A pilot scheme utilising UK elective students to tutor basic skills showed potential. Attention is required to develop interventions that can improve ICT skills, as well as computer access, in order to bridge the digital divide.

  4. D0 experiment: its trigger, data acquisition, and computers

    International Nuclear Information System (INIS)

    Cutts, D.; Zeller, R.; Schamberger, D.; Van Berg, R.

    1984-05-01

    The new collider facility to be built at Fermilab's Tevatron-I D0 region is described. The data acquisition requirements are discussed, as well as the hardware and software triggers designed to meet these needs. An array of MicroVAX computers running VAXELN will filter in parallel (a complete event in each microcomputer) and transmit accepted events via Ethernet to a host. This system, together with its subsequent offline needs, is briefly presented

  5. Simulation in computer forensics teaching: the student experience

    OpenAIRE

    Crellin, Jonathan; Adda, Mo; Duke-Williams, Emma; Chandler, Jane

    2011-01-01

    The use of simulation in teaching computing is well established, with digital forensic investigation being a subject area where the range of simulation required is both wide and varied demanding a corresponding breadth of fidelity. Each type of simulation can be complex and expensive to set up resulting in students having only limited opportunities to participate and learn from the simulation. For example students' participation in mock trials in the University mock courtroom or in simulation...

  6. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns the time-independent behavior, the numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to the problems in which the time-dependent behavior is significant, it is desirable to incorporate a procedure which is workable on the mechanical model formulation as well as the method of equation of state proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperature for a long period of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development by taking into account the various requisites stated above. (Auth.)

  7. Parallel Computational Fluid Dynamics 2007 : Implementations and Experiences on Large Scale and Grid Computing

    CERN Document Server

    2009-01-01

    At the 19th Annual Conference on Parallel Computational Fluid Dynamics held in Antalya, Turkey, in May 2007, the most recent developments and implementations of large-scale and grid computing were presented. This book, comprised of the invited and selected papers of this conference, details those advances, which are of particular interest to CFD and CFD-related communities. It also offers the results related to applications of various scientific and engineering problems involving flows and flow-related topics. Intended for CFD researchers and graduate students, this book is a state-of-the-art presentation of the relevant methodology and implementation techniques of large-scale computing.

  8. TRANSFORMING RURAL SECONDARY SCHOOLS IN ZIMBABWE THROUGH TECHNOLOGY: LIVED EXPERIENCES OF STUDENT COMPUTER USERS

    Directory of Open Access Journals (Sweden)

    Gomba Clifford

    2016-04-01

    Full Text Available A technological divide exists in Zimbabwe between urban and rural schools that puts rural-based students at a disadvantage. In Zimbabwe, the government, through the president, donated computers to most rural schools in a bid to bridge the digital divide between rural and urban schools. The purpose of this phenomenological study was to understand the experiences of Advanced Level students using computers at two rural boarding Catholic High Schools in Zimbabwe. The study was guided by two research questions: (1) How do Advanced Level students in the rural areas use computers at their school? and (2) What is the experience of using computers for Advanced Level students in the rural areas of Zimbabwe? By performing this study, it was possible to understand from the students' experiences whether computer usage was for educational learning or not. The results of the phenomenological study showed that students' experiences can be broadly classified into five themes, namely worthwhile (interesting) experience, accessibility issues, teachers' monopoly, research and social use, and Internet availability. The participants proposed that teachers use computers, but not monopolize computer usage. The computer shortage may be solved by having donors and the government help in the acquisition of more computers.

  9. File management for experiment control parameters within a distributed function computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  10. Computation for LHC experiments: a worldwide computing grid; Le calcul scientifique des experiences LHC: une grille de production mondiale

    Energy Technology Data Exchange (ETDEWEB)

    Fairouz, Malek [Universite Joseph-Fourier, LPSC, CNRS-IN2P3, Grenoble I, 38 (France)

    2010-08-15

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. The processing of all the consequent experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 bytes per second and a recording capacity of a few tens of 10^15 bytes each year. In order to meet this challenge, a computing network involving the dispatching and sharing of tasks has been set up: the W-LCG grid (Worldwide LHC Computing Grid), made up of 4 tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching it to the 11 Tier 1 centres. A Tier 1 is typically a national centre; it is responsible for making a copy of the raw data and for processing it in order to recover relevant data with a physical meaning and to transfer the results to the 150 Tier 2 centres. A Tier 2 is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of simulations. Tier 3 centres are at the level of the laboratories; they provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)

  11. Computer-assisted experiments with a laser diode

    Energy Technology Data Exchange (ETDEWEB)

    Kraftmakher, Yaakov, E-mail: krafty@mail.biu.ac.il [Department of Physics, Bar-Ilan University, Ramat-Gan 52900 (Israel)

    2011-05-15

    A laser diode from an inexpensive laser pen (laser pointer) is used in simple experiments. The radiant output power and efficiency of the laser are measured, and polarization of the light beam is shown. The h/e ratio is available from the threshold of spontaneous emission. The lasing threshold is found using several methods. With a data-acquisition system, the measurements are possible in a short time. The frequency response of the laser diode is determined in the range 10-10^7 Hz. The experiments are suitable for undergraduate laboratories and for classroom demonstrations on semiconductors.

  12. Computer-assisted experiments with a laser diode

    International Nuclear Information System (INIS)

    Kraftmakher, Yaakov

    2011-01-01

    A laser diode from an inexpensive laser pen (laser pointer) is used in simple experiments. The radiant output power and efficiency of the laser are measured, and polarization of the light beam is shown. The h/e ratio is available from the threshold of spontaneous emission. The lasing threshold is found using several methods. With a data-acquisition system, the measurements are possible in a short time. The frequency response of the laser diode is determined in the range 10-10^7 Hz. The experiments are suitable for undergraduate laboratories and for classroom demonstrations on semiconductors. (orig.)
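
    The h/e estimate mentioned in these experiments rests on equating the electrical energy per electron at the onset of emission with the photon energy, e*V_th ~ h*c/lambda, so h/e ~ V_th*lambda/c. The sketch below evaluates this with assumed example values for a red laser pointer; the actual threshold voltage and wavelength would of course have to be measured.

    ```python
    # Sketch of the h/e estimate from the onset of spontaneous emission; the
    # threshold voltage and wavelength below are assumed example values.
    c = 2.998e8          # speed of light, m/s
    wavelength = 650e-9  # emission wavelength, m (assumed)
    v_threshold = 1.8    # voltage at onset of spontaneous emission, V (assumed)

    h_over_e = v_threshold * wavelength / c
    print(f"h/e ~ {h_over_e:.2e} J*s/C (accepted value ~4.14e-15)")
    ```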

  13. COMPUTER EXPERIMENTS WITH FINITE ELEMENTS OF HIGHER ORDER

    Directory of Open Access Journals (Sweden)

    Khomchenko A.

    2017-12-01

    Full Text Available The paper deals with the problem of constructing the basis functions of a quadrilateral finite element of the fifth order by means of the computer algebra system Maple. The Lagrangian approximation of such a finite element contains 36 nodes: 20 nodes on the perimeter and 16 internal nodes. Alternative models with a reduced number of internal nodes are considered. Graphs of the basis functions and cognitive portraits of their zero-level lines are presented. The work is aimed at studying the possibilities of using modern information technologies in the teaching of individual mathematical disciplines.
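
    The 36-node Lagrangian element mentioned above arises as a tensor product of one-dimensional fifth-order Lagrange polynomials (6 x 6 nodes). The following sketch mirrors that construction in Python/SymPy rather than Maple; the equidistant node placement on [-1, 1] is an assumption for illustration.

    ```python
    # Illustrative construction of a 36-node (fifth-order) Lagrangian quadrilateral
    # basis as a tensor product of 1D Lagrange polynomials on [-1, 1].
    import sympy as sp

    xi, eta = sp.symbols("xi eta")
    nodes = [sp.Rational(2 * i, 5) - 1 for i in range(6)]   # 6 equidistant nodes on [-1, 1]

    def lagrange_1d(k, x):
        """1D Lagrange polynomial equal to 1 at nodes[k] and 0 at the other nodes."""
        terms = [(x - nodes[m]) / (nodes[k] - nodes[m]) for m in range(6) if m != k]
        return sp.expand(sp.prod(terms))

    def basis_2d(i, j):
        """Basis function attached to the 2D node (i, j); 6 x 6 = 36 functions in total."""
        return sp.expand(lagrange_1d(i, xi) * lagrange_1d(j, eta))

    print(basis_2d(0, 0))   # corner basis function, degree 5 in each variable
    ```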

  14. Computer-Assisted Experiments with a Laser Diode

    Science.gov (United States)

    Kraftmakher, Yaakov

    2011-01-01

    A laser diode from an inexpensive laser pen (laser pointer) is used in simple experiments. The radiant output power and efficiency of the laser are measured, and polarization of the light beam is shown. The "h/e" ratio is available from the threshold of spontaneous emission. The lasing threshold is found using several methods. With a…

  15. Experience with computed transmission tomography of the heart in vivo

    International Nuclear Information System (INIS)

    Carlsson, E.; Lipton, M.J.; Skioeldebrand, C.G.; Berninger, W.H.; Redington, R.W.

    1980-01-01

    Cardiac computed tomography in its present form provides useful information about the heart for clinical use in patients with heart disease and for investigative work in such patients and living animals. Its great reconstructive power and unmatched density resolution are particularly advantageous in the study of ischemic heart disease. Because of its non-invasive character cardiac computed tomography has the potential of becoming an effective screening tool for large numbers of patients with suspected or known coronary heart disease. Other cardiac conditions such as valve disease and congenital lesions can also be examined with high diagnostic yield. However, presently available scanners suffer from low repetition rate, long scan times and the fact that only one transverse cardiac level at a time can be obtained. The development which must be accomplished in order to eliminate these weaknesses is technically feasible. The availability of a dynamic cardiac scanner would greatly benefit the treatment of patients with heart disease and facilitate the inquiry into the pathophysiology of such diseases. (orig.)

  16. Integration of genetic algorithm, computer simulation and design of experiments for forecasting electrical energy consumption

    International Nuclear Information System (INIS)

    Azadeh, A.; Tarverdian, S.

    2007-01-01

    This study presents an integrated algorithm for forecasting monthly electrical energy consumption based on genetic algorithm (GA), computer simulation and design of experiments using stochastic procedures. First, a time-series model is developed as a benchmark for GA and simulation. Computer simulation is developed to generate random variables for monthly electricity consumption. This is achieved to foresee the effects of probabilistic distribution on monthly electricity consumption. The GA and simulated-based GA models are then developed using the selected time-series model. Therefore, there are four treatments to be considered in analysis of variance (ANOVA) which are actual data, time series, GA and simulated-based GA. Furthermore, ANOVA is used to test the null hypothesis of the above four alternatives being equal. If the null hypothesis is accepted, then the lowest mean absolute percentage error (MAPE) value is used to select the best model, otherwise the Duncan Multiple Range Test (DMRT) method of paired comparison is used to select the optimum model, which could be time series, GA or simulated-based GA. In case of ties the lowest MAPE value is considered as the benchmark. The integrated algorithm has several unique features. First, it is flexible and identifies the best model based on the results of ANOVA and MAPE, whereas previous studies consider the best-fit GA model based on MAPE or relative error results. Second, the proposed algorithm may identify conventional time series as the best model for future electricity consumption forecasting because of its dynamic structure, whereas previous studies assume that GA always provides the best solutions and estimation. To show the applicability and superiority of the proposed algorithm, the monthly electricity consumption in Iran from March 1994 to February 2005 (131 months) is used and applied to the proposed algorithm
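
    The MAPE-based selection step of the algorithm can be sketched as follows; the ANOVA/DMRT stage that precedes it in the full procedure is omitted here, and the consumption figures and forecasts are placeholders rather than the Iranian data.

    ```python
    # Sketch: score candidate forecasting models by MAPE and pick the lowest.
    import numpy as np

    actual = np.array([152.0, 148.5, 160.2, 158.7, 165.3])   # monthly consumption (assumed units)
    forecasts = {
        "time_series":   np.array([150.1, 149.9, 158.0, 161.0, 163.8]),
        "ga":            np.array([151.5, 147.0, 161.1, 157.9, 166.2]),
        "simulated_ga":  np.array([152.4, 148.1, 160.6, 158.2, 165.0]),
    }

    def mape(y_true, y_pred):
        """Mean absolute percentage error in percent."""
        return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

    scores = {name: mape(actual, f) for name, f in forecasts.items()}
    best = min(scores, key=scores.get)
    print(scores, "->", best)
    ```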

  17. Computer simulation of FT-NMR multiple pulse experiment

    Science.gov (United States)

    Allouche, A.; Pouzard, G.

    1989-04-01

    Using the product operator formalism in its real form, SIMULDENS expands the density matrix of a scalar coupled nuclear spin system and simulates analytically a large variety of FT-NMR multiple pulse experiments. The observable transverse magnetizations are stored and can be combined to represent signal accumulation. The programming language is VAX PASCAL, but a Macintosh Turbo Pascal version is also available.

  18. Computer simulation of FT-NMR multiple pulse experiment

    International Nuclear Information System (INIS)

    Allouche, A.; Pouzard, G.

    1989-01-01

    Using the product operator formalism in its real form, SIMULDENS expands the density matrix of a scalar coupled nuclear spin system and simulates analytically a large variety of FT-NMR multiple pulse experiments. The observable transverse magnetizations are stored and can be combined to represent signal accumulation. The programming language is VAX PASCAL, but a Macintosh Turbo Pascal version is also available. (orig.)
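
    A single, uncoupled spin gives the flavour of the calculations SIMULDENS performs, although the package itself works with product operators for scalar-coupled spin systems. In the sketch below a 90-degree pulse is followed by free precession and the transverse magnetizations are read out from the density matrix; the offset frequency and time grid are arbitrary choices.

    ```python
    # Minimal single-spin density-matrix illustration of a pulse-acquire experiment.
    import numpy as np

    # Spin-1/2 operators (units of hbar = 1)
    Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
    Iy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
    Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

    def propagate(rho, H, t):
        """Evolve the density matrix rho under Hamiltonian H for time t."""
        vals, vecs = np.linalg.eigh(H)
        U = vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T
        return U @ rho @ U.conj().T

    rho = Iz.copy()                                    # thermal equilibrium ~ Iz
    rho = propagate(rho, 2 * np.pi * 0.25 * Ix, 1.0)   # 90-degree pulse about x

    offset_hz = 100.0                                  # chemical-shift offset (assumed)
    for t in np.linspace(0.0, 0.01, 5):
        rho_t = propagate(rho, 2 * np.pi * offset_hz * Iz, t)   # free precession
        mx = np.real(np.trace(rho_t @ Ix))
        my = np.real(np.trace(rho_t @ Iy))
        print(f"t={t:.4f} s  Mx={mx:+.3f}  My={my:+.3f}")
    ```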

  19. Operational experience with the Sizewell B integrated plant computer system

    International Nuclear Information System (INIS)

    Ladner, J.E.J.; Alexander, N.C.; Fitzpatrick, J.A.

    1997-01-01

    The Westinghouse Integrated System for Centralised Operation (WISCO) is the primary plant control system at the Sizewell B Power Station. It comprises three subsystems: the High Integrity Control System (HICS), the Process Control System (PCS) and the Distributed Computer System (DCS). The HICS performs the control and data acquisition of nuclear safety significant plant systems. The PCS uses redundant data processing unit pairs. The workstations and servers of the DCS communicate with each other over a standard ethernet. The maintenance requirements for every plant system are covered by a Maintenance Strategy Report. The breakdown of these reports is listed. The WISCO system has performed exceptionally well. Due to the diagnostic information presented by the HICS, problems could normally be resolved within 24 hours. There have been some 200 outstanding modifications to the system. The procedure of modification is briefly described. (A.K.)

  20. A model ecosystem experiment and its computational simulation studies

    International Nuclear Information System (INIS)

    Doi, M.

    2002-01-01

    A simplified microbial model ecosystem and its computer simulation model are introduced as an eco-toxicity test for the assessment of environmental responses to the effects of environmental impacts. To take the effects on the interactions between species and environment into account, one option is to select the keystone species on the basis of ecological knowledge and to put it in the single-species toxicity test. Another option proposed is to carry out the eco-toxicity tests as an experimental micro-ecosystem study and a theoretical model ecosystem analysis. With these tests, the stressors which are more harmful to the ecosystems should be replaced with less harmful ones on the basis of unified measures. Management of radioactive materials, chemicals, hyper-eutrophication, and other artificial disturbances of ecosystems should be discussed consistently from the unified viewpoint of environmental protection. (N.C.)

  1. Computer experiments with a coarse-grid hydrodynamic climate model

    International Nuclear Information System (INIS)

    Stenchikov, G.L.

    1990-01-01

    A climate model is developed on the basis of the two-level Mintz-Arakawa general circulation model of the atmosphere and a bulk model of the upper layer of the ocean. A detailed model of the spectral transport of shortwave and longwave radiation is used to investigate the radiative effects of greenhouse gases. The radiative fluxes are calculated at the boundaries of five layers, each with a pressure thickness of about 200 mb. The results of the climate sensitivity calculations for mean-annual and perpetual seasonal regimes are discussed. The CCAS (Computer Center of the Academy of Sciences) climate model is used to investigate the climatic effects of anthropogenic changes of the optical properties of the atmosphere due to increasing CO2 content and aerosol pollution, and to calculate the sensitivity to changes of land surface albedo and humidity

  2. Design concepts and experience in the application of distributed computing to the control of large CEGB power plant

    International Nuclear Information System (INIS)

    Wallace, J.N.

    1980-01-01

    With the ever increasing price of fossil fuels it became obvious during the 1970s that Pembroke Power Station (4 x 500MW oil fired) and Didcot Power Station (4 x 500MW coal fired) were going to operate flexibly with many units two-shifting frequently. The region was also expecting to refurbish nuclear plant in the 1980s. Based on previous experience with mini-computers, the region initiated a research/development programme aimed at refitting Pembroke and Didcot using distributed computer techniques that were also broadly applicable to nuclear plant. Major schemes have now been implemented at Pembroke and Didcot for plant condition monitoring, control and display. All computers on two units at each station are now functional with a third unit currently being set to work. This paper aims to outline the generic technical aspects of these schemes, describe the implementation strategy adopted and develop some thoughts on nuclear power plant applications. (auth)

  3. Is Self-Employment Really a Bad Experience? The Effects of Previous Self-Employment on Subsequent Wage-Employment Wages

    DEFF Research Database (Denmark)

    Kaiser, Ulrich; Malchow-Møller, Nikolaj

    2011-01-01

    of self-employment is associated with lower hourly wages compared to workers who were consecutively wage-employed. We also show, however, that this effect disappears—and even becomes positive in some settings—for formerly self-employed who find dependent employment in the same sector as their self-employment sector. Hence, the on average negative effect of self-employment is rather caused by sector switching than by the self-employment experience per se. Moreover, formerly self-employed who either enjoyed a high income or hired at least one worker during their self-employment spell receive wages in subsequent dependent employment that are at least as high as for individuals who have been consecutively wage-employed.

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and for addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  5. Computing strategy of Alpha-Magnetic Spectrometer experiment

    International Nuclear Information System (INIS)

    Choutko, V.; Klimentov, A.

    2003-01-01

    Alpha-Magnetic Spectrometer (AMS) is an experiment to search in space for dark matter, missing matter, and antimatter, scheduled to be flown on the International Space Station in the fall of 2005 for at least 3 consecutive years. This paper gives an overview of the AMS software with emphasis on the distributed production system based on a client/server approach. We also describe our choice of hardware components to build a processing farm with TByte RAID arrays of IDE disks and highlight the strategies that make our system different from many other experimental systems

  6. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights: • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010). • We assimilated data from sodium flow experiments. • We used computational fluid dynamics simulations of sodium experiments. • The predictive modeling method greatly reduced uncertainties in predicted results. - Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal–hydraulics experiments in order to reduce systematically the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD results for the best-estimate model parameters, describing sodium-flow velocities and temperature distributions, are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties.
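
    The uncertainty reduction described above rests on the general idea of combining computed and measured values weighted by their respective uncertainties. The sketch below is not the Cacuci and Ionescu-Bujor procedure itself, only a minimal inverse-variance data-assimilation example with invented numbers, showing why the predicted (posterior) uncertainty comes out smaller than both the computed and the experimental one.

        import numpy as np

        # Hypothetical computed (prior) and measured values of a sodium outlet
        # temperature, each with a one-sigma uncertainty (numbers are invented).
        t_computed, sigma_computed = 820.0, 15.0   # K
        t_measured, sigma_measured = 805.0, 10.0   # K

        # Inverse-variance weighting combines both sources into a best estimate.
        w_c = 1.0 / sigma_computed**2
        w_m = 1.0 / sigma_measured**2
        t_best = (w_c * t_computed + w_m * t_measured) / (w_c + w_m)
        sigma_best = np.sqrt(1.0 / (w_c + w_m))

        print(f"best estimate = {t_best:.1f} K, sigma = {sigma_best:.1f} K")
        # sigma_best (about 8.3 K) is smaller than either input uncertainty,
        # mirroring the reduction reported for the full predictive modeling method.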

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  8. Heterogeneous computation tests of both substitution and reactivity worth experiments in the RB-3 reactor

    International Nuclear Information System (INIS)

    Broccoli, U.; Cambi, G.; Vanossi, A.; Zapellini, G.

    1977-01-01

    This report presents the results of several experiments carried out in the D2O-moderated RB-3 reactor at the CNEN's Laboratory of Montecuccolino, Bologna. The experiments referred to are either fuel-element substitution experiments or interstitial absorber experiments and were performed during the period 1972-1974. The results of the measurements are compared with those obtained by means of a computational procedure based on some "cell" codes coupled with heterogeneous codes. (authors)

  9. [Brain-Computer Interface: the First Clinical Experience in Russia].

    Science.gov (United States)

    Mokienko, O A; Lyukmanov, R Kh; Chernikova, L A; Suponeva, N A; Piradov, M A; Frolov, A A

    2016-01-01

    Motor imagery is suggested to stimulate the same plastic mechanisms in the brain as a real movement. The brain-computer interface (BCI) controls motor imagery by converting the EEG recorded during this process into commands for an external device. This article presents the results of a two-stage study of the clinical use of non-invasive BCI in the rehabilitation of patients with severe hemiparesis caused by focal brain damage. It was found that the ability to control the BCI did not depend on the duration of the disease, brain lesion localization or the degree of neurological deficit. The first step of the study involved 36 patients; it showed that the efficacy of rehabilitation was higher in the group with the use of BCI (the score on the Action Research Arm Test (ARAT) improved from 1 [0; 2] to 5 [0; 16] points, p = 0.012; no significant improvement was observed in the control group). The second step of the study involved 19 patients; the complex BCI-exoskeleton (i.e. with kinesthetic feedback) was used for motor imagery training. The improvement of the motor function of the hands was proved by the ARAT (the score improved from 2 [0; 37] to 4 [1; 45.5] points, p = 0.005) and the Fugl-Meyer scale (from 72 [63; 110] to 79 [68; 115] points, p = 0.005).

  10. A review of experiments and computer analyses on RIAs

    International Nuclear Information System (INIS)

    Jernkvist, L.O.; Massih, A.R.; In de Betou, J.

    2010-01-01

    Reactivity initiated accidents (RIAs) are nuclear reactor accidents that involve an unwanted increase in fission rate and reactor power. Reactivity initiated accidents in power reactors may occur as a result of reactor control system failures, control element ejections or events caused by rapid changes in temperature or pressure of the coolant/moderator. Our current understanding of reactivity initiated accidents and their consequences is based largely on three sources of information: 1) best-estimate computer analyses of the reactor response to postulated accident scenarios, 2) pulse-irradiation tests on instrumented fuel rodlets, carried out in research reactors, 3) out-of-pile separate effect tests, targeted to explore key phenomena under RIA conditions. In recent years, we have reviewed, compiled and analysed these three categories of data. The result is a state-of-the-art report on fuel behaviour under RIA conditions, which is currently being published by the OECD Nuclear Energy Agency. The purpose of this paper is to give a brief summary of this report.

  11. Experiences of Using Automated Assessment in Computer Science Courses

    Directory of Open Access Journals (Sweden)

    John English

    2015-10-01

    Full Text Available In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students type in their own answers). Students were allowed to correct errors based on feedback provided by the system and resubmit their answers. A total of 141 students were surveyed to assess their opinions of this approach, and we analysed their responses. Analysis of the questionnaire showed a low correlation between questions, indicating the statistical independence of the individual questions. As a whole, student feedback on using Checkpoint was very positive, emphasizing the benefits of multiple attempts, impartial marking, and a quick turnaround time for submissions. Many students said that Checkpoint gave them confidence in learning and motivation to practise. Students also said that the detailed feedback that Checkpoint generated when their programs failed helped them understand their mistakes and how to correct them.
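
    The workflow described above (free-text answers, immediate feedback, resubmission) can be illustrated with a very small grading loop. This is only a generic sketch, not the Checkpoint framework or its API; the question, checker and feedback messages are invented for illustration.

        # Minimal illustration of an automated-assessment loop with feedback and
        # resubmission; not the Checkpoint framework's real interface.

        def check_answer(text: str) -> tuple[bool, str]:
            """Grade a free-text answer to: 'What is the time complexity of binary search?'"""
            normalized = text.strip().lower().replace(" ", "")
            if normalized in {"o(logn)", "o(log(n))", "logarithmic"}:
                return True, "Correct."
            if "log" not in normalized:
                return False, "Hint: each comparison halves the search interval."
            return False, "Not recognised - express the answer in big-O notation."

        if __name__ == "__main__":
            attempts = ["O(n)", "O(log n)"]   # a student's successive submissions
            for i, answer in enumerate(attempts, 1):
                ok, feedback = check_answer(answer)
                print(f"attempt {i}: {answer!r} -> {feedback}")
                if ok:
                    break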

  12. Multislice computed tomographic coronary angiography: experience in a UK centre

    International Nuclear Information System (INIS)

    Morgan-Hughes, G.J.; Marshall, A.J.; Roobottom, C.A.

    2003-01-01

    AIM: To evaluate the technique of coronary angiography with retrospectively electrocardiogram (ECG)-gated four-slice helical computed tomography (CT). MATERIALS AND METHODS: Within 1 month of undergoing routine day-case diagnostic coronary angiography, 30 consecutive patients also underwent retrospectively ECG-gated multislice CT coronary angiography. This enabled direct comparison of seven segments of proximal and mid-coronary artery for each patient by two blinded assessors. Each segment of coronary artery from the multislice CT image was evaluated initially for 'assessability' and those segments deemed assessable were subsequently investigated for the presence or absence of a significantly (≥70%) stenotic lesion. RESULTS: Overall 68% of proximal and mid-coronary artery segments were assessable. The sensitivity and specificity of four-slice CT coronary angiography in assessable segments for detecting the presence or absence of significant (≥70%) stenoses were 72 and 86%, respectively. These results correspond to a positive predictive value of 53% and a 93% negative predictive value. If the 32% of non-assessable segments are added into the calculation then the sensitivity and specificity fall to 49 and 66%, respectively. CONCLUSION: Although multislice CT coronary angiography is a promising technique, the overall assessability and diagnostic accuracy of four-slice CT acquisition is not sufficient to justify routine clinical use. Further evaluation should investigate the benefit of the improved temporal and spatial resolution offered by 16- and 32-slice acquisition.
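
    The reported figures follow from the standard two-by-two definitions of diagnostic accuracy. The sketch below uses hypothetical segment counts, chosen only to reproduce the percentages quoted above (the abstract does not give the raw counts), to show how sensitivity, specificity and the predictive values are computed.

        # Diagnostic accuracy from a 2x2 table of assessable segments.
        # Counts are hypothetical, chosen to reproduce the reported percentages.
        tp, fp = 18, 16   # CT positive: truly stenosed / not stenosed
        fn, tn = 7, 98    # CT negative: truly stenosed / not stenosed

        sensitivity = tp / (tp + fn)   # fraction of stenosed segments detected
        specificity = tn / (tn + fp)   # fraction of normal segments cleared
        ppv = tp / (tp + fp)           # probability a CT-positive segment is stenosed
        npv = tn / (tn + fn)           # probability a CT-negative segment is normal

        print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
              f"PPV {ppv:.0%}, NPV {npv:.0%}")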

  13. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  14. Using Educational Computer Games in the Classroom: Science Teachers' Experiences, Attitudes, Perceptions, Concerns, and Support Needs

    Science.gov (United States)

    An, Yun-Jo; Haynes, Linda; D'Alba, Adriana; Chumney, Frances

    2016-01-01

    Science teachers' experiences, attitudes, perceptions, concerns, and support needs related to the use of educational computer games were investigated in this study. Data were collected from an online survey, which was completed by 111 science teachers. The results showed that 73% of participants had used computer games in teaching. Participants…

  15. Computer based workstation for development of software for high energy physics experiments

    International Nuclear Information System (INIS)

    Ivanchenko, I.M.; Sedykh, Yu.V.

    1987-01-01

    Methodical principles and results of a successful attempt to create, on the basis of an IBM PC/AT personal computer, effective means for the development of programs for high energy physics experiments are analysed. The results obtained make it possible to combine the best properties and the positive experience accumulated with existing time-sharing collective systems with the high quality of data representation, reliability and convenience of personal computer applications.

  16. Coupling between eddy currents and rigid body rotation: analysis, computation, and experiments

    International Nuclear Information System (INIS)

    Hua, T.Q.; Turner, L.R.

    1985-01-01

    Computation and experiment show that the coupling between eddy currents and the angular deflections resulting from those eddy currents can reduce electromagnetic effects such as forces, torques, and power dissipation to levels far less severe than would be predicted without regard for the coupling. This paper explores the coupling effects beyond the parameter range that has been explored experimentally, using analytical means and the eddy-current computer code EDDYNET. The paper also describes upcoming FELIX experiments with cantilevered beams

  17. Computer-assisted training experiment used in the field of thermal energy production (EDF)

    International Nuclear Information System (INIS)

    Felgines, R.

    1982-01-01

    In 1981, the EDF carried out an experiment with computer-assisted training (EAO). This new approach, which continued until June 1982, involved about 700 employees, all of whom operated nuclear power stations. The different stages of this experiment and the lessons which can be drawn from it are given; the lessons were positive and make it possible to envisage complete coverage of all nuclear power stations by computer-assisted training within a very short space of time. [fr]

  18. Explaining Research Utilization Among 4-H Faculty, Staff, and Volunteers: The Role of Self-Efficacy, Learning Goal Orientation, Training, and Previous Experience

    Directory of Open Access Journals (Sweden)

    Julianne Tillman

    2014-06-01

    Full Text Available An investigation of factors that facilitate the utilization of research evidence among faculty, staff, and volunteers in the 4-H Youth Development Program is presented in this paper. Participants (N = 368; 86 4-H faculty, 153 staff, and 129 volunteers) represented 35 states; structural equation modeling was utilized in the analyses. Results of the path analysis explained 56% of variance in research utilization and 28% in research utilization self-efficacy. Among the factors impacting research utilization, self-efficacy played the most important role. In turn, self-efficacy for research utilization was positively influenced by participants’ learning goal orientation, frequency of 4-H training during the last 12 months, education in research-related areas, and investigative career interests. In addition, 4-H staff who were exposed to research at higher levels reported higher research utilization self-efficacy. The findings reinforce the importance of fostering research utilization self-efficacy among 4-H faculty, staff, and volunteers. Among the suggestions presented are regular 4-H training opportunities and on-going exposure to program evaluation and program improvement experiences.

  19. Experiences using SciPy for computer vision research

    Energy Technology Data Exchange (ETDEWEB)

    Eads, Damian R [Los Alamos National Laboratory; Rosten, Edward J [Los Alamos National Laboratory

    2008-01-01

    SciPy is an effective tool suite for prototyping new algorithms. We share some of our experiences using it for the first time to support our research in object detection. SciPy makes it easy to integrate C code, which is essential when algorithms operating on large data sets cannot be vectorized. The universality of Python, the language in which SciPy was written, gives the researcher access to a broader set of non-numerical libraries to support GUI development, interface with databases, manipulate graph structures, render 3D graphics, unpack binary files, etc. Python's extensive support for operator overloading makes SciPy's syntax as succinct as its competitors, MATLAB, Octave, and R. More profoundly, we found it easy to rework research code written with SciPy into a production application, deployable on numerous platforms.
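
    As an indication of the succinct, vectorized style the authors describe, a "bright blob" detector can be prototyped in a few lines of SciPy. This is an illustrative toy on synthetic data, not the object-detection research code discussed in the abstract.

        # A toy object-detection prototype in SciPy/NumPy; illustrative only,
        # not the detection research code described in the abstract.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        image = rng.normal(0.0, 1.0, (256, 256))        # noisy background
        image[100:110, 60:70] += 6.0                     # one synthetic bright object

        smoothed = ndimage.gaussian_filter(image, sigma=2.0)    # suppress pixel noise
        mask = smoothed > smoothed.mean() + 3 * smoothed.std()  # simple threshold
        labels, n_objects = ndimage.label(mask)                 # connected components
        centers = ndimage.center_of_mass(smoothed, labels, range(1, n_objects + 1))

        print(f"{n_objects} object(s) detected at {centers}")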

  20. Monitoring of computing resource utilization of the ATLAS experiment

    International Nuclear Information System (INIS)

    Rousseau, David; Vukotic, Ilija; Schaffer, RD; Dimitrov, Gancho; Aidel, Osman; Albrand, Solveig

    2012-01-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.

  1. The TESS [Tandem Experiment Simulation Studies] computer code user's manual

    International Nuclear Information System (INIS)

    Procassini, R.J.

    1990-01-01

    TESS (Tandem Experiment Simulation Studies) is a one-dimensional, bounded particle-in-cell (PIC) simulation code designed to investigate the confinement and transport of plasma in a magnetic mirror device, including tandem mirror configurations. Mirror plasmas may be modeled in a system which includes an applied magnetic field and/or a self-consistent or applied electrostatic potential. The PIC code TESS is similar to the PIC code DIPSI (Direct Implicit Plasma Surface Interactions) which is designed to study plasma transport to and interaction with a solid surface. The codes TESS and DIPSI are direct descendants of the PIC code ES1 that was created by A. B. Langdon. This document provides the user with a brief description of the methods used in the code and a tutorial on the use of the code. 10 refs., 2 tabs
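
    To make the particle-in-cell idea concrete, the fragment below advances electrons on a one-dimensional periodic grid, depositing charge, solving Poisson's equation with an FFT and pushing the particles each step. It is a generic textbook-style sketch in normalized units, not code taken from TESS, DIPSI or ES1.

        # Minimal 1D electrostatic particle-in-cell loop (periodic, normalized
        # units); a generic illustration of the PIC method, not the TESS code.
        import numpy as np

        ng, npart, length, dt = 64, 10000, 2 * np.pi, 0.1
        dx = length / ng
        rng = np.random.default_rng(1)
        x = rng.uniform(0, length, npart)          # particle positions
        v = rng.normal(0, 0.5, npart)              # particle velocities
        qm, weight = -1.0, length / npart          # charge-to-mass ratio, particle weight

        k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)  # wavenumbers for the field solve

        for step in range(100):
            # 1) deposit charge density: uniform ion background minus electrons
            idx = np.floor(x / dx).astype(int) % ng
            rho = 1.0 - np.bincount(idx, minlength=ng) * weight / dx
            # 2) solve d2(phi)/dx2 = -rho spectrally, then E = -d(phi)/dx
            rho_k = np.fft.rfft(rho)
            phi_k = np.zeros_like(rho_k)
            phi_k[1:] = rho_k[1:] / k[1:]**2
            e_grid = -np.gradient(np.fft.irfft(phi_k, ng), dx)
            # 3) gather the field at the particles and push them (leapfrog-style)
            v += qm * e_grid[idx] * dt
            x = (x + v * dt) % length

        print("mean kinetic energy:", 0.5 * np.mean(v**2))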

  2. A cerebellar neuroprosthetic system: computational architecture and in vivo experiments

    Directory of Open Access Journals (Sweden)

    Ivan eHerreros Alonso

    2014-05-01

    Full Text Available Emulating the input-output functions performed by a brain structure opens the possibility for developing neuro-prosthetic systems that replace damaged neuronal circuits. Here, we demonstrate the feasibility of this approach by replacing the cerebellar circuit responsible for the acquisition and extinction of motor memories. Specifically, we show that a rat can undergo acquisition, retention and extinction of the eye-blink reflex even though the biological circuit responsible for this task has been chemically inactivated via anesthesia. This is achieved by first developing a computational model of the cerebellar microcircuit involved in the acquisition of conditioned reflexes and training it with synthetic data generated based on physiological recordings. Secondly, the cerebellar model is interfaced with the brain of an anesthetized rat, connecting the model's inputs and outputs to afferent and efferent cerebellar structures. As a result, we show that the anesthetized rat, equipped with our neuro-prosthetic system, can be classically conditioned to the acquisition of an eye-blink response. However, non-stationarities in the recorded biological signals limit the performance of the cerebellar model. Thus, we introduce an updated cerebellar model and validate it with physiological recordings showing that learning becomes stable and reliable. The resulting system represents an important step towards replacing lost functions of the central nervous system via neuro-prosthetics, obtained by integrating a synthetic circuit with the afferent and efferent pathways of a damaged brain region. These results also embody an early example of science-based medicine, where on the one hand the neuro-prosthetic system directly validates a theory of cerebellar learning that informed the design of the system, and on the other one it takes a step towards the development of neuro-prostheses that could recover lost learning functions in animals and, in the longer term

  3. Automatization of physical experiments on-line with the MINSK-32 computer

    International Nuclear Information System (INIS)

    Fefilov, B.V.; Mikhushkin, A.V.; Morozov, V.M.; Sukhov, A.M.; Chelnokov, L.P.

    1978-01-01

    The system for data acquisition and processing of complex multi-dimensional experiments is described. The system includes autonomous modules in the CAMAC standard, the NAIRI-4 small computer and the MINSK-32 base computer. The NAIRI-4 computer performs preliminary storage, data processing and experiment control. Its software includes the microprogram software of the NAIRI-4 computer, the software of the NAIRI-2 computer, the software of the PDP-11 computer, and the technological software on the ES computers. A crate controller and a display driver are connected to the main channel so that the NAIRI-4 computer can operate on line with the experimental devices. An input-output channel commutator, which converts the MINSK-32 computer signal levels to TTL levels and vice versa, was developed to extend the possibilities for connecting measurement modules to the MINSK-32 computer. A graphic display based on the HP-1300A monitor with a light pen is used for highly effective spectrum processing.

  4. Computer network that assists in the planning, execution and evaluation of in-reactor experiments

    International Nuclear Information System (INIS)

    Bauer, T.H.; Froehle, P.H.; August, C.; Baldwin, R.D.; Johanson, E.W.; Kraimer, M.R.; Simms, R.; Klickman, A.E.

    1985-01-01

    For over 20 years complex, in-reactor experiments have been performed at Argonne National Laboratory (ANL) to investigate the performance of nuclear reactor fuel and to support the development of large computer codes that address questions of reactor safety in full-scale plants. Not only are computer codes an important end-product of the research, but computer analysis is also involved intimately at most stages of experiment planning, data reduction, and evaluation. For instance, many experiments are of sufficiently long duration or, if they are of brief duration, occur in such a purposeful sequence that need for speedy availability of on-line data is paramount. This is made possible most efficiently by computer assisted displays and evaluation. A purposeful linking of main-frame, mini, and micro computers has been effected over the past eight years which greatly enhances the speed with which experimental data are reduced to useful forms and applied to the relevant technological issues. This greater efficiency in data management led also to improvements in the planning and execution of subsequent experiments. Raw data from experiments performed at INEL is stored directly on disk and tape with the aid of minicomputers. Either during or shortly after an experiment, data may be transferred, via a direct link, to the Illinois offices of ANL where the data base is stored on a minicomputer system. This Idaho-to-Illinois link has both enhanced experiment performance and allowed rapid dissemination of results

  5. Hardware for dynamic quantum computing experiments: Part I

    Science.gov (United States)

    Johnson, Blake; Ryan, Colm; Riste, Diego; Donovan, Brian; Ohki, Thomas

    Static, pre-defined control sequences routinely achieve high-fidelity operation on superconducting quantum processors. Efforts toward dynamic experiments depending on real-time information have mostly proceeded through hardware duplication and triggers, requiring a combinatorial explosion in the number of channels. We provide a hardware efficient solution to dynamic control with a complete platform of specialized FPGA-based control and readout electronics; these components enable arbitrary control flow, low-latency feedback and/or feedforward, and scale far beyond single-qubit control and measurement. We will introduce the BBN Arbitrary Pulse Sequencer 2 (APS2) control system and the X6 QDSP readout platform. The BBN APS2 features: a sequencer built around implementing short quantum gates, a sequence cache to allow long sequences with branching structures, subroutines for code re-use, and a trigger distribution module to capture and distribute steering information. The X6 QDSP features a single-stage DSP pipeline that combines demodulation with arbitrary integration kernels, and multiple taps to inspect data flow for debugging and calibration. We will show system performance when putting it all together, including a latency budget for feedforward operations. This research was funded by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), through the Army Research Office Contract No. W911NF-10-1-0324.
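
    The single-stage DSP pipeline mentioned above, combining demodulation with an arbitrary integration kernel, can be summarized in a few lines of array arithmetic. The following is a schematic software model of that kind of readout step with made-up signal parameters; it is not firmware or an API of the X6 QDSP.

        # Schematic qubit-readout step: demodulate a digitized record at the
        # intermediate frequency and integrate it against an arbitrary kernel.
        # Signal parameters and waveform are invented for illustration.
        import numpy as np

        fs, f_if, n = 1.0e9, 10.0e6, 2048            # sample rate, IF, record length
        t = np.arange(n) / fs

        record = 0.3 * np.cos(2 * np.pi * f_if * t + 0.7)       # raw ADC-like record
        record += np.random.default_rng(2).normal(0, 0.05, n)   # additive noise

        reference = np.exp(-2j * np.pi * f_if * t)   # digital local oscillator
        kernel = np.hanning(n)                       # the "arbitrary" integration kernel
        kernel /= kernel.sum()

        iq = np.sum(record * reference * kernel)     # one complex point per shot
        print(f"I = {iq.real:.4f}, Q = {iq.imag:.4f}, phase = {np.angle(iq):.3f} rad")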

  6. Cooperation of experts' opinion, experiment and computer code development

    International Nuclear Information System (INIS)

    Wolfert, K.; Hicken, E.

    The connection between code development, code assessment and confidence in the analysis of transients will be discussed. In this manner, the major sources of errors in the codes and errors in applications of the codes will be shown. Standard problem results emphasize that, in order to have confidence in licensing statements, the codes must be physically realistic and the code user must be qualified and experienced. We will discuss why there is disagreement between the licensing authority and the vendor concerning assessment of the fulfilment of safety goal requirements. The answer to the question lies in the different confidence levels of the assessment of transient analysis. It is expected that a decrease in the disagreement will result from an increased confidence level. Strong efforts will be made to increase this confidence level through improvements in the codes, experiments and related organizational structures. Because of the low probability of loss-of-coolant accidents in the nuclear industry, assessment must rely on analytical techniques and experimental investigations. (orig./HP) [de]

  7. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  8. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  9. Comparative study on the performance of Pod type waterjet by experiment and computation

    Directory of Open Access Journals (Sweden)

    Moon-Chan Kim

    2010-03-01

    Full Text Available A comparative study between computation and experiment has been conducted to predict the performance of a Pod type waterjet for an amphibious wheeled vehicle. The Pod type waterjet has been chosen on the basis of the required specific speed of more than 2500. As the Pod type waterjet is an extreme type of axial flow waterjet, theoretical as well as experimental work on Pod type waterjets is very rare. The main purpose of the present study is to validate the developed in-house CFD code, based on the RANS equations, against the experimental results for the Pod type waterjet. The developed code has been validated by comparison with the experimental results of the well-known turbine problem. The validation was also extended to the flush type waterjet, where the pressures along the duct surface and the velocities at the nozzle area were compared with experimental results. The Pod type waterjet has been designed and the performance of the designed waterjet system including duct, impeller and stator was analyzed by the previously mentioned in-house CFD code. The pressure distributions and limiting streamlines on the blade surfaces were computed to confirm the performance of the designed waterjets. In addition, the torque and momentum were computed to find the overall efficiency and these were compared with the model test results. Measurements were taken of the flow rate at the nozzle exit, static pressure at various sections along the duct and the nozzle, revolution rate of the impeller, torque, thrust and towing forces at various advance speeds for the prediction of performance as well as for comparison with the computations. Based on these measurements, the performance was analyzed according to the ITTC96 standard analysis method. The full-scale effective and delivered power of the wheeled vehicle were estimated for the prediction of the service speed. This paper emphasizes the confirmation of the ITTC96 analysis method and

  10. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section or abdominal war injuries were the most common causes of the previous laparotomy. During those operations, or while entering the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following the diagnostic laparoscopy. In all patients, insertion of the Veres needle and trocar in the umbilical region was performed, namely a closed laparoscopy technique. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  11. Une Experience d'enseignement du francais par ordinateur (An Experiment in Teaching French by Computer).

    Science.gov (United States)

    Bougaieff, Andre; Lefebvre, France

    1986-01-01

    An experimental program for university summer students of French as a second language that provided a computer resource center and a variety of courseware, authoring aids, and other software for student use is described and the problems and advantages are discussed. (MSE)

  12. Previous Experience a Model of Practice UNAE

    Directory of Open Access Journals (Sweden)

    Ormary Barberi Ruiz

    2017-02-01

    Full Text Available The statements presented in this article represent a preliminary version of the proposed model of pre-professional practices (PPP) of the National University of Education (UNAE) of Ecuador. An urgent institutional necessity is revealed by the descriptive analyses conducted of the technical-administrative support (reports, interviews, testimonials), the pedagogical foundations of UNAE (curricular directionality, transverse axes in practice, career plan, approach and diagnostic examination) as the subject nature of the pre-professional practice, and the demands of the socio-educational contexts in which the practices have been emerging, in order to resize them. Relating these elements made it possible to conceive the modeling of the processes of the pre-professional practices for the development of the professional skills of future teachers through four components: contextual-projective, implementation (tutoring), accompaniment (teaching couple) and monitoring (meetings at the beginning, during and at the end of the practice). The initial training of teachers is inherent to teaching (academic and professional training), research and links with the community; these are fundamental pillars of Ecuadorian higher education.

  13. Previous Experience a Model of Practice UNAE

    OpenAIRE

    Ormary Barberi Ruiz; María Dolores Pesántez Palacios

    2017-01-01

    The statements presented in this article represents a preliminary version of the proposed model of pre-professional practices (PPP) of the National University of Education (UNAE) of Ecuador, an urgent institutional necessity is revealed in the descriptive analyzes conducted from technical support - administrative (reports, interviews, testimonials), pedagogical foundations of UNAE (curricular directionality, transverse axes in practice, career plan, approach and diagnostic examination as subj...

  14. Previous experiences shape adaptive mate preferences

    NARCIS (Netherlands)

    Fawcett, Tim W.; Bleay, Colin

    2009-01-01

    Existing models of mate choice assume that individuals have perfect knowledge of their own ability to attract a mate and can adjust their preferences accordingly. However, real animals will typically be uncertain of their own attractiveness. A potentially useful source of information on this is the

  15. Evaluating user experience with respect to user expectations in brain-computer interface games

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Hakvoort, G.; Poel, Mannes; Müller-Putz, G.R.; Scherer, R.; Billinger, M.; Kreilinger, A.; Kaiser, V.; Neuper, C.

    Evaluating user experience (UX) with respect to previous experiences can provide insight into whether a product can positively affect a user's opinion about a technology. If it can, then we can say that the product provides a positive UX. In this paper we propose a method to assess the UX in BCI

  16. Optically stimulated luminescence sensitivity changes in quartz due to repeated use in single aliquot readout: experiments and computer simulations

    International Nuclear Information System (INIS)

    McKeever, S.W.S.; Oklahoma State Univ., Stillwater, OK; Boetter-Jensen, L.; Agersnap Larsen, N.; Mejdahl, V.; Poolton, N.R.J.

    1996-01-01

    As part of a study to examine sensitivity changes in single aliquot techniques using optically stimulated luminescence (OSL) a series of experiments has been conducted with single aliquots of natural quartz, and the data compared with the results of computer simulations of the type of processes believed to be occurring. The computer model used includes both shallow and deep ('hard-to-bleach') traps, OSL ('easy-to-bleach') traps, and radiative and non-radiative recombination centres. The model has previously been used successfully to account for sensitivity changes in quartz due to thermal annealing. The simulations are able to reproduce qualitatively the main features of the experimental results including sensitivity changes as a function of re-use, and their dependence upon bleaching time and laboratory dose. The sensitivity changes are believed to be the result of a combination of shallow trap and deep trap effects. (author)

  17. Optically stimulated luminescence sensitivity changes in quartz due to repeated use in single aliquot readout: Experiments and computer simulations

    DEFF Research Database (Denmark)

    McKeever, S.W.S.; Bøtter-Jensen, L.; Agersnap Larsen, N.

    1996-01-01

    believed to be occurring. The computer model used includes both shallow and deep ('hard-to-bleach') traps, OSL ('easy-to-bleach') traps, and radiative and non-radiative recombination centres. The model has previously been used successfully to account for sensitivity changes in quartz due to thermal......As part of a study to examine sensitivity changes in single aliquot techniques using optically stimulated luminescence (OSL) a series of experiments has been conducted with single aliquots of natural quartz, and the data compared with the results of computer simulations of the type of processes...... annealing. The simulations are able to reproduce qualitatively the main features of the experimental results including sensitivity changes as a function of reuse, and their dependence upon bleaching time and laboratory dose. The sensitivity changes are believed to be the result of a combination of shallow...

  18. Complete distributed computing environment for a HEP experiment: experience with ARC-connected infrastructure for ATLAS

    International Nuclear Information System (INIS)

    Read, A; Taga, A; O-Saada, F; Pajchel, K; Samset, B H; Cameron, D

    2008-01-01

    Computing and storage resources connected by the Nordugrid ARC middleware in the Nordic countries, Switzerland and Slovenia are a part of the ATLAS computing Grid. This infrastructure is being commissioned with the ongoing ATLAS Monte Carlo simulation production in preparation for the commencement of data taking in 2008. The unique non-intrusive architecture of ARC, its straightforward interplay with the ATLAS Production System via the Dulcinea executor, and its performance during the commissioning exercise is described. ARC support for flexible and powerful end-user analysis within the GANGA distributed analysis framework is also shown. Whereas the storage solution for this Grid was earlier based on a large, distributed collection of GridFTP-servers, the ATLAS computing design includes a structured SRM-based system with a limited number of storage endpoints. The characteristics, integration and performance of the old and new storage solutions are presented. Although the hardware resources in this Grid are quite modest, it has provided more than double the agreed contribution to the ATLAS production with an efficiency above 95% during long periods of stable operation

  19. Complete distributed computing environment for a HEP experiment: experience with ARC-connected infrastructure for ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Read, A; Taga, A; O-Saada, F; Pajchel, K; Samset, B H; Cameron, D [Department of Physics, University of Oslo, P.b. 1048 Blindern, N-0316 Oslo (Norway)], E-mail: a.l.read@fys.uio.no

    2008-07-15

    Computing and storage resources connected by the Nordugrid ARC middleware in the Nordic countries, Switzerland and Slovenia are a part of the ATLAS computing Grid. This infrastructure is being commissioned with the ongoing ATLAS Monte Carlo simulation production in preparation for the commencement of data taking in 2008. The unique non-intrusive architecture of ARC, its straightforward interplay with the ATLAS Production System via the Dulcinea executor, and its performance during the commissioning exercise is described. ARC support for flexible and powerful end-user analysis within the GANGA distributed analysis framework is also shown. Whereas the storage solution for this Grid was earlier based on a large, distributed collection of GridFTP-servers, the ATLAS computing design includes a structured SRM-based system with a limited number of storage endpoints. The characteristics, integration and performance of the old and new storage solutions are presented. Although the hardware resources in this Grid are quite modest, it has provided more than double the agreed contribution to the ATLAS production with an efficiency above 95% during long periods of stable operation.

  20. The rheology of concentrated dispersions: structure changes and shear thickening in experiments and computer simulations

    NARCIS (Netherlands)

    Boersma, W.H.; Laven, J.; Stein, H.N.; Moldenaers, P.; Keunings, R.

    1992-01-01

    The flow-induced changes in the microstructure and rheology of very concentrated, shear thickening dispersions are studied. Results obtained for polystyrene sphere dispersions are compared with previous data and computer simulations to give better insight into the processes occurring in the dispersions.

  1. EDUCATIONAL COMPUTER SIMULATION EXPERIMENT «REAL-TIME SINGLE-MOLECULE IMAGING OF QUANTUM INTERFERENCE»

    Directory of Open Access Journals (Sweden)

    Alexander V. Baranov

    2015-01-01

    Full Text Available Taking part in organized project activities, students of the technical university create virtual physics laboratories. The article gives an example of such a student project: computer modeling and visualization of one of the most remarkable manifestations of reality, the quantum interference of particles. A real experiment with heavy organic fluorescent molecules is used as the prototype for this computer simulation. The students' software product can be used within the informational space of the open education system.
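
    A virtual-laboratory visualization of this kind can be prototyped very compactly. The sketch below plots the textbook two-slit (Fraunhofer) interference intensity with arbitrarily chosen parameters; it only illustrates the sort of picture such a student project produces and is not the project's actual software.

        # Two-slit interference intensity (Fraunhofer approximation); all
        # parameters are arbitrary and serve only to illustrate the visualization.
        import numpy as np
        import matplotlib.pyplot as plt

        wavelength = 5.0e-12        # de Broglie wavelength of a heavy molecule, m
        slit_width = 50e-9          # m
        slit_separation = 250e-9    # m
        screen_distance = 0.5       # m

        x = np.linspace(-1e-4, 1e-4, 2000)   # position on the screen, m
        theta = x / screen_distance          # small-angle approximation
        alpha = np.pi * slit_width * np.sin(theta) / wavelength
        beta = np.pi * slit_separation * np.sin(theta) / wavelength

        # single-slit envelope times the two-slit fringe term
        intensity = np.sinc(alpha / np.pi) ** 2 * np.cos(beta) ** 2

        plt.plot(x * 1e6, intensity)
        plt.xlabel("position on screen (micrometres)")
        plt.ylabel("relative intensity")
        plt.title("Two-slit interference pattern")
        plt.show()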

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  3. Grid computing in Pakistan: opening to Large Hadron Collider experiments

    International Nuclear Information System (INIS)

    Batool, N.; Osman, A.; Mahmood, A.; Rana, M.A.

    2009-01-01

    A grid computing facility was developed at the sister institutes Pakistan Institute of Nuclear Science and Technology (PINSTECH) and Pakistan Institute of Engineering and Applied Sciences (PIEAS) in collaboration with the Large Hadron Collider (LHC) Computing Grid during the early years of the present decade. The grid facility PAKGRID-LCG2, as one of the grid nodes in Pakistan, was developed employing mainly local means and is capable of supporting local and international research and computational tasks in the domain of the LHC Computing Grid. The functional status of the facility is presented in terms of the number of jobs performed. The facility provides a forum for local researchers in the field of high energy physics to participate in the LHC experiments and related activities at the European particle physics research laboratory (CERN), which is one of the best physics laboratories in the world. It also provides a platform for an emerging computing technology (CT). (author)

  4. Computer-controlled back scattering and sputtering-experiment using a heavy-ion-accelerator

    International Nuclear Information System (INIS)

    Becker, H.; Birnbaum, M.; Degenhardt, K.H.; Mertens, P.; Tschammer, V.

    1978-12-01

    Control and data acquisition with a PDP 11/40 computer and CAMAC instrumentation are reported for an experiment developed to measure sputtering yields and energy losses for heavy 100 - 300 keV ions in thin metal foils. Besides a quadrupole mass filter or a bending magnet, a multichannel analyser is coupled to the computer, so that pulse height analysis can also be performed under computer control. The CAMAC instrumentation and measuring programs are built in modular form to enable easy application to other experimental problems. (orig.)

  5. Computer assisted treatments for image pattern data of laser plasma experiments

    International Nuclear Information System (INIS)

    Yaoita, Akira; Matsushima, Isao

    1987-01-01

    An image data processing system for laser-plasma experiments has been constructed. These image data are two-dimensional images taken by X-ray, UV, infrared and visible light television cameras and also by streak cameras. They are digitized by frame memories. The digitized image data are stored in disk memories with the aid of a microcomputer. The data are processed by a host computer and stored in the files of the host computer and on magnetic tapes. In this paper, an overview of the image data processing system and some software for data handling in the host computer are presented. (author)

  6. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  7. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    Science.gov (United States)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.

    2017-10-01

    The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called

  8. Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Herner, K. [Fermilab; Alba Hernandex, A. F. [Fermilab; Bhat, S. [Fermilab; Box, D. [Fermilab; Boyd, J. [Fermilab; Di Benedetto, V. [Fermilab; Ding, P. [Fermilab; Dykstra, D. [Fermilab; Fattoruso, M. [Fermilab; Garzoglio, G. [Fermilab; Kirby, M. [Fermilab; Kreymer, A. [Fermilab; Levshina, T. [Fermilab; Mazzacane, A. [Fermilab; Mengel, M. [Fermilab; Mhashilkar, P. [Fermilab; Podstavkov, V. [Fermilab; Retzke, K. [Fermilab; Sharma, N. [Fermilab; Teheran, J. [Fermilab

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed

  9. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Directory of Open Access Journals (Sweden)

    Demeter Lisa

    2010-05-01

    Full Text Available Abstract Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
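
    The regression idea can be made concrete: if two variants grow exponentially, the logarithm of their count ratio is linear in time and its slope is the fitness difference, so fitting all time points uses more of the experiment than a two-point calculation. The sketch below is a generic least-squares version with invented counts; it is not the vFitness implementation and ignores the measurement-error modeling the tool provides.

        # Relative fitness from a growth-competition time course by least squares.
        # Counts are invented; this is a generic calculation, not the vFitness tool.
        import numpy as np

        days = np.array([0.0, 2.0, 4.0, 6.0])
        variant_a = np.array([1.0e4, 2.3e4, 5.1e4, 1.2e5])   # e.g. mutant counts
        variant_b = np.array([1.0e4, 1.5e4, 2.4e4, 3.6e4])   # e.g. wild-type counts

        # Any dilution applied equally to both variants cancels in the ratio.
        log_ratio = np.log(variant_a / variant_b)
        slope, intercept = np.polyfit(days, log_ratio, 1)    # slope = fitness difference

        two_point = (log_ratio[-1] - log_ratio[0]) / (days[-1] - days[0])
        print(f"all-points estimate = {slope:.3f} per day, "
              f"two-point estimate = {two_point:.3f} per day")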

  10. DABIE: a data banking system of integral experiments for reactor core characteristics computer codes

    International Nuclear Information System (INIS)

    Matsumoto, Kiyoshi; Naito, Yoshitaka; Ohkubo, Shuji; Aoyanagi, Hideo.

    1987-05-01

    A data banking system of integral experiments for reactor core characteristics computer codes, DABIE, has been developed to lighten the burden of searching through many documents to obtain the experimental data required for verification of reactor core characteristics computer codes. This data banking system, DABIE, has capabilities for systematic classification, registration and easy retrieval of experiment data. DABIE consists of a data bank and supporting programs. The supporting programs are a data registration program, a data reference program and a maintenance program. The system is designed so that the user can easily register information on experiment systems, including figures as well as geometry data and measured data, or obtain those data interactively through a TSS terminal. This manual describes the system structure, how to use it and sample uses of this code system. (author)

  11. Computational methods for fracture analysis of heavy-section steel technology (HSST) pressure vessel experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1983-01-01

    This paper summarizes the capabilities and applications of the general-purpose and special-purpose computer programs that have been developed for use in fracture mechanics analyses of HSST pressure vessel experiments. Emphasis is placed on the OCA/USA code, which is designed for analysis of pressurized-thermal-shock (PTS) conditions, and on the ORMGEN/ADINA/ORVIRT system which is used for more general analysis. Fundamental features of these programs are discussed, along with applications to pressure vessel experiments

  12. FPGA Compute Acceleration for High-Throughput Data Processing in High-Energy Physics Experiments

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The upgrades of the four large experiments of the LHC at CERN in the coming years will result in a huge increase of data bandwidth for each experiment, which needs to be processed very efficiently. For example the LHCb experiment will upgrade its detector in 2019/2020 to a 'triggerless' readout scheme, where all of the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the event filter farm to 40 TBit/s, which must be processed to select the interesting proton-proton collisions for later storage. The architecture of such a computing farm, which can process this amount of data as efficiently as possible, is a challenging task and several compute accelerator technologies are being considered. In the high performance computing sector more and more FPGA compute accelerators are being used to improve the compute performance and reduce the...

  13. Use of Tablet Computers to Promote Physical Therapy Students' Engagement in Knowledge Translation During Clinical Experiences

    Science.gov (United States)

    Loeb, Kathryn; Barbosa, Sabrina; Jiang, Fei; Lee, Karin T.

    2016-01-01

    Background and Purpose: Physical therapists strive to integrate research into daily practice. The tablet computer is a potentially transformational tool for accessing information within the clinical practice environment. The purpose of this study was to measure and describe patterns of tablet computer use among physical therapy students during clinical rotation experiences. Methods: Doctor of physical therapy students (n = 13 users) tracked their use of tablet computers (iPad), loaded with commercially available apps, during 16 clinical experiences (6-16 weeks in duration). Results: The tablets were used on 70% of 691 clinic days, averaging 1.3 uses per day. Information seeking represented 48% of uses; 33% of those were foreground searches for research articles and syntheses and 66% were for background medical information. Other common uses included patient education (19%), medical record documentation (13%), and professional communication (9%). The most frequently used app was Safari, the preloaded web browser (representing 281 [36.5%] incidents of use). Users accessed 56 total apps to support clinical practice. Discussion and Conclusions: Physical therapy students successfully integrated use of a tablet computer into their clinical experiences including regular activities of information seeking. Our findings suggest that the tablet computer represents a potentially transformational tool for promoting knowledge translation in the clinical practice environment. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A127). PMID:26945431

  14. Using a Computer Microphone Port to Study Circular Motion: Proposal of a Secondary School Experiment

    Science.gov (United States)

    Soares, A. A.; Borcsik, F. S.

    2016-01-01

    In this work we present an inexpensive experiment proposal to study the kinematics of uniform circular motion in a secondary school. We used a PC sound card to connect a homemade simple sensor to a computer and used the free sound analysis software "Audacity" to record experimental data. We obtained quite good results even in comparison…
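
    A minimal sketch of the kind of off-line analysis such a setup allows is given below, assuming the homemade sensor produces one voltage pulse per revolution and the recording has been exported from Audacity as a WAV file; the file name, threshold choice and single-channel handling are illustrative assumptions rather than details from the article:

```python
# Estimate the rotation period from a WAV recording in which a homemade
# sensor connected to the PC microphone port produces one pulse per revolution.
import numpy as np
from scipy.io import wavfile

rate, data = wavfile.read("circular_motion.wav")   # hypothetical file name
if data.ndim > 1:                                  # keep a single channel
    data = data[:, 0]
signal = np.abs(data.astype(float))
threshold = 0.5 * signal.max()                     # crude pulse threshold

above = signal > threshold
edges = np.flatnonzero(above[1:] & ~above[:-1]) + 1   # rising edges
times = edges / rate                               # pulse times in seconds

periods = np.diff(times)
print(f"mean period  T = {periods.mean():.4f} s")
print(f"angular speed w = {2 * np.pi / periods.mean():.2f} rad/s")
```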

  15. Experiments Using Cell Phones in Physics Classroom Education: The Computer-Aided "g" Determination

    Science.gov (United States)

    Vogt, Patrik; Kuhn, Jochen; Muller, Sebastian

    2011-01-01

    This paper continues the collection of experiments that describe the use of cell phones as experimental tools in physics classroom education. We describe a computer-aided determination of the free-fall acceleration "g" using the acoustical Doppler effect. The Doppler shift is a function of the speed of the source. Since a free-falling object…
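
    For orientation, the kinematic relation that such a determination rests on can be written down directly (a standard acoustic Doppler formula, not an equation quoted from the record). For a source of rest frequency f_0 falling away from a stationary microphone, with c_s the speed of sound,

```latex
f(t) = \frac{f_0}{1 + v(t)/c_s}, \qquad v(t) = g\,t
\quad\Longrightarrow\quad
c_s\!\left(\frac{f_0}{f(t)} - 1\right) = g\,t ,
```

    so plotting the left-hand side against time gives a straight line whose slope is the free-fall acceleration g.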

  16. Evaluating a multi-player brain-computer interface game: challenge versus co-experience

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Volpe, G; Reidsma, Dennis; Poel, Mannes; Camurri, A.; Obbink, Michel; Nijholt, Antinus

    2013-01-01

    Brain–computer interfaces (BCIs) have started to be considered as game controllers. The low level of control they provide prevents them from providing perfect control but allows the design of challenging games which can be enjoyed by players. Evaluation of enjoyment, or user experience (UX), is

  17. Computational Modeling of the Optical Rotation of Amino Acids: An "in Silico" Experiment for Physical Chemistry

    Science.gov (United States)

    Simpson, Scott; Autschbach, Jochen; Zurek, Eva

    2013-01-01

    A computational experiment that investigates the optical activity of the amino acid valine has been developed for an upper-level undergraduate physical chemistry laboratory course. Hybrid density functional theory calculations were carried out for valine to confirm the rule that adding a strong acid to a solution of an amino acid in the l…

  18. Evaluating the Relationship of Computer Literacy Training Competence and Nursing Experience to CPIS Resistance

    Science.gov (United States)

    Reese, Dorothy J.

    2012-01-01

    The purpose of this quantitative, descriptive/correlational project was to examine the relationship between the level of computer literacy, informatics training, nursing experience, and perceived competence in using computerized patient information systems (CPIS) and nursing resistance to using CPIS. The Nurse Computerized Patient Information…

  19. Development and application of a computer model for large-scale flame acceleration experiments

    International Nuclear Information System (INIS)

    Marx, K.D.

    1987-07-01

    A new computational model for large-scale premixed flames is developed and applied to the simulation of flame acceleration experiments. The primary objective is to circumvent the necessity for resolving turbulent flame fronts; this is imperative because of the relatively coarse computational grids which must be used in engineering calculations. The essence of the model is to artificially thicken the flame by increasing the appropriate diffusivities and decreasing the combustion rate, but to do this in such a way that the burn velocity varies with pressure, temperature, and turbulence intensity according to prespecified phenomenological characteristics. The model is particularly aimed at implementation in computer codes which simulate compressible flows. To this end, it is applied to the two-dimensional simulation of hydrogen-air flame acceleration experiments in which the flame speeds and gas flow velocities attain or exceed the speed of sound in the gas. It is shown that many of the features of the flame trajectories and pressure histories in the experiments are simulated quite well by the model. Using the comparison of experimental and computational results as a guide, some insight is developed into the processes which occur in such experiments. 34 refs., 25 figs., 4 tabs
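
    The scaling behind such artificial thickening can be made explicit using the classical laminar-flame relations (shown here as background; the specific closure adopted in the report may differ). With D the diffusivity and \dot{\omega} the combustion rate,

```latex
s_L \propto \sqrt{D\,\dot{\omega}}, \qquad \delta_L \propto \frac{D}{s_L};
\qquad D \to F D, \quad \dot{\omega} \to \frac{\dot{\omega}}{F}
\;\Longrightarrow\;
s_L \to s_L, \qquad \delta_L \to F\,\delta_L ,
```

    so increasing the diffusivities by a factor F while decreasing the combustion rate by the same factor leaves the burn velocity unchanged but thickens the flame by F, which is what allows the front to be resolved on a coarse engineering grid.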

  20. ONTOLOGY OF COMPUTATIONAL EXPERIMENT ORGANIZATION IN PROBLEMS OF SEARCHING AND SORTING

    Directory of Open Access Journals (Sweden)

    A. Spivakovsky

    2011-05-01

    Full Text Available Ontologies are a key technology for the semantic processing of knowledge. We examine a methodology for using ontologies to organize computational experiments on searching and sorting problems in the study of the course "Basics of algorithms and programming".

  1. Solution of the Schrodinger Equation for a Diatomic Oscillator Using Linear Algebra: An Undergraduate Computational Experiment

    Science.gov (United States)

    Gasyna, Zbigniew L.

    2008-01-01

    A computational experiment is proposed in which a linear algebra method is applied to the solution of the Schrodinger equation for a diatomic oscillator. Calculations of the vibration-rotation spectrum for the HCl molecule are presented and the results show excellent agreement with experimental data. (Contains 1 table and 1 figure.)
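
    A minimal sketch of the linear-algebra approach such an experiment is built around is shown below, using a finite-difference Hamiltonian for a harmonic approximation to the HCl potential; the grid, reduced mass and force constant are illustrative assumptions rather than the values or basis used in the article:

```python
# Solve the 1D Schroedinger equation for a diatomic (harmonic) oscillator
# by diagonalizing a finite-difference Hamiltonian matrix.
import numpy as np

hbar = 1.054571817e-34                              # J s
amu  = 1.66053907e-27                               # kg
mu   = (1.008 * 34.969) / (1.008 + 34.969) * amu    # reduced mass of H-35Cl
k    = 516.0                                        # N/m, approximate force constant

n  = 1200
x  = np.linspace(-0.6e-10, 0.6e-10, n)              # displacement grid (m)
dx = x[1] - x[0]

# kinetic energy  -hbar^2/(2 mu) d^2/dx^2  via central differences
T = -(hbar**2 / (2 * mu * dx**2)) * (
    np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
V = np.diag(0.5 * k * x**2)                         # harmonic potential

E, psi = np.linalg.eigh(T + V)                      # eigenvalues in joules
wavenumbers = E[:4] / (hbar * 2 * np.pi * 2.99792458e10)   # convert to cm^-1
print(wavenumbers)   # nearly evenly spaced levels separated by ~2990 cm^-1
```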

  2. Computational Experience with Globally Convergent Descent Methods for Large Sparse Systems of Nonlinear Equations

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Vlček, Jan

    1998-01-01

    Vol. 8, No. 3-4 (1998), pp. 201-223 ISSN 1055-6788 R&D Projects: GA ČR GA201/96/0918 Keywords: nonlinear equations * Armijo-type descent methods * Newton-like methods * truncated methods * global convergence * nonsymmetric linear systems * conjugate gradient-type methods * residual smoothing * computational experiments Subject RIV: BB - Applied Statistics, Operational Research

  3. Profile modification computations for LHCD experiments on PBX-M using the TSC/LSC model

    International Nuclear Information System (INIS)

    Kaita, R.; Ignat, D.W.; Jardin, S.C.; Okabayashi, M.; Sun, Y.C.

    1996-01-01

    The TSC-LSC computational model of the dynamics of lower hybrid current drive has been exercised extensively in comparison with data from a Princeton Beta Experiment-Modification (PBX-M) discharge where the measured q(0) attained values slightly above unity. Several significant, but plausible, assumptions had to be introduced to keep the computation from behaving pathologically over time, producing singular profiles of plasma current density and q. Addition of a heuristic current diffusion estimate, or more exactly, a smoothing of the rf-driven current with a diffusion-like equation, greatly improved the behavior of the computation, and brought theory and measurement into reasonable agreement. The model was then extended to longer pulse lengths and higher powers to investigate performance to be expected in future PBX-M current profile modification experiments. copyright 1996 American Institute of Physics

  4. Methods of physical experiment and installation automation on the base of computers

    International Nuclear Information System (INIS)

    Stupin, Yu.V.

    1983-01-01

    Peculiarities of using computers to automate physical experiments and installations are considered. Data acquisition and processing systems based on microprocessors, micro- and mini-computers, CAMAC equipment and real-time operating systems are described, as well as systems for automating physical experiments on accelerators, laser thermonuclear fusion installations and plasma research installations. The problems of multi-machine complexes and multi-user systems, the development of automated systems for collective use, the organization of inter-machine data exchange and the management of experimental databases are discussed. Data on software systems used for complex experimental data processing are presented. It is concluded that the application of new computers, combined with the new possibilities that universal operating systems provide to users, substantially increases the efficiency of a scientist's work

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  7. ATLAS Distributed Computing Operations: Experience and improvements after 2 full years of data-taking

    International Nuclear Information System (INIS)

    Jézéquel, S; Stewart, G

    2012-01-01

    This paper summarizes operational experience and improvements in ATLAS computing infrastructure in 2010 and 2011. ATLAS has had 2 periods of data taking, with many more events recorded in 2011 than in 2010. It ran 3 major reprocessing campaigns. The activity in 2011 was similar to 2010, but scalability issues had to be addressed due to the increase in luminosity and trigger rate. Based on improved monitoring of ATLAS Grid computing, the evolution of computing activities (data/group production, their distribution and grid analysis) over time is presented. The main changes in the implementation of the computing model that will be shown are: the optimization of data distribution over the Grid, according to effective transfer rate and site readiness for analysis; the progressive dismantling of the cloud model, for data distribution and data processing; software installation migration to cvmfs; changing database access to a Frontier/squid infrastructure.

  8. Applications of small computers for systems control on the Tandem Mirror Experiment-Upgrade

    International Nuclear Information System (INIS)

    Bork, R.G.; Kane, R.J.; Moore, T.L.

    1983-01-01

    Desktop computers operating through a CAMAC-based interface are used to control and monitor the operation of the various subsystems on the Tandem Mirror Experiment-Upgrade (TMX-U) at Lawrence Livermore National Laboratory (LLNL). These systems include: shot sequencer/master timing, neutral beam control (four consoles), magnet power system control, ion-cyclotron resonant heating (ICRH) control, thermocouple monitoring, getter system control, gas fueling system control, and electron-cyclotron resonant heating (ECRH) monitoring. Two additional computers are used to control the TMX-U neutral beam test stand and to provide computer-aided repair/test and development of CAMAC modules. These machines are usually programmed in BASIC, but some codes have been translated into assembly language to increase speed. Details of the computer interfaces and system complexity are described, as well as the evolution of the systems to their present states.

  9. Overview of the assessment of the french in-field tritium experiment with computer codes

    International Nuclear Information System (INIS)

    Crabol, B.; Graziani, G.; Edlund, O.

    1989-01-01

    Within the framework of the international cooperation established for the realization of the French tritium experiment, an expert group for the assessment of computer codes has been organized, including the Joint Research Center of Ispra (European Communities), Studsvik (Sweden) and the Atomic Energy Commission (France). The aims of the group were as follows: to help design the experiment by evaluating beforehand the consequences of the release, and to interpret the results of the experiment. This paper describes the latter task and gives the main conclusions drawn from the work.

  10. Computing activities for the P-bar ANDA experiment at FAIR

    International Nuclear Information System (INIS)

    Messchendorp, Johan

    2010-01-01

    The P-bar ANDA experiment at the future facility FAIR will provide valuable data for our present understanding of the strong interaction. In preparation for the experiments, large-scale simulations for design and feasibility studies are performed exploiting a new software framework, P-bar ANDAROOT, which is based on FairROOT and the Virtual Monte Carlo interface, and which runs on a large-scale computing GRID environment exploiting the AliEn 2 middleware. In this paper, an overview is given of the P-bar ANDA experiment with the emphasis on the various developments which are being pursued to provide a user- and developer-friendly computing environment for the P-bar ANDA collaboration.

  11. Computer navigation experience in hip resurfacing improves femoral component alignment using a conventional jig.

    Science.gov (United States)

    Morison, Zachary; Mehra, Akshay; Olsen, Michael; Donnelly, Michael; Schemitsch, Emil

    2013-11-01

    The use of computer navigation has been shown to improve the accuracy of femoral component placement compared to conventional instrumentation in hip resurfacing. Whether exposure to computer navigation improves accuracy when the procedure is subsequently performed with conventional instrumentation without navigation has not been explored. We examined whether femoral component alignment utilizing a conventional jig improves following experience with the use of imageless computer navigation for hip resurfacing. Between December 2004 and December 2008, 213 consecutive hip resurfacings were performed by a single surgeon. The first 17 (Cohort 1) and the last 9 (Cohort 2) hip resurfacings were performed using a conventional guidewire alignment jig. In 187 cases, the femoral component was implanted using the imageless computer navigation. Cohorts 1 and 2 were compared for femoral component alignment accuracy. All components in Cohort 2 achieved the position determined by the preoperative plan. The mean deviation of the stem-shaft angle (SSA) from the preoperatively planned target position was 2.2° in Cohort 2 and 5.6° in Cohort 1 (P = 0.01). Four implants in Cohort 1 were positioned at least 10° varus compared to the target SSA position and another four were retroverted. Femoral component placement utilizing conventional instrumentation may be more accurate following experience using imageless computer navigation.

  12. Computer navigation experience in hip resurfacing improves femoral component alignment using a conventional jig

    Directory of Open Access Journals (Sweden)

    Zachary Morison

    2013-01-01

    Full Text Available Background: The use of computer navigation has been shown to improve the accuracy of femoral component placement compared to conventional instrumentation in hip resurfacing. Whether exposure to computer navigation improves accuracy when the procedure is subsequently performed with conventional instrumentation without navigation has not been explored. We examined whether femoral component alignment utilizing a conventional jig improves following experience with the use of imageless computer navigation for hip resurfacing. Materials and Methods: Between December 2004 and December 2008, 213 consecutive hip resurfacings were performed by a single surgeon. The first 17 (Cohort 1) and the last 9 (Cohort 2) hip resurfacings were performed using a conventional guidewire alignment jig. In 187 cases, the femoral component was implanted using the imageless computer navigation. Cohorts 1 and 2 were compared for femoral component alignment accuracy. Results: All components in Cohort 2 achieved the position determined by the preoperative plan. The mean deviation of the stem-shaft angle (SSA) from the preoperatively planned target position was 2.2° in Cohort 2 and 5.6° in Cohort 1 (P = 0.01). Four implants in Cohort 1 were positioned at least 10° varus compared to the target SSA position and another four were retroverted. Conclusions: Femoral component placement utilizing conventional instrumentation may be more accurate following experience using imageless computer navigation.

  13. Experience of BESIII data production with local cluster and distributed computing model

    International Nuclear Information System (INIS)

    Deng, Z Y; Li, W D; Liu, H M; Sun, Y Z; Zhang, X M; Lin, L; Nicholson, C; Zhemchugov, A

    2012-01-01

    The BES III detector is a new spectrometer which works on the upgraded high-luminosity collider, BEPCII. The BES III experiment studies physics in the tau-charm energy region from 2 GeV to 4.6 GeV. From 2009 to 2011, BEPCII has produced 106M ψ(2S) events, 225M J/ψ events, 2.8 fb⁻¹ ψ(3770) data, and 500 pb⁻¹ data at 4.01 GeV. All the data samples were processed successfully and many important physics results have been achieved based on these samples. Doing data production correctly and efficiently with limited CPU and storage resources is a big challenge. This paper will describe the implementation of the experiment-specific data production for BESIII in detail, including data calibration with event-level parallel computing model, data reconstruction, inclusive Monte Carlo generation, random trigger background mixing and multi-stream data skimming. Now, with the data sample increasing rapidly, there is a growing demand to move from solely using a local cluster to a more distributed computing model. A distributed computing environment is being set up and expected to go into production use in 2012. The experience of BESIII data production, both with a local cluster and with a distributed computing model, is presented here.

  14. Cross-cultural human-computer interaction and user experience design a semiotic perspective

    CERN Document Server

    Brejcha, Jan

    2015-01-01

    This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on the theoretical and practical areas of the interaction and communication design for research experts and industry practitioners and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

  15. Digital computer control on Canadian nuclear power plants -experience to date and the future outlook

    International Nuclear Information System (INIS)

    Pearson, A.

    1977-10-01

    This paper discusses the performance of the digital computer control system at Pickering through the years 1973 to 1976. This evaluation is based on a study of the Pickering Generating Station operating records. The paper goes on to explore future computer architectures and the advantages that could accrue from a distributed system approach. Also outlined are the steps being taken to develop these ideas further in the context of two Chalk River projects - REDNET, an advanced data acquisition system being installed to process information from engineering experiments in NRX and NRU reactors, and CRIP, a prototype communications network using cable television technology. (author)

  16. Application of a personal computer in a high energy physics experiment

    International Nuclear Information System (INIS)

    Petta, P.

    1987-04-01

    UA1 is a detector at the CERN Super Proton Synchrotron collider. MacVEE (Microcomputer applied to the Control of VME Electronic Equipment) is a software development system for the data readout system and for the implementation of the user interface of the experiment control. A commercial personal computer is used. Examples of applications are the Data Acquisition Console, the Scanner Desc equipment and the AMERICA RAM-disk codes. Further topics are the MacUA1 development system for M68K-VME codes and an outline of the future MacVEE System Supervisor. 23 refs., 10 figs., 3 tabs. (qui)

  17. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

    Full Text Available For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as efficient design for a possible range of values for the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.

  18. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    Science.gov (United States)

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for UPGMA and neighbor-joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred, against that of PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to not only accelerate phylogenetics algorithm performance on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
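
    For readers unfamiliar with the algorithm being accelerated, a compact serial reference implementation of UPGMA clustering is sketched below; it is illustrative only, and both PHYLIP's optimized C code and the programmable-logic design discussed in the paper are far more elaborate:

```python
# Minimal UPGMA: repeatedly merge the closest pair of clusters, averaging
# distances weighted by cluster sizes, until a single rooted tree remains.
import itertools

def upgma(names, dist):
    """names: list of taxa; dist[(a, b)]: pairwise distance for each pair."""
    clusters = {n: (n, 1, 0.0) for n in names}          # label -> (tree, size, height)
    d = {frozenset(p): v for p, v in dist.items()}

    while len(clusters) > 1:
        a, b = min(itertools.combinations(clusters, 2),
                   key=lambda p: d[frozenset(p)])
        ta, na, _ = clusters.pop(a)
        tb, nb, _ = clusters.pop(b)
        height = d[frozenset((a, b))] / 2.0             # ultrametric node height
        new = f"({a},{b})"
        for c in clusters:
            # size-weighted average of the distances to the two merged clusters
            dac = d.pop(frozenset((a, c)))
            dbc = d.pop(frozenset((b, c)))
            d[frozenset((new, c))] = (na * dac + nb * dbc) / (na + nb)
        clusters[new] = ((ta, tb), na + nb, height)
    return next(iter(clusters.values()))

taxa = ["A", "B", "C", "D"]
dmat = {("A", "B"): 2.0, ("A", "C"): 6.0, ("A", "D"): 10.0,
        ("B", "C"): 6.0, ("B", "D"): 10.0, ("C", "D"): 8.0}
print(upgma(taxa, dmat))
```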

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  20. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  1. SED-ED, a workflow editor for computational biology experiments written in SED-ML.

    Science.gov (United States)

    Adams, Richard R

    2012-04-15

    The simulation experiment description markup language (SED-ML) is a new community data standard to encode computational biology experiments in a computer-readable XML format. Its widespread adoption will require the development of software support to work with SED-ML files. Here, we describe a software tool, SED-ED, to view, edit, validate and annotate SED-ML documents while shielding end-users from the underlying XML representation. SED-ED supports modellers who wish to create, understand and further develop a simulation description provided in SED-ML format. SED-ED is available as a standalone Java application, as an Eclipse plug-in and as an SBSI (www.sbsi.ed.ac.uk) plug-in, all under an MIT open-source license. Source code is at https://sed-ed-sedmleditor.googlecode.com/svn. The application itself is available from https://sourceforge.net/projects/jlibsedml/files/SED-ED/.

  2. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University]; Rogers, James H [ORNL]; Maxwell, Don E [ORNL]

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large-scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large-scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  3. Computational methods for fracture analysis of heavy-section steel technology (HSST) pressure vessel experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1985-01-01

    This paper summarizes the capabilities and applications of the general-purpose and special-purpose computer programs that have been developed at ORNL for use in fracture mechanics analyses of HSST pressure vessel experiments. Emphasis is placed on the OCA/USA code, which is designed for analysis of pressurized-thermal-shock (PTS) conditions, and on the ORMGEN/ADINA/ORVIRT system which is used for more general analysis. Fundamental features of these programs are discussed, along with applications to pressure vessel experiments. (orig./HP)

  4. Data processing with PC-9801 micro-computer for HCN laser scattering experiments

    International Nuclear Information System (INIS)

    Iwasaki, T.; Okajima, S.; Kawahata, K.; Tetsuka, T.; Fujita, J.

    1986-09-01

    In order to process the data of HCN laser scattering experiments, micro-computer software has been developed and applied to measurements of density fluctuations in the JIPP T-IIU tokamak plasma. The data processing system consists of a spectrum analyzer (SM-2100A Signal Analyzer, IWATSU ELECTRIC CO., LTD.), a PC-9801m3 micro-computer, a CRT display and a dot printer. The output signals from the spectrum analyzer are A/D converted and stored on a mini-floppy-disk attached to the signal analyzer. The data processing software is composed of system programs and several user programs. Real-time data processing is carried out for every plasma shot at 4-minute intervals by the micro-computer, which is connected to the signal analyzer through a GP-IB interface. The time evolution of the frequency spectrum of the density fluctuations is displayed on the CRT attached to the micro-computer and printed out on a printer sheet. For data processing after the experiments, the data stored on the floppy disk of the signal analyzer are read out using a floppy-disk unit attached to the micro-computer. After computation with the user programs, the results, such as the monitored signal, frequency spectra, wave-number spectra and the time evolution of the spectra, are displayed and printed out. In this technical report, the system, the software and the directions for use are described. (author)

  5. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    International Nuclear Information System (INIS)

    Sommer, C. M.; Fritz, S.; Vollherbst, D.; Zelzer, S.; Wachter, M. F.; Bellemann, N.; Gockner, T.; Mokry, T.; Schmitz, A.; Aulmann, S.; Stampfl, U.; Pereira, P.; Kauczor, H. U.; Werner, J.; Radeleff, B. A.

    2015-01-01

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  6. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    Energy Technology Data Exchange (ETDEWEB)

    Sommer, C. M., E-mail: christof.sommer@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Fritz, S., E-mail: stefan.fritz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Vollherbst, D., E-mail: dominikvollherbst@web.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Zelzer, S., E-mail: s.zelzer@dkfz-heidelberg.de [German Cancer Research Center (dkfz), Medical and Biological Informatics (Germany); Wachter, M. F., E-mail: fredericwachter@googlemail.com; Bellemann, N., E-mail: nadine.bellemann@med.uni-heidelberg.de; Gockner, T., E-mail: theresa.gockner@med.uni-heidelberg.de; Mokry, T., E-mail: theresa.mokry@med.uni-heidelberg.de; Schmitz, A., E-mail: anne.schmitz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Aulmann, S., E-mail: sebastian.aulmann@mail.com [University Hospital Heidelberg, Department of General Pathology (Germany); Stampfl, U., E-mail: ulrike.stampfl@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Pereira, P., E-mail: philippe.pereira@slk-kliniken.de [SLK Kliniken Heilbronn GmbH, Clinic for Radiology, Minimally-invasive Therapies and Nuclear Medicine (Germany); Kauczor, H. U., E-mail: hu.kauczor@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Werner, J., E-mail: jens.werner@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Radeleff, B. A., E-mail: boris.radeleff@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany)

    2015-02-15

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  7. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zi-Kui [Pennsylvania State University; Gleeson, Brian [University of Pittsburgh; Shang, Shunli [Pennsylvania State University; Gheno, Thomas [University of Pittsburgh; Lindwall, Greta [Pennsylvania State University; Zhou, Bi-Cheng [Pennsylvania State University; Liu, Xuan [Pennsylvania State University; Ross, Austin [Pennsylvania State University

    2018-04-23

    This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram) modeling, and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description covers composition ranges typical for coating alloys and, hence, allows for the prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.
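
    For orientation, CALPHAD descriptions of the kind referred to here are built from molar Gibbs energy expressions for each phase; for a binary (A,B) substitutional solution phase the standard Redlich-Kister form reads (shown only as background, not as the specific Ni-Al-Cr-Co-Si-Hf-Y model developed in the project):

```latex
G_m = x_A\,{}^{0}G_A + x_B\,{}^{0}G_B
    + RT\,(x_A \ln x_A + x_B \ln x_B)
    + x_A x_B \sum_{v \ge 0} {}^{v}L_{A,B}\,(x_A - x_B)^{v},
```

    where the temperature-dependent interaction parameters L are the quantities assessed against first-principles and experimental data.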

  8. Experiences of women with breast cancer: exchanging social support over the CHESS computer network.

    Science.gov (United States)

    Shaw, B R; McTavish, F; Hawkins, R; Gustafson, D H; Pingree, S

    2000-01-01

    Using an existential-phenomenological approach, this paper describes how women with breast cancer experience the giving and receiving of social support in a computer-mediated context. Women viewed their experiences with the computer-mediated support group as an additional and unique source of support in facing their illness. Anonymity within the support group fostered equalized participation and allowed women to communicate in ways that would have been more difficult in a face-to-face context. The asynchronous communication was a frustration to some participants, but some indicated that the format allowed for more thoughtful interaction. Motivations for seeking social support appeared to be a dynamic process, with a consistent progression from a position of receiving support to that of giving support. The primary benefits women received from participation in the group were communicating with other people who shared similar problems and helping others, which allowed them to change their focus from a preoccupation with their own sickness to thinking of others. Consistent with past research is the finding that women in this study expressed that social support is a multidimensional phenomenon and that their computer-mediated support group provided abundant emotional support, encouragement, and informational support. Excerpts from the phenomenological interviews are used to review and highlight key theoretical concepts from the research literatures on computer-mediated communication, social support, and the psychosocial needs of women with breast cancer.

  9. eCodonOpt: a systematic computational framework for optimizing codon usage in directed evolution experiments

    OpenAIRE

    Moore, Gregory L.; Maranas, Costas D.

    2002-01-01

    We present a systematic computational framework, eCodonOpt, for designing parental DNA sequences for directed evolution experiments through codon usage optimization. Given a set of homologous parental proteins to be recombined at the DNA level, the optimal DNA sequences encoding these proteins are sought for a given diversity objective. We find that the free energy of annealing between the recombining DNA sequences is a much better descriptor of the extent of crossover formation than sequence...

  10. Neural chips, neural computers and application in high and superhigh energy physics experiments

    International Nuclear Information System (INIS)

    Nikityuk, N.M.; )

    2001-01-01

    The architectural peculiarities and characteristics of a series of neural chips and neural computers used in scientific instruments are considered. Trends in their development and use in high-energy and super-high-energy physics experiments are described. Comparative data are given that characterize the efficient use of neural chips for the selection of useful events, the classification of elementary particles, the reconstruction of tracks of charged particles, and the search for the hypothesized Higgs particle. The characteristics of native neural chips and accelerated neural boards are considered [ru]

  11. Basic research and 12 years of clinical experience in computer-assisted navigation technology: a review.

    Science.gov (United States)

    Ewers, R; Schicho, K; Undt, G; Wanschitz, F; Truppe, M; Seemann, R; Wagner, A

    2005-01-01

    Computer-aided surgical navigation technology is commonly used in craniomaxillofacial surgery. It offers substantial improvement regarding esthetic and functional aspects in a range of surgical procedures. Based on augmented reality principles, where the real operative site is merged with computer-generated graphic information, computer-aided navigation systems were employed, among other procedures, in dental implantology, arthroscopy of the temporomandibular joint, osteotomies, distraction osteogenesis, image-guided biopsies and removals of foreign bodies. The decision to perform a procedure with or without computer-aided intraoperative navigation depends on the expected benefit to the procedure as well as on the technical expenditure necessary to achieve that goal. This paper comprises the experience gained in 12 years of research, development and routine clinical application. One hundred and fifty-eight operations with successful application of surgical navigation technology--divided into five groups--are evaluated regarding the criteria "medical benefit" and "technical expenditure" necessary to perform these procedures. Our results indicate that the medical benefit is likely to outweigh the expenditure of technology with few exceptions (calvaria transplant, resection of the temporal bone, reconstruction of the orbital floor). Especially in dental implantology, specialized software reduces time and additional costs necessary to plan and perform procedures with computer-aided surgical navigation.

  12. An Analysis of Creative Process Learning in Computer Game Activities Through Player Experiences

    Directory of Open Access Journals (Sweden)

    Wilawan Inchamnan

    2016-09-01

    Full Text Available This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning outcomes is described. Creative components were measured by examining task motivation and domain-relevant and creativity-relevant skill factors. The research approach applied heuristic checklists in the field of gameplay to analyze the stages of player activities involved in the performance of the task and to examine player experiences with the Player Experience of Need Satisfaction (PENS) survey. Player experiences were influenced by competency, autonomy, intuitive controls, relatedness and presence. This study examines the impact of these activities on the player experience in order to evaluate learning outcomes through school records. The study is designed to better understand the creative potential of people who are engaged in learning knowledge and skills while playing video games during the course. The findings show how creative potential emerged to yield levels of creative performance within gameplay activities that support learning. The anticipated outcome is knowledge of how video games foster creative thinking, summarized in an overview of the Creative Potential of Learning Model (CPLN). CPLN clearly describes the interrelationships between principles of learning and creative potential; the interpretation of the results is indispensable.

  13. SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments

    Science.gov (United States)

    Leonard, R. F.

    1977-01-01

    A program for a PDP-15 computer is presented which provides for control and analysis of trace element determinations by using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires locating peaks in X-ray spectra, determining the intensities of peaks, identifying the origins of peaks, and determining the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.
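
    A minimal modern sketch of the peak-location and intensity step described above is given below, using SciPy on a one-dimensional spectrum; the input file, prominence threshold and linear background estimate are illustrative assumptions, and the original PDP-15 program of course works quite differently:

```python
# Locate peaks in an X-ray fluorescence spectrum and estimate net intensities.
import numpy as np
from scipy.signal import find_peaks

counts = np.loadtxt("spectrum.txt")        # hypothetical 1D array of channel counts

peaks, props = find_peaks(counts, prominence=50, width=3)

for p, left, right in zip(peaks, props["left_bases"], props["right_bases"]):
    region = counts[left:right + 1]
    # crude linear background under the peak, drawn between the region end points
    background = np.linspace(counts[left], counts[right], region.size)
    net = float((region - background).sum())
    print(f"peak at channel {p}: net intensity ~ {net:.0f} counts")
```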

  14. Estimation of subcriticality with the computed values analysis using MCNP of experiment on coupled cores

    International Nuclear Information System (INIS)

    Sakurai, Kiyoshi; Yamamoto, Toshihiro; Arakawa, Takuya; Naito, Yoshitaka

    1998-01-01

    Experiments on coupled cores performed at TCA were analysed using the continuous-energy Monte Carlo calculation code MCNP 4A. Errors in the neutron multiplication factors are evaluated using the Indirect Bias Estimation Method proposed by the authors. A calculation simulating the pulsed neutron method was performed for the 17 x 17 + 5G + 17 x 17 core system, and calculations simulating the exponential experiment method were performed for the 16 x 9 + 3G + 16 x 9 and 16 x 9 + 5G + 16 x 9 core systems. Errors in the neutron multiplication factors are estimated to be (-1.5) - (-0.6)% when evaluated by the Indirect Bias Estimation Method. The errors evaluated by the conventional pulsed neutron method and the exponential experiment method are estimated to be about 7%, but they fall below 1% when the subcriticality is estimated from the computed values by applying the Indirect Bias Estimation Method. The feasibility of subcriticality management is increased by applying the method to a full-scale fuel storage facility. (author)

  15. Virtual machines & volunteer computing: Experience from LHC@Home: Test4Theory project

    CERN Document Server

    Lombraña González, Daniel; Blomer, Jakob; Buncic, Predrag; Harutyunyan, Artem; Marquina, Miguel; Segal, Ben; Skands, Peter; Karneyeu, Anton

    2012-01-01

    Volunteer desktop grids are nowadays becoming more and more powerful thanks to improved high-end components: multi-core CPUs, larger RAM memories and hard disks, better network connectivity and bandwidth, etc. As a result, desktop grid systems can run more complex experiments or simulations, but some problems remain: the heterogeneity of hardware architectures and software (library dependencies, code length, big repositories, etc.) makes it very difficult for researchers and developers to deploy and maintain a software stack for all the available platforms. In this paper, the employment of virtualization is shown to be the key to solving these problems. It provides a homogeneous layer allowing researchers to focus their efforts on running their experiments. Inside virtual custom execution environments, researchers can control and deploy very complex experiments or simulations running on heterogeneous grids of high-end computers. The following work presents the latest results from CERN’s LHC@home Test4Theory p...

  16. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Science.gov (United States)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2017-11-01

    Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other "post-selection" is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell's theorem which states that this is impossible. The failure of Bell's theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.
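
    To make concrete what a local, event-by-event model with purely local photon identification can look like, a schematic simulation is sketched below: each station assigns its outcome and a time tag using only its own setting and the shared hidden variable, and pairs are then identified by a coincidence window. The functional forms and parameters are illustrative choices in the spirit of published discrete-event models, not the authors' exact model; for sufficiently small windows the identified pairs can yield a CHSH value above 2:

```python
# Schematic local (non-quantum) event-by-event model with time-window
# pair identification, in the spirit of discrete-event EPRB simulations.
import numpy as np

rng = np.random.default_rng(1)

def correlation(theta_a, theta_b, n=1_000_000, d=4.0, window=0.001):
    lam = rng.uniform(0.0, np.pi, n)                  # shared hidden polarization

    def station(theta):
        x = np.sign(np.cos(2 * (theta - lam)))        # local +/-1 outcome
        x[x == 0] = 1
        # local time delay depending only on the local setting and lam
        t = rng.uniform(0.0, 1.0, n) * np.abs(np.sin(2 * (theta - lam))) ** d
        return x, t

    a, ta = station(theta_a)
    b, tb = station(theta_b)
    coinc = np.abs(ta - tb) < window                  # local pair identification
    return float(np.mean(a[coinc] * b[coinc]))

settings = [(0, np.pi / 8), (0, 3 * np.pi / 8),
            (np.pi / 4, np.pi / 8), (np.pi / 4, 3 * np.pi / 8)]
E = [correlation(*ab) for ab in settings]
S = abs(E[0] - E[1]) + abs(E[2] + E[3])
print("E:", [f"{e:+.3f}" for e in E], " S =", round(S, 3))   # S > 2 for small windows
```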

  17. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Directory of Open Access Journals (Sweden)

    De Raedt Hans

    2017-11-01

    Full Text Available Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other “post-selection” is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell’s theorem which states that this is impossible. The failure of Bell’s theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  18. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    International Nuclear Information System (INIS)

    Varela Rodriguez, F

    2011-01-01

    The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors and troubleshoot such a large system. Although the monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, it is only recently that the software package has been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and has already proven to be very efficient at optimizing the running systems and detecting misbehaving processes or nodes.
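
    As a small illustration of the kind of node-level information such a WMI client gathers, a minimal Python sketch using the third-party wmi package is shown below. The host name, credentials, queried classes and process name are illustrative assumptions; the tool described in the paper is integrated with the experiments' SCADA software rather than being a standalone script, and the sketch must be run from a Windows machine with pywin32 installed:

```python
# Minimal sketch: query a Windows node over WMI for CPU load, disk usage and
# a specific process -- the kind of data a centralized monitor would collect.
import wmi   # third-party package wrapping the Windows WMI/COM interface

node = wmi.WMI(computer="pc-ctrl-01",                 # hypothetical node name
               user="monitor", password="***")        # illustrative credentials

for cpu in node.Win32_Processor():
    print("CPU load %:", cpu.LoadPercentage)

for disk in node.Win32_LogicalDisk(DriveType=3):      # local disks only
    free_gb = int(disk.FreeSpace) / 1e9
    size_gb = int(disk.Size) / 1e9
    print(f"{disk.DeviceID} {free_gb:.1f} GB free of {size_gb:.1f} GB")

for proc in node.Win32_Process(Name="winccoa.exe"):   # hypothetical process name
    print("running:", proc.Name, "pid", proc.ProcessId)
```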

  19. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    Science.gov (United States)

    Varela Rodriguez, F.

    2011-12-01

    The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors and troubleshoot such a large system. Although the monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, it is only recently that the software package has been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and has already proven to be very efficient at optimizing the running systems and detecting misbehaving processes or nodes.

  20. The effects of nutrition labeling on consumer food choice: a psychological experiment and computational model.

    Science.gov (United States)

    Helfer, Peter; Shultz, Thomas R

    2014-12-01

    The widespread availability of calorie-dense food is believed to be a contributing cause of an epidemic of obesity and associated diseases throughout the world. One possible countermeasure is to empower consumers to make healthier food choices with useful nutrition labeling. An important part of this endeavor is to determine the usability of existing and proposed labeling schemes. Here, we report an experiment on how four different labeling schemes affect the speed and nutritional value of food choices. We then apply decision field theory, a leading computational model of human decision making, to simulate the experimental results. The psychology experiment shows that quantitative, single-attribute labeling schemes have greater usability than multiattribute and binary ones, and that they remain effective under moderate time pressure. The computational model simulates these psychological results and provides explanatory insights into them. This work shows how experimental psychology and computational modeling can contribute to the evaluation and improvement of nutrition-labeling schemes. © 2014 New York Academy of Sciences.
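
    To illustrate the modeling approach in miniature, a stripped-down sequential-sampling sketch in the spirit of decision field theory is given below: attention switches stochastically between attributes (here taste and healthiness), preferences accumulate attribute-weighted valences with leakage and noise, and the first option to reach a threshold is chosen. The two-attribute setup and all numbers are illustrative assumptions, not parameters estimated in the study:

```python
# Stripped-down decision-field-theory-style choice simulation:
# stochastic attention over attributes, leaky noisy accumulation to a threshold.
import numpy as np

rng = np.random.default_rng(0)

# rows = food options, columns = subjective attribute values (taste, healthiness)
M = np.array([[0.9, 0.2],        # tasty but unhealthy option
              [0.5, 0.8]])       # healthier but less tasty option
w_attention = [0.7, 0.3]         # probability of attending to taste vs. health
decay, noise_sd, threshold = 0.02, 0.05, 1.0

def simulate_choice(max_steps=5000):
    p = np.zeros(M.shape[0])                      # preference states
    for step in range(1, max_steps + 1):
        attr = rng.choice(2, p=w_attention)       # momentary attention
        valence = M[:, attr] - M[:, attr].mean()  # contrast against the other option
        p = (1 - decay) * p + valence + rng.normal(0, noise_sd, p.size)
        if p.max() >= threshold:
            return int(p.argmax()), step
    return int(p.argmax()), max_steps

choices, times = zip(*(simulate_choice() for _ in range(1000)))
print("choice proportions:", np.bincount(choices, minlength=2) / 1000)
print("mean decision time :", np.mean(times), "steps")
```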

  1. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation?

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers. (orig.)

  2. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers

  3. Enabling the ATLAS Experiment at the LHC for High Performance Computing

    CERN Document Server

    AUTHOR|(CDS)2091107; Ereditato, Antonio

    In this thesis, I studied the feasibility of running computer data analysis programs from the Worldwide LHC Computing Grid, in particular large-scale simulations of the ATLAS experiment at the CERN LHC, on current general purpose High Performance Computing (HPC) systems. An approach for integrating HPC systems into the Grid is proposed, which has been implemented and tested on the „Todi” HPC machine at the Swiss National Supercomputing Centre (CSCS). Over the course of the test, more than 500000 CPU-hours of processing time have been provided to ATLAS, which is roughly equivalent to the combined computing power of the two ATLAS clusters at the University of Bern. This showed that current HPC systems can be used to efficiently run large-scale simulations of the ATLAS detector and of the detected physics processes. As a first conclusion of my work, one can argue that, in perspective, running large-scale tasks on a few large machines might be more cost-effective than running on relatively small dedicated com...

  4. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  5. Cloud Computing Technologies in Writing Class: Factors Influencing Students’ Learning Experience

    Directory of Open Access Journals (Sweden)

    Jenny WANG

    2017-07-01

    Full Text Available The proposed interactive online group within cloud computing technologies, as the main contribution of this paper, provides easy and simple access to the cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and the teacher in after-class group writing assignment activities. Therefore, this study addresses the implementation of the most commonly used cloud application, Google Docs, in a higher education course. The learning environment integrating Google Docs, which students used to develop and deploy writing assignments between classes, was subjected to a learning experience assessment. Based on a questionnaire administered to the study participants (n=28), the system provided an effective learning environment between classes for the students and the instructor to stay connected. Factors influencing students' learning experience with cloud applications include the frequency of online interaction and students' technology experience. Suggestions for coping with challenges regarding their use in higher education, including technical issues, are also presented. Educators are therefore encouraged to embrace cloud computing technologies as they design course curricula, in the hope of effectively enriching students' learning.

  6. Use of VME computers for the data acquisition system of the PHOENICS experiment

    International Nuclear Information System (INIS)

    Zucht, B.

    1989-10-01

    The data acquisition program PHON (PHOENICS ONLINE) for the PHOENICS experiment at the stretcher ring ELSA in Bonn is described. PHON is based on a fast parallel CAMAC readout with special VME front-end processors (VIP) and a VAX computer, allowing comfortable control and programming. Special tools have been developed to facilitate the implementation of user programs. The PHON compiler allows the user to specify, in a simple language, the arrangement of the CAMAC modules to be read out for each event (camaclist). The camaclist is translated into 68000 assembly and runs on the front-end processors, making high data rates possible. User programs for monitoring and control of the experiment normally require low data rates and therefore run on the VAX computer. CAMAC operations are supported by the PHON CAMAC library. For graphic representation of the data, the CERN standard program libraries HBOOK and PAW are used. The data acquisition system is very flexible and can be easily adapted to different experiments. (orig.)

  7. FOREIGN AND DOMESTIC EXPERIENCE OF INTEGRATING CLOUD COMPUTING INTO PEDAGOGICAL PROCESS OF HIGHER EDUCATIONAL ESTABLISHMENTS

    Directory of Open Access Journals (Sweden)

    Nataliia A. Khmil

    2016-01-01

    Full Text Available In the present article foreign and domestic experience of integrating cloud computing into the pedagogical process of higher educational establishments (H.E.E.) has been generalized. It has been stated that nowadays a lot of educational services are hosted in the cloud, e.g. infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). The peculiarities of implementing cloud technologies by H.E.E. in Ukraine and abroad have been singled out; the products developed by the leading IT companies for using cloud computing in the higher education system, such as Microsoft for Education, Google Apps for Education and Amazon AWS Educate, have been reviewed. Examples of concrete types, methods and forms of learning and research work based on cloud services have been provided.

  8. Experiments for the validation of computer codes used to assess the protection factors afforded by dwellings

    International Nuclear Information System (INIS)

    Le Grand, J.; Roux, Y.; Kerlau, G.

    1988-09-01

    Two experimental campaigns were carried out to verify: 1) the method of assessing the mean kerma in a household used in the computer code BILL, which calculates the protection factor afforded by dwellings; 2) under what conditions the kerma calculated in cubic meshes of a given size (code PIECE) agreed with TLD measurements. To that purpose, a house was built near the caesium 137 source of the Ecosystem irradiator located at the Cadarache Nuclear Research Center. During the first campaign, four experiments with different house characteristics were conducted. Some 50 TLD locations describing the inhabitable volume were defined in order to obtain the mean kerma; 16 locations were considered outside the house. During the second campaign a cobalt 60 source was installed on the side. Only five measurement locations were defined, each with 6 TLDs. The results of the dosimetric measurements are presented and compared with the calculations of the two computer codes. The effects of wall heterogeneity were also studied [fr

  9. In the land of the dinosaurs, how to survive experience with building of midrange computing cluster

    Energy Technology Data Exchange (ETDEWEB)

    Chevel, A E [Petersburg Nuclear Physics Institute, Gatchina (Russian Federation); Lauret, J [SUNY at Stony Brook (United States)

    2001-07-01

    The authors discuss how to put into operation a midrange computing cluster for the Nuclear Chemistry Group (NCG) of the State University of New York at Stony Brook (SUNY-SB). The NCG is one of the collaborators within the RHIC/Phenix experiment located at the Brookhaven National Laboratory (BNL). The Phenix detector system produces about half a PB (or 500 TB) of data a year, and our goal was to provide this remote collaborating facility with the means to be part of the analysis process. The computing installation was put into operation at the beginning of the year 2000. The cluster consists of 32 peripheral machines running under Linux and a central Alpha 4100 server running Digital Unix 4.0f (now known as Tru64 UNIX). The realization process is under discussion.

  10. In the land of the dinosaurs, how to survive experience with building of midrange computing cluster

    International Nuclear Information System (INIS)

    Chevel, A.E.; Lauret, J.

    2001-01-01

    The authors discuss how to put into operation a midrange computing cluster for the Nuclear Chemistry Group (NCG) of the State University of New York at Stony Brook (SUNY-SB). The NCG is one of the collaborators within the RHIC/Phenix experiment located at the Brookhaven National Laboratory (BNL). The Phenix detector system produces about half a PB (or 500 TB) of data a year, and our goal was to provide this remote collaborating facility with the means to be part of the analysis process. The computing installation was put into operation at the beginning of the year 2000. The cluster consists of 32 peripheral machines running under Linux and a central Alpha 4100 server running Digital Unix 4.0f (now known as Tru64 UNIX). The realization process is under discussion

  11. Electronics, trigger, data acquisition, and computing working group on future B physics experiments

    International Nuclear Information System (INIS)

    Geer, S.

    1993-01-01

    Electronics, trigger, data acquisition, and computing: this is a very broad list of topics. Nevertheless, in a modern particle physics experiment one thinks in terms of a data pipeline in which the front-end electronics, the trigger and data acquisition, and the offline reconstruction are linked together. In designing any piece of this pipeline it is necessary to understand the bigger picture of the data flow, data rates and volume, and the input rate, output rate, and latencies for each part of the pipeline. All of this needs to be developed with a clear understanding of the requirements imposed by the physics goals of the experiment: the signal efficiencies, background rates, and the amount of recorded information that needs to be propagated through the pipeline to select and analyse the events of interest. The technology needed to meet the demanding high data volume needs of the next round of B physics experiments appears to be available, now or within a couple of years. This seems to be the case for both fixed target and collider B physics experiments. Although there are many differences between the various data pipelines that are being proposed, there are also striking similarities. All experiments have a multi-level trigger scheme (most have levels 1, 2, and 3) where the final level consists of a computing farm that can run offline-type code and reduce the data volume by a factor of a few. Finally, the ability to reconstruct large data volumes offline in a reasonably short time, and to make large data volumes available to many physicists for analysis, imposes severe constraints on the foreseen data pipelines, and introduces a significant uncertainty in evaluating the various approaches proposed

  12. Robust flow stability: Theory, computations and experiments in near wall turbulence

    Science.gov (United States)

    Bobba, Kumar Manoj

    Helmholtz established the field of hydrodynamic stability with his pioneering work in 1868. From then on, hydrodynamic stability became an important tool in understanding various fundamental fluid flow phenomena in engineering (mechanical, aeronautics, chemical, materials, civil, etc.) and science (astrophysics, geophysics, biophysics, etc.), and turbulence in particular. However, there are many discrepancies between classical hydrodynamic stability theory and experiments. In this thesis, the limitations of traditional hydrodynamic stability theory are shown and a framework for robust flow stability theory is formulated. A host of new techniques like gramians, singular values, operator norms, etc. are introduced to understand the role of various kinds of uncertainty. An interesting feature of this framework is the close interplay between theory and computations. It is shown that a subset of the Navier-Stokes equations is globally nonlinearly stable for all Reynolds numbers. Yet, invoking this new theory, it is shown that these equations produce structures (vortices and streaks) as seen in the experiments. The experiments are done in a zero-pressure-gradient transitional boundary layer on a flat plate in a free-surface tunnel. Digital particle image velocimetry and MEMS-based laser Doppler velocimeter and shear stress sensors have been used to make quantitative measurements of the flow. Various theoretical and computational predictions are in excellent agreement with the experimental data. A closely related topic of modeling, simulation and complexity reduction of large mechanics problems with multiple spatial and temporal scales is also studied. A method that rigorously quantifies the important scales and automatically gives models of the problem to various levels of accuracy is introduced. Computations done using spectral methods are presented.
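
    The gap between eigenvalue stability and observed behaviour that motivates this framework can be illustrated with a small non-modal-growth computation: for a non-normal matrix whose eigenvalues all have negative real part, the operator 2-norm of exp(At) (its largest singular value) can still grow transiently before decaying. The toy matrix below is an assumption chosen only to exhibit the effect; it is not taken from the thesis.

```python
# Toy illustration of non-modal (transient) growth: the matrix A is stable by
# eigenvalue analysis, yet the operator norm of exp(A t) -- its largest
# singular value -- grows before decaying. The matrix is illustrative only.
import numpy as np
from scipy.linalg import expm

A = np.array([[-0.01, 1.0],     # strongly non-normal; eigenvalues -0.01 and -0.02
              [ 0.0, -0.02]])
print("eigenvalues:", np.linalg.eigvals(A))   # both have negative real part

for t in [0.0, 10.0, 50.0, 100.0, 500.0]:
    growth = np.linalg.norm(expm(A * t), 2)   # maximum amplification at time t
    print(f"t = {t:6.1f}   ||exp(At)||_2 = {growth:8.3f}")
```

    The norm rises to roughly 24 around t = 50 before decaying, even though every eigenmode decays monotonically; this is the kind of behaviour classical modal analysis misses.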

  13. Emergent Power-Law Phase in the 2D Heisenberg Windmill Antiferromagnet: A Computational Experiment

    Science.gov (United States)

    Jeevanesan, Bhilahari; Chandra, Premala; Coleman, Piers; Orth, Peter P.

    2015-10-01

    In an extensive computational experiment, we test Polyakov's conjecture that under certain circumstances an isotropic Heisenberg model can develop algebraic spin correlations. We demonstrate the emergence of a multispin U(1) order parameter in a Heisenberg antiferromagnet on interpenetrating honeycomb and triangular lattices. The correlations of this relative phase angle are observed to decay algebraically at intermediate temperatures in an extended critical phase. Using finite-size scaling we show that both phase transitions are of the Berezinskii-Kosterlitz-Thouless type, and at lower temperatures we find long-range Z6 order.

  14. FELIX experiments and computational needs for eddy current analysis of fusion reactors

    International Nuclear Information System (INIS)

    Turner, L.R.

    1984-01-01

    In a fusion reactor, changing magnetic fields are closely coupled to the electrically-conducting metal structure. This coupling is particularly pronounced in a tokamak reactor in which magnetic fields are used to confine, stabilize, drive, and heat the plasma. Electromagnetic effects in future fusion reactors will have far-reaching implications in the configuration, operation, and maintenance of the reactors. This paper describes the impact of eddy-current effects on future reactors, the requirements of computer codes for analyzing those effects, and the FELIX experiments which will provide needed data for code validation

  15. A simple computational model for the analysis of 2-D solute migration experiments

    International Nuclear Information System (INIS)

    Villar, Heldio Pereira

    1996-01-01

    A preliminary model for the simulation of 2-D migration patterns is presented. This computer model adopts a novel approach to the solution of the advection-dispersion equation in two dimensions through finite differences. The soil column is divided into a number of thin columns. The 1-D advection-dispersion equation is applied in the direction of flow and, using the same time increment, the 1-D diffusion equation is applied perpendicularly to the flow. The results thus obtained were compared to those of two migration experiments with two different soils. (author)
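
    The splitting strategy described above can be sketched in a few lines: each time step applies an explicit 1-D advection-dispersion update along the flow direction, then an explicit 1-D diffusion update transverse to it, with the same time increment. The grid, parameters, source placement and boundary handling below are illustrative assumptions, not the author's code.

```python
# Sketch of the operator-splitting idea described in the abstract: each time
# step applies an explicit 1-D advection-dispersion update along the flow
# direction (x), then an explicit 1-D diffusion update transverse to it (y).
# Grid, parameters and boundary handling are illustrative only.
import numpy as np

nx, ny = 100, 40
dx = dy = 0.01          # m
v = 1.0e-4              # pore-water velocity in x (m/s)
DL, DT = 1.0e-6, 2.0e-7 # longitudinal and transverse dispersion coefficients (m^2/s)

# Explicit stability: dt limited by the diffusion and advection criteria.
dt = 0.25 * min(dx * dx / DL, dy * dy / DT, dx / v)

C = np.zeros((nx, ny))
C[0, ny // 2] = 1.0     # constant-concentration source at the inlet mid-plane

def step(C):
    Cn = C.copy()
    # 1-D advection-dispersion along x (upwind advection, central dispersion)
    Cn[1:-1, :] += dt * (
        DL * (C[2:, :] - 2.0 * C[1:-1, :] + C[:-2, :]) / dx**2
        - v * (C[1:-1, :] - C[:-2, :]) / dx
    )
    # 1-D diffusion along y, applied with the same time step
    Cn[:, 1:-1] += dt * DT * (Cn[:, 2:] - 2.0 * Cn[:, 1:-1] + Cn[:, :-2]) / dy**2
    return Cn

for _ in range(300):
    C = step(C)
print("concentration at mid-domain monitoring node:", C[nx // 2, ny // 2])
```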

  16. Analysis of RELAP/SCDAPSIM/MOD3.2 Computer Code using QUENCH Experiments

    International Nuclear Information System (INIS)

    Honaiser, Eduardo; Anghaie, Samim

    2004-01-01

    The QUENCH-01/06 experiments were modelled using the RELAP5/SCDAPSIM MOD3.2(bd) computer code. The results obtained from these models were compared to the experimental data to evaluate the code performance. The experiments were performed at the Forschungszentrum Karlsruhe (FZK), Germany. The objective of the experimental program was the investigation of core behaviour during a severe accident, focusing on rod cladding overheating due to zirconium oxidation at high temperatures and due to the strong thermal gradient developed when the nuclear reactor core is flooded as part of an accident management measure. Temperature histories and hydrogen production were compared. Molecular hydrogen is a product of the oxidation reaction, serving as a parameter to measure the extent of the oxidation reaction. After some model adjustments, good predictions were possible. The temperature and hydrogen production parameters remained inside the uncertainty envelope for most of the transient time. (authors)
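
    The hydrogen used as a comparison parameter comes from the cladding oxidation reaction Zr + 2 H2O → ZrO2 + 2 H2. The short sketch below only works out that stoichiometry, with an arbitrary oxidized mass as input; it says nothing about the oxidation kinetics modelled in RELAP5/SCDAPSIM.

```python
# Stoichiometric sketch of the quantity compared in the abstract: hydrogen
# produced by the cladding oxidation reaction Zr + 2 H2O -> ZrO2 + 2 H2.
# The oxidized mass below is an arbitrary illustrative input, not QUENCH data.
M_ZR = 91.22e-3   # kg/mol
M_H2 = 2.016e-3   # kg/mol

def hydrogen_from_oxidized_zr(m_zr_oxidized_kg):
    """Hydrogen mass (kg) released when m_zr_oxidized_kg of Zr is fully oxidized."""
    mol_zr = m_zr_oxidized_kg / M_ZR
    mol_h2 = 2.0 * mol_zr          # two moles of H2 per mole of Zr
    return mol_h2 * M_H2

# Example: 1 kg of oxidized cladding corresponds to roughly 44 g of hydrogen.
print(f"{hydrogen_from_oxidized_zr(1.0) * 1e3:.1f} g H2 per kg Zr oxidized")
```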

  17. A distributed, graphical user interface based, computer control system for atomic physics experiments.

    Science.gov (United States)

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  18. A distributed, graphical user interface based, computer control system for atomic physics experiments

    Science.gov (United States)

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  19. Consideration of turbulent deposition in aerosol behaviour modelling with the CONTAIN code and comparison of the computations to sodium release experiments

    International Nuclear Information System (INIS)

    Jonas, R.

    1988-09-01

    CONTAIN is a computer code to analyze the physical, chemical and radiological processes inside the reactor containment in the sequence of a severe reactor accident. Modelling of the aerosol behaviour is included. We have improved the code by implementing a subroutine for turbulent deposition of aerosols. In contrast to previous calculations in which this effect was neglected, the computed results are in good agreement with sodium release experiments. If a typical friction velocity of 1 m/s is chosen, the computed aerosol mass median diameters and aerosol mass concentrations agree with the experimental results within a factor of 1.5 or 2, respectively. We have also found good agreement between the CONTAIN calculations and results from other aerosol codes. (orig.) [de
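
    For orientation, turbulent deposition models of this kind are often expressed as a dimensionless deposition velocity that depends on the dimensionless particle relaxation time and the friction velocity. The sketch below uses the widely quoted Liu-Agarwal form V+ = 6e-4 (tau+)^2 with a cap in the inertia-moderated regime; whether the CONTAIN subroutine uses this particular correlation is an assumption, and the particle and gas properties are placeholders.

```python
# Illustrative sketch of a turbulent deposition velocity estimate using a
# Liu-Agarwal-type correlation V+ = 6e-4 * (tau+)^2, capped in the
# inertia-moderated regime. This is NOT necessarily the model implemented in
# CONTAIN; particle and gas properties below are placeholders.
def deposition_velocity(d_p, rho_p, u_star, rho_g=1.2, mu_g=1.8e-5):
    """Turbulent deposition velocity (m/s) for a particle of diameter d_p (m)."""
    tau_p = rho_p * d_p**2 / (18.0 * mu_g)          # particle relaxation time (s)
    nu_g = mu_g / rho_g                              # gas kinematic viscosity (m^2/s)
    tau_plus = tau_p * u_star**2 / nu_g              # dimensionless relaxation time
    v_plus = min(6.0e-4 * tau_plus**2, 0.1)          # diffusion-impaction / inertia regimes
    return v_plus * u_star

# Example with the friction velocity quoted in the abstract (1 m/s) and a
# hypothetical 1-micron sodium-compound aerosol particle.
print(deposition_velocity(d_p=1.0e-6, rho_p=2000.0, u_star=1.0))
```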

  20. Recent Evolution of the Offline Computing Model of the NOvA Experiment

    Science.gov (United States)

    Habig, Alec; Norman, A.

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. The current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.

  1. Recent Evolution of the Offline Computing Model of the NOvA Experiment

    International Nuclear Information System (INIS)

    Habig, Alec; Group, Craig; Norman, A.

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. The current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics. (paper)

  2. Results from the First Two Flights of the Static Computer Memory Integrity Testing Experiment

    Science.gov (United States)

    Hancock, Thomas M., III

    1999-01-01

    This paper details the scientific objectives, experiment design, data collection method, and post-flight analysis following the first two flights of the Static Computer Memory Integrity Testing (SCMIT) experiment. SCMIT is designed to detect soft-event upsets in passive magnetic memory. A soft-event upset is a change in the logic state of active or passive forms of magnetic memory, commonly referred to as a "bitflip". In its mildest form, a soft-event upset can cause software exceptions, unexpected events, trigger spacecraft safing (ending data collection), or corrupt fault protection and error recovery capabilities. In its most severe form, loss of mission or spacecraft can occur. Analysis after the first flight (in 1991 during STS-40) identified possible soft-event upsets in 25% of the experiment detectors. Post-flight analysis after the second flight (in 1997 on STS-87) failed to find any evidence of soft-event upsets. The SCMIT experiment is currently scheduled for a third flight in December 1999 on STS-101.

  3. Bayesian model calibration of computational models in velocimetry diagnosed dynamic compression experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hund, Lauren [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    Dynamic compression experiments are being performed on complicated materials using increasingly complex drivers. The data produced in these experiments are beginning to reach a regime where traditional analysis techniques break down; requiring the solution of an inverse problem. A common measurement in dynamic experiments is an interface velocity as a function of time, and often this functional output can be simulated using a hydrodynamics code. Bayesian model calibration is a statistical framework to estimate inputs into a computational model in the presence of multiple uncertainties, making it well suited to measurements of this type. In this article, we apply Bayesian model calibration to high pressure (250 GPa) ramp compression measurements in tantalum. We address several issues specific to this calibration including the functional nature of the output as well as parameter and model discrepancy identifiability. Specifically, we propose scaling the likelihood function by an effective sample size rather than modeling the autocorrelation function to accommodate the functional output and propose sensitivity analyses using the notion of 'modularization' to assess the impact of experiment-specific nuisance input parameters on estimates of material properties. We conclude that the proposed Bayesian model calibration procedure results in simple, fast, and valid inferences on the equation of state parameters for tantalum.
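
    A minimal sketch of the likelihood-scaling idea described above: the pointwise Gaussian log-likelihood of an autocorrelated functional output is downweighted by n_eff/n rather than modelling the autocorrelation explicitly. The "simulator" and the data below are stand-ins, not the hydrocode or the ramp-compression measurements, and the n_eff estimator shown is only one simple choice.

```python
# Minimal sketch of the effective-sample-size idea: a pointwise Gaussian
# log-likelihood for an autocorrelated functional output is scaled by
# n_eff / n rather than modelling the autocorrelation explicitly.
# The forward model and data are stand-ins, not the hydrocode or experiment.
import numpy as np

def simulator(theta, t):
    """Hypothetical forward model: interface velocity as a function of time."""
    amplitude, rate = theta
    return amplitude * (1.0 - np.exp(-rate * t))

def effective_sample_size(residuals):
    """Crude n_eff from the lag-1 autocorrelation of the residuals."""
    r = residuals - residuals.mean()
    rho1 = np.corrcoef(r[:-1], r[1:])[0, 1]
    return len(r) * (1.0 - rho1) / (1.0 + rho1)

def scaled_loglike(theta, t, v_obs, sigma):
    resid = v_obs - simulator(theta, t)
    n = len(v_obs)
    n_eff = max(1.0, effective_sample_size(resid))
    pointwise = -0.5 * np.sum((resid / sigma) ** 2) - n * np.log(sigma)
    return (n_eff / n) * pointwise      # downweight correlated points

# Synthetic demonstration data (illustrative only).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
v_obs = simulator((3.0, 5.0), t) + rng.normal(0.0, 0.05, t.size)

for theta in [(3.0, 5.0), (2.5, 5.0), (3.0, 8.0)]:
    print(theta, round(scaled_loglike(theta, t, v_obs, sigma=0.05), 2))
```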

  4. Three-dimensional turbulent swirling flow in a cylinder: Experiments and computations

    International Nuclear Information System (INIS)

    Gupta, Amit; Kumar, Ranganathan

    2007-01-01

    Dynamics of the three-dimensional flow in a cyclone with tangential inlet and tangential exit were studied using particle tracking velocimetry (PTV) and a three-dimensional computational model. The PTV technique is described in this paper and appears to be well suited for the current flow situation. The flow was helical in nature and a secondary recirculating flow was observed and well predicted by computations using the RNG k-ε turbulence model. The secondary flow was characterized by a single vortex which circulated around the axis and occupied a large fraction of the cylinder diameter. The locus of the vortex center meandered around the cylinder axis, making one complete revolution for a cylinder aspect ratio of 2. Tangential velocities from both experiments and computations were compared and found to be in good agreement. The general structure of the flow does not vary significantly as the Reynolds number is increased. However, slight changes in all components of velocity and pressure were seen as the inlet velocity is increased. By increasing the inlet aspect ratio it was observed that the vortex meandering changed significantly

  5. Three-dimensional turbulent swirling flow in a cylinder: Experiments and computations

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, Amit [Department of Mechanical, Materials and Aerospace Engineering, University of Central Florida, Orlando, FL 32816 (United States); Kumar, Ranganathan [Department of Mechanical, Materials and Aerospace Engineering, University of Central Florida, Orlando, FL 32816 (United States)]. E-mail: rnkumar@mail.ucf.edu

    2007-04-15

    Dynamics of the three-dimensional flow in a cyclone with tangential inlet and tangential exit were studied using particle tracking velocimetry (PTV) and a three-dimensional computational model. The PTV technique is described in this paper and appears to be well suited for the current flow situation. The flow was helical in nature and a secondary recirculating flow was observed and well predicted by computations using the RNG k-ε turbulence model. The secondary flow was characterized by a single vortex which circulated around the axis and occupied a large fraction of the cylinder diameter. The locus of the vortex center meandered around the cylinder axis, making one complete revolution for a cylinder aspect ratio of 2. Tangential velocities from both experiments and computations were compared and found to be in good agreement. The general structure of the flow does not vary significantly as the Reynolds number is increased. However, slight changes in all components of velocity and pressure were seen as the inlet velocity is increased. By increasing the inlet aspect ratio it was observed that the vortex meandering changed significantly.

  6. Computed tomography-guided core-needle biopsy of lung lesions: an oncology center experience

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Marcos Duarte; Fonte, Alexandre Calabria da; Chojniak, Rubens, E-mail: marcosduarte@yahoo.com.b [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Radiology and Imaging Diagnosis; Andrade, Marcony Queiroz de [Hospital Alianca, Salvador, BA (Brazil); Gross, Jefferson Luiz [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Chest Surgery

    2011-03-15

    Objective: The present study is aimed at describing the experience of an oncology center with computed tomography-guided core-needle biopsy of pulmonary lesions. Materials and Methods: Retrospective analysis of 97 computed tomography-guided core-needle biopsies of pulmonary lesions performed between 1996 and 2004 in a Brazilian reference oncology center (Hospital do Cancer - A.C. Camargo). Information regarding material appropriateness and the specific diagnoses was collected and analyzed. Results: Among 97 lung biopsies, 94 (96.9%) supplied appropriate specimens for histological analysis, with 71 (73.2%) cases being diagnosed as malignant lesions and 23 (23.7%) diagnosed as benign lesions. Specimens were inappropriate for analysis in three cases. A specific diagnosis was obtained in 83 (85.6%) cases, with high rates for both malignant lesions, 63 (88.7%) cases, and benign lesions, 20 (86.7%) cases. As regards complications, a total of 12 cases were observed, as follows: 7 (7.2%) cases of hematoma, 3 (3.1%) cases of pneumothorax and 2 (2.1%) cases of hemoptysis. Conclusion: Computed tomography-guided core-needle biopsy of lung lesions demonstrated high rates of material appropriateness and diagnostic specificity, and low rates of complications in the present study. (author)

  7. EXPERIENCE WITH FPGA-BASED PROCESSOR CORE AS FRONT-END COMPUTER

    International Nuclear Information System (INIS)

    HOFF, L.T.

    2005-01-01

    The RHIC control system architecture follows the familiar "standard model". LINUX workstations are used as operator consoles. Front-end computers are distributed around the accelerator, close to equipment being controlled or monitored. These computers are generally based on VMEbus CPU modules running the VxWorks operating system. I/O is typically performed via the VMEbus, or via PMC daughter cards (via an internal PCI bus), or via on-board I/O interfaces (Ethernet or serial). Advances in FPGA size and sophistication now permit running virtual processor "cores" within the FPGA logic, including "cores" with advanced features such as memory management. Such systems offer certain advantages over traditional VMEbus front-end computers. Advantages include tighter coupling with FPGA logic, and therefore higher I/O bandwidth, and flexibility in packaging, possibly resulting in a lower noise environment and/or lower cost. This paper presents the experience acquired while porting the RHIC control system to a PowerPC 405 core within a Xilinx FPGA for use in low-level RF control

  8. Fabrication Improvement of Cold Forging Hexagonal Nuts by Computational Analysis and Experiment Verification

    Directory of Open Access Journals (Sweden)

    Shao-Yi Hsia

    2015-01-01

    Full Text Available Cold forging plays a critical role in fastener production and has been applied in the automobile, construction and aerospace industries as well as in consumer products, presenting opportunities for manufacturing a wide range of parts. Using computer simulation, this study analyzes the process of forming machine parts such as hexagonal nuts. The DEFORM-3D forming software is applied to analyze the process at its various stages, and compression tests are used to obtain the flow stress equation, in order to compare the experimental results with the equation built into the simulation software. At the same time, metallography and hardness measurements are used to understand the cold forging characteristics of hexagonal nuts. The results would help machinery businesses determine the forging load and forming conditions at the various stages before fastener formation. In addition to supporting proper die design and production planning, the quality of the produced hexagonal nuts would be more stable, promoting industrial competitiveness.
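
    To illustrate how a flow-stress equation can be derived from compression-test data, the sketch below fits the generic Hollomon form sigma = K * eps^n by log-log least squares. The stress-strain points are made-up placeholders, and this is not claimed to be the specific material model built into DEFORM-3D.

```python
# Generic sketch of deriving a flow-stress equation from compression-test
# data by fitting the Hollomon form  sigma = K * eps**n  in log-log space.
# The stress/strain points below are made-up placeholders, not measured data.
import numpy as np

strain = np.array([0.02, 0.05, 0.10, 0.20, 0.30, 0.50])      # true plastic strain
stress = np.array([310., 360., 410., 470., 505., 560.])      # true stress, MPa

n, log_K = np.polyfit(np.log(strain), np.log(stress), 1)     # slope n, intercept ln K
K = np.exp(log_K)
print(f"fitted flow stress: sigma = {K:.0f} * eps^{n:.3f}  (MPa)")

# The fitted curve can then be compared against the flow-stress model used by
# the forming simulation over the strain range of interest.
print(K * 0.15 ** n)   # predicted flow stress at 15% strain
```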

  9. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    International Nuclear Information System (INIS)

    Sartori, E.

    2001-01-01

    When studying applications in nuclear technology, we need to understand and be able to predict the behavior of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the results of the interplay of the large number of different basic events: i.e. the macroscopic effects. In order to build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: - the basic nuclear or chemical data; - the computer codes; and - the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available. The roles that the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC) and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers are also described. (author)

  10. Control and management unit for a computation platform at the PANDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Galuska, Martin; Gessler, Thomas; Kuehn, Wolfgang; Lang, Johannes; Lange, Jens Soeren; Liang, Yutie; Liu, Ming; Spruck, Bjoern; Wang, Qiang [II. Physikalisches Institut, Justus-Liebig-Universitaet Giessen (Germany)

    2010-07-01

    The FAIR facility will provide high-intensity antiproton and heavy ion beams for the PANDA and HADES experiments, leading to very high reaction rates. PANDA is expected to run at 10-20 MHz with a raw data output rate of up to 200 GB/s. A sophisticated data acquisition system is needed in order to select physically relevant events online. For this purpose a network of interconnected compute nodes can be used. Each compute node can be programmed to run various algorithms, such as online particle track recognition for high-level triggering. An ATCA communication shelf provides power, cooling and high-speed interconnections to up to 14 nodes. A single shelf manager supervises and regulates the power distribution and temperature inside the shelf. The shelf manager relies on a local control chip on each node to relay sensor read-outs, provide hardware addresses and power requirements, etc. An IPM controller based on an Atmel microcontroller was designed for this purpose, and a prototype was produced. The necessary software is being developed to allow local communication with the components of the compute node and remote communication with the shelf manager, conforming to the ATCA specification.

  11. Interpolation Environment of Tensor Mathematics at the Corpuscular Stage of Computational Experiments in Hydromechanics

    Science.gov (United States)

    Bogdanov, Alexander; Degtyarev, Alexander; Khramushin, Vasily; Shichkina, Yulia

    2018-02-01

    The stages of direct computational experiments in hydromechanics based on tensor mathematics tools are represented by conditionally independent mathematical models that separate the calculations according to the physical processes involved. The continual stage of numerical modeling is constructed on a small time interval in a stationary grid space; here, coordination of the continuity conditions and energy conservation is carried out. Then, at the subsequent corpuscular stage of the computational experiment, the kinematic parameters of the mass centers and the surface stresses at the boundaries of the grid cells are used in modeling the free unsteady motions of volume cells that are considered as independent particles. These particles can be subject to vortex and discontinuous interactions when restructuring of the free boundaries and internal rheological states takes place. The transition from one stage to another is provided by the interpolation operations of tensor mathematics. This interpolation environment formalizes the use of physical laws for modeling the mechanics of continuous media and provides control of the rheological state and the conditions for the existence of discontinuous solutions: rigid and free boundaries, vortex layers, and their turbulent or empirical generalizations.

  12. Unraveling the electrolyte properties of Na3SbS4 through computation and experiment

    Science.gov (United States)

    Rush, Larry E.; Hood, Zachary D.; Holzwarth, N. A. W.

    2017-12-01

    Solid-state sodium electrolytes are expected to improve next-generation batteries on the basis of favorable energy density and reduced cost. Na3SbS4 represents a new solid-state ion conductor with high ionic conductivities in the mS/cm range. Here, we explore the tetragonal phase of Na3SbS4 and its interface with a metallic sodium anode using a combination of experiments and first-principles calculations. The computed Na-ion vacancy migration energies of 0.1 eV are smaller than the value inferred from experiment, suggesting that grain boundaries or other factors dominate the experimental systems. Analysis of symmetric cells of the electrolyte (Na/Na3SbS4/Na) shows that a conductive solid electrolyte interphase forms. Computer simulations infer that the interface is likely to be related to Na3SbS3, involving the conversion of the tetrahedral SbS4^3- ions of the bulk electrolyte into trigonal pyramidal SbS3^3- ions at the interface.
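
    To show how a migration barrier of this size translates into an ionic conductivity, the sketch below performs a back-of-the-envelope hopping-diffusion and Nernst-Einstein estimate. The hop length, attempt frequency and carrier concentration are illustrative placeholders, not values taken from the paper, and the larger comparison barrier is hypothetical.

```python
# Back-of-the-envelope link between a Na-vacancy migration barrier and ionic
# conductivity via a hopping-diffusion / Nernst-Einstein estimate. Hop length,
# attempt frequency and carrier concentration are illustrative placeholders.
import math

kB_eV = 8.617e-5     # eV/K
kB_J = 1.381e-23     # J/K
e = 1.602e-19        # C

def conductivity(Ea_eV, T=300.0, a=3.5e-10, nu0=1.0e13, n_carrier=1.0e27):
    """sigma (S/m) from a simple 3-D hopping model: D = a^2 * nu0 * exp(-Ea/kT) / 6."""
    D = a * a * nu0 * math.exp(-Ea_eV / (kB_eV * T)) / 6.0    # m^2/s
    return n_carrier * e * e * D / (kB_J * T)                  # Nernst-Einstein

# Comparing the computed 0.1 eV barrier with a hypothetical larger effective
# barrier illustrates why grain boundaries or other factors are suspected of
# limiting the measured conductivity to the mS/cm range.
for Ea in (0.10, 0.25):
    print(f"Ea = {Ea:.2f} eV  ->  sigma ~ {conductivity(Ea):.2e} S/m")
```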

  13. O2: A novel combined online and offline computing system for the ALICE Experiment after 2018

    International Nuclear Information System (INIS)

    Ananya; Agrawal, N; Avasthi, A; Suaide, A Alarcon Do Passo; Prado, C Alves Garcia; Alt, T; Bach, M; Breitner, T; Aphecetche, L; Bala, R; Bhasin, A; Barnafoldi, G; Belikov, J; Bellini, F; Betev, L; Buncic, P; Carena, F; Carena, W; Chapeland, S; Barroso, V Chibante

    2014-01-01

    ALICE (A Large Ion Collider Experiment) is a detector dedicated to the studies with heavy ion collisions exploring the physics of strongly interacting nuclear matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shutdown of the LHC, the ALICE Experiment will be upgraded to make high precision measurements of rare probes at low pT, which cannot be selected with a trigger, and therefore require a very large sample of events recorded on tape. The online computing system will be completely redesigned to address the major challenge of sampling the full 50 kHz Pb-Pb interaction rate increasing the present limit by a factor of 100. This upgrade will also include the continuous un-triggered read-out of two detectors: ITS (Inner Tracking System) and TPC (Time Projection Chamber), producing a sustained throughput of 1 TB/s. This unprecedented data rate will be reduced by adopting an entirely new strategy where calibration and reconstruction are performed online, and only the reconstruction results are stored while the raw data are discarded. This system, already demonstrated in production on the TPC data since 2011, will be optimized for the online usage of reconstruction algorithms. This implies much tighter coupling between online and offline computing systems. An R and D program has been set up to meet this huge challenge. The object of this paper is to present this program and its first results.

  14. Computer Experiences, Self-Efficacy and Knowledge of Students Enrolled in Introductory University Agriculture Courses.

    Science.gov (United States)

    Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.

    1999-01-01

    Of 175 freshmen agriculture students, 74% had prior computer courses, 62% owned computers. The number of computer topics studied predicted both computer self-efficacy and computer knowledge. A substantial positive correlation was found between self-efficacy and computer knowledge. (SK)

  15. The Influence of Trainee Gaming Experience and Computer Self-Efficacy on Learner Outcomes of Videogame-Based Learning Environments

    National Research Council Canada - National Science Library

    Orvis, Karin A; Orvis, Kara L; Belanich, James; Mullin, Laura N

    2005-01-01

    .... The purpose of the current research was to investigate the influence of two trainee characteristics, prior videogame experience and computer self-efficacy, on learner outcomes of a videogame-based training environment...

  16. Research and Teaching: Computational Methods in General Chemistry--Perceptions of Programming, Prior Experience, and Student Outcomes

    Science.gov (United States)

    Wheeler, Lindsay B.; Chiu, Jennie L.; Grisham, Charles M.

    2016-01-01

    This article explores how integrating computational tools into a general chemistry laboratory course can influence student perceptions of programming and investigates relationships among student perceptions, prior experience, and student outcomes.

  17. Experiment Dashboard - a generic, scalable solution for monitoring of the LHC computing activities, distributed sites and services

    International Nuclear Information System (INIS)

    Andreeva, J; Cinquilli, M; Dieguez, D; Dzhunov, I; Karavakis, E; Karhula, P; Kenyon, M; Kokoszkiewicz, L; Nowotka, M; Ro, G; Saiz, P; Tuckett, D; Sargsyan, L; Schovancova, J

    2012-01-01

    The Experiment Dashboard system provides common solutions for monitoring job processing, data transfers and site/service usability. Over the last seven years, it proved to play a crucial role in the monitoring of the LHC computing activities, distributed sites and services. It has been one of the key elements during the commissioning of the distributed computing systems of the LHC experiments. The first years of data taking represented a serious test for Experiment Dashboard in terms of functionality, scalability and performance. And given that the usage of the Experiment Dashboard applications has been steadily increasing over time, it can be asserted that all the objectives were fully accomplished.

  18. More Ideas for Monitoring Biological Experiments with the BBC Computer: Absorption Spectra, Yeast Growth, Enzyme Reactions and Animal Behaviour.

    Science.gov (United States)

    Openshaw, Peter

    1988-01-01

    Presented are five ideas for A-level biology experiments using a laboratory computer interface. Topics investigated include photosynthesis, yeast growth, animal movements, pulse rates, and oxygen consumption and production by organisms. Includes instructions specific to the BBC computer system. (CW)

  19. Computer-based testing of the modified essay question: the Singapore experience.

    Science.gov (United States)

    Lim, Erle Chuen-Hian; Seet, Raymond Chee-Seong; Oh, Vernon M S; Chia, Boon-Lock; Aw, Marion; Quak, Seng-Hock; Ong, Benjamin K C

    2007-11-01

    The modified essay question (MEQ), featuring an evolving case scenario, tests a candidate's problem-solving and reasoning ability, rather than mere factual recall. Although it is traditionally conducted as a pen-and-paper examination, our university has run the MEQ using computer-based testing (CBT) since 2003. We describe our experience with running the MEQ examination using the IVLE, or integrated virtual learning environment (https://ivle.nus.edu.sg), provide a blueprint for universities intending to conduct computer-based testing of the MEQ, and detail how our MEQ examination has evolved since its inception. An MEQ committee, comprising specialists in key disciplines from the departments of Medicine and Paediatrics, was formed. We utilized the IVLE, developed for our university in 1998, as the online platform on which we ran the MEQ. We calculated the number of man-hours (academic and support staff) required to run the MEQ examination, using either a computer-based or pen-and-paper format. With the support of our university's information technology (IT) specialists, we have successfully run the MEQ examination online, twice a year, since 2003. Initially, we conducted the examination with short-answer questions only, but have since expanded the MEQ examination to include multiple-choice and extended matching questions. A total of 1268 man-hours was spent in preparing for, and running, the MEQ examination using CBT, compared to 236.5 man-hours to run it using a pen-and-paper format. Despite being more labour-intensive, our students and staff prefer CBT to the pen-and-paper format. The MEQ can be conducted using a computer-based testing scenario, which offers several advantages over a pen-and-paper format. We hope to increase the number of questions and incorporate audio and video files, featuring clinical vignettes, to the MEQ examination in the near future.

  20. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    Science.gov (United States)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We

  1. Enabling systematic, harmonised and large-scale biofilms data computation: the Biofilms Experiment Workbench.

    Science.gov (United States)

    Pérez-Rodríguez, Gael; Glez-Peña, Daniel; Azevedo, Nuno F; Pereira, Maria Olívia; Fdez-Riverola, Florentino; Lourenço, Anália

    2015-03-01

    Biofilms are receiving increasing attention from the biomedical community. Biofilm-like growth within the human body is considered one of the key microbial strategies to augment resistance and persistence during infectious processes. The Biofilms Experiment Workbench is a novel software workbench for the operation and analysis of biofilms experimental data. The goal is to promote the interchange and comparison of data among laboratories, providing systematic, harmonised and large-scale data computation. The workbench was developed with AIBench, an open-source Java desktop application framework for scientific software development in the domain of translational biomedicine. The implementation favours free and open-source third-party software, such as the R statistical package, and uses the Web services of the BiofOmics database to enable public experiment deposition. First, we summarise the novel, free, open, XML-based interchange format for encoding biofilms experimental data. Then, we describe the execution of common scenarios of operation with the new workbench, such as the creation of new experiments, the importation of data from Excel spreadsheets, the computation of analytical results, the on-demand and highly customised construction of Web publishable reports, and the comparison of results between laboratories. A considerable and varied amount of biofilms data is being generated, and there is a critical need to develop bioinformatics tools that expedite the interchange and comparison of microbiological and clinical results among laboratories. We propose a simple, open-source software infrastructure which is effective, extensible and easy to understand. The workbench is freely available for non-commercial use at http://sing.ei.uvigo.es/bew under LGPL license. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. COMPUTER-BASED SYSTEMS OF PHYSICAL EXPERIMENT IN INDEPENDENT WORK OF STUDENTS OF TECHNICAL UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Iryna Slipukhina

    2016-11-01

    Full Text Available Purpose: The self-study activity of students is an important form of the educational process under conditions of rapid technological change. The ability and readiness of future engineers for independent education is one of their key competences. The object of the research is the investigation of modern methods of planning, organizing and controlling the independent cognitive activity of students while studying physics, as an effective means of comprehensively forming their professional qualities. Methods: We analyse the curricula of several engineering specialities at leading technical universities, existing methods and forms of organizing students' self-study, and our own pedagogical experience. Results: Based on the theoretical analysis of existing methods of students' self-study, it was found that the systematizing factor of an appropriate educational technology is the problem-focused cognitive task. Such tasks have to be implemented with modern technological devices integrated with a computer-based experiment. The aim of individual or group laboratory work is defined; the necessary theoretical and practical knowledge and skills of students are rationalized; the timing and form of presentation of the results are clarified after individual and group consulting. The details of the preparatory, searching-organizational, operational and control stages in the organization of students' self-study with the use of a computer-oriented physical experiment are specified; these details differ depending on the didactic purpose, the form of organization and the students' individuality. Discussion: The theoretical aspect of the research confirms the determining role of subject-subject cooperation in forming the competences of independent learning of future engineers. The basic practical achievements of the research consist of improving methods of using digital learning systems, creation of textbooks that promote a consultative and guiding role for the educational process, working-out of

  3. Robotics as an integration subject in the computer science university studies. The experience of the University of Almeria

    Directory of Open Access Journals (Sweden)

    Manuela Berenguel Soria

    2012-11-01

    Full Text Available This work presents a global view of the role of robotics in computer science studies, mainly in university degrees. The main motivations for using robotics in these studies are the following: robotics makes it possible to put many fundamental computer science topics into practice; it is a multidisciplinary area which helps complete the basic knowledge of any computer science student; it facilitates the practice and learning of basic competences of any engineer (for instance, teamwork); and there is a wide market looking for people with robotics knowledge. These ideas are discussed from our own experience at the University of Almeria, acquired through the studies of Computer Science Technical Engineering, Computer Science Engineering, the Computer Science Degree and Computer Science Postgraduate studies.

  4. Computer-assisted comparison of analysis and test results in transportation experiments

    International Nuclear Information System (INIS)

    Knight, R.D.; Ammerman, D.J.; Koski, J.A.

    1998-01-01

    As a part of its ongoing research efforts, Sandia National Laboratories' Transportation Surety Center investigates the integrity of various containment methods for hazardous materials transport, subject to anomalous structural and thermal events such as free-fall impacts, collisions, and fires in both open and confined areas. Since it is not possible to conduct field experiments for every set of possible conditions under which an actual transportation accident might occur, accurate modeling methods must be developed which will yield reliable simulations of the effects of accident events under various scenarios. This requires computer software which is capable of assimilating and processing data from experiments performed as benchmarks, as well as data obtained from numerical models that simulate the experiment. Software tools which can present all of these results in a meaningful and useful way to the analyst are a critical aspect of this process. The purpose of this work is to provide software resources on a long term basis, and to ensure that the data visualization capabilities of the Center keep pace with advancing technology. This will provide leverage for its modeling and analysis abilities in a rapidly evolving hardware/software environment

  5. Computational modeling of z-pinch-driven hohlraum experiments on Z

    International Nuclear Information System (INIS)

    Vesey, R.A.; Porter, J.L. Jr.; Cuneo, M.E.

    1999-01-01

    The high-yield inertial confinement fusion concept based on a double-ended z-pinch driven hohlraum tolerates the degree of spatial inhomogeneity present in z-pinch plasma radiation sources by utilizing a relatively large hohlraum wall surface to provide spatial smoothing of the radiation delivered to the fusion capsule. The z-pinch radiation sources are separated from the capsule by radial spoke arrays. Key physics issues for this concept are the behavior of the spoke array (effect on the z-pinch performance, x-ray transmission) and the uniformity of the radiation flux incident on the surface of the capsule. Experiments are underway on the Z accelerator at Sandia National Laboratories to gain understanding of these issues in a single-sided drive geometry. These experiments seek to measure the radiation coupling among the z-pinch, source hohlraum, and secondary hohlraum, as well as the uniformity of the radiation flux striking a foam witness ball diagnostic positioned in the secondary hohlraum. This paper will present the results of computational modeling of various aspects of these experiments.

  6. Investigation of Coal-biomass Catalytic Gasification using Experiments, Reaction Kinetics and Computational Fluid Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, Francine [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Agblevor, Foster [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Klein, Michael [Univ. of Delaware, Newark, DE (United States); Sheikhi, Reza [Northeastern Univ., Boston, MA (United States)

    2015-12-31

    A collaborative effort involving experiments, kinetic modeling, and computational fluid dynamics (CFD) was used to understand co-gasification of coal-biomass mixtures. The overall goal of the work was to determine the key reactive properties for coal-biomass mixed fuels. Sub-bituminous coal was mixed with biomass feedstocks to determine the fluidization and gasification characteristics of hybrid poplar wood, switchgrass and corn stover. It was found that corn stover and poplar wood were the best feedstocks to use with coal. The novel approach of this project was the use of a red mud catalyst to improve gasification and lower gasification temperatures. An important result was the reduction of agglomeration of the biomass using the catalyst. An outcome of this work was the characterization of the chemical kinetics and reaction mechanisms of the co-gasification fuels, and the development of a set of models that can be integrated into other modeling environments. The multiphase flow code, MFIX, was used to simulate and predict the hydrodynamics and co-gasification, and results were validated with the experiments. The reaction kinetics modeling was used to develop a smaller set of reactions for tractable CFD calculations that represented the experiments. Finally, an efficient tool was developed, MCHARS, and coupled with MFIX to efficiently simulate the complex reaction kinetics.

  7. Investigation of the Feasibility of Utilizing Gamma Emission Computed Tomography in Evaluating Fission Product Migration in Irradiated TRISO Fuel Experiments

    International Nuclear Information System (INIS)

    Harp, Jason M.; Demkowicz, Paul A.

    2014-01-01

    In the High Temperature Gas-Cooled Reactor (HTGR), the TRISO particle fuel serves as the primary fission product containment. However, the large number of TRISO particles present in proposed HTGRs dictates that there will be a small fraction (~10^-4 to 10^-5) of as-manufactured defects and in-pile particle failures that will lead to some fission product release. The matrix material surrounding the TRISO particles in fuel compacts and the structural graphite holding the TRISO particles in place can also serve as sinks for containing any released fission products. However, data on the migration of solid fission products through these materials are lacking. One of the primary goals of the AGR-3/4 experiment is to study fission product migration from intentionally failed TRISO particles in prototypic HTGR components such as structural graphite and compact matrix material. In this work, the potential for a Gamma Emission Computed Tomography (GECT) technique to non-destructively examine the fission product distribution in AGR-3/4 components and other irradiation experiments is explored. Specifically, the feasibility of using the Idaho National Laboratory (INL) Hot Fuels Examination Facility (HFEF) Precision Gamma Scanner (PGS) system for this GECT application was considered. Previous experience utilizing similar techniques, the expected activities in AGR-3/4 rings, and the analysis in this work indicate that using GECT to evaluate AGR-3/4 will be feasible. The GECT technique was also applied to other irradiated nuclear fuel systems currently available in the HFEF hot cell, including oxide fuel pins, metallic fuel pins, and monolithic plate fuel. Results indicate that GECT with the HFEF PGS is effective. (author)

  8. Scalability Dilemma and Statistic Multiplexed Computing — A Theory and Experiment

    Directory of Open Access Journals (Sweden)

    Justin Yuan Shi

    2017-08-01

    Full Text Available For the last three decades, end-to-end computing paradigms, such as MPI (Message Passing Interface), RPC (Remote Procedure Call) and RMI (Remote Method Invocation), have been the de facto paradigms for distributed and parallel programming. Despite their successes, applications built using these paradigms suffer because the probability of a crash grows in proportion to the application's size. Checkpoint/restore and backup/recovery are the only means to save otherwise lost critical information. The scalability dilemma is the practical challenge that the probability of data loss increases as the application scales in size. The theoretical significance of this practical challenge is that it undermines the fundamental structure of the scientific discovery process and of mission-critical services in production today. In 1997, the direct use of the end-to-end reference model in distributed programming was recognized as a fallacy and the scalability dilemma was predicted, but this voice was overrun by the passage of time. Today, rapidly growing digitized data demand solving the increasingly critical scalability challenges. Computing architecture scalability, although loosely defined, is now front and center in large-scale computing efforts. Constrained only by the economic law of diminishing returns, this paper proposes a narrow definition of a Scalable Computing Service (SCS). Three scalability tests are also proposed in order to distinguish service architecture flaws from poor application programming. Scalable data-intensive services require additional treatment; thus, data storage is assumed reliable in this paper. A single-sided Statistic Multiplexed Computing (SMC) paradigm is proposed. A UVR (Unidirectional Virtual Ring) SMC architecture is examined under the SCS tests. SMC was designed to circumvent the well-known impossibility of end-to-end paradigms. It relies on the proven statistic multiplexing principle to deliver reliable service

  9. ATLAS Distributed Computing Experience and Performance During the LHC Run-2

    Science.gov (United States)

    Filipčič, A.; ATLAS Collaboration

    2017-10-01

    ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of the Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic CPU- and data-intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of the new model was demonstrated through the delivery of analysis datasets to users just one week after data taking, by completing the calibration loop, Tier-0 processing and train production steps promptly. The great flexibility of the new system also makes it possible to execute part of the Tier-0 processing on the grid when Tier-0 resources experience a backlog during high data-taking periods. The introduction of the data lifetime model, where each dataset is assigned a finite lifetime (with extensions possible for frequently accessed data), was made possible by Rucio. Thanks to this, the storage crises experienced in Run-1 have not reappeared during Run-2. In addition, the distinction between Tier-1 and Tier-2 disk storage, now largely artificial given the quality of Tier-2 resources and their networking, has been removed through the introduction of dynamic ATLAS clouds that group the storage endpoint nucleus and its close-by execution satellite sites. All stable

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  12. Experience in nuclear materials accountancy, including the use of computers, in the UKAEA

    International Nuclear Information System (INIS)

    Anderson, A.R.; Adamson, A.S.; Good, P.T.; Terrey, D.R.

    1976-01-01

    The UKAEA have operated systems of nuclear materials accountancy in research and development establishments handling large quantities of material for over 20 years. In the course of that time changing requirements for nuclear materials control and increasing quantities of materials have required that accountancy systems be modified and altered to improve either the fundamental system or manpower utilization. The same accountancy principles are applied throughout the Authority but procedures at the different establishments vary according to the nature of their specific requirements; there is much in the cumulative experience of the UKAEA which could prove of value to other organizations concerned with nuclear materials accountancy or safeguards. This paper reviews the present accountancy system in the UKAEA and summarizes its advantages. Details are given of specific experience and solutions which have been found to overcome difficulties or to strengthen previous weak points. Areas discussed include the use of measurements, the establishment of measurement points (which is relevant to the designation of MBAs), the importance of regular physical stock-taking, and the benefits stemming from the existence of a separate accountancy section independent of operational management at large establishments. Some experience of a dual system of accountancy and criticality control is reported, and the present status of computerization of nuclear material accounts is summarized. Important aspects of the relationship between management systems of accountancy and safeguards' requirements are discussed briefly. (author)

  13. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Directory of Open Access Journals (Sweden)

    C Victor Jongeneel

    2017-06-01

    Full Text Available The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  14. Computer experiments of the time-sequence of individual steps in multiple Coulomb-excitation

    International Nuclear Information System (INIS)

    Boer, J. de; Dannhaueser, G.

    1982-01-01

    The way in which the multiple E2 steps in the Coulomb-excitation of a rotational band of a nucleus follow one another is elucidated for selected examples using semiclassical computer experiments. The role a given transition plays for the excitation of a given final state is measured by a quantity named "importance function". It is found that these functions, calculated for the highest rotational state, peak at times forming a sequence for the successive E2 transitions starting from the ground state. This sequential behaviour is used to approximately account for the effects on the projectile orbit of the sequential transfer of excitation energy and angular momentum from projectile to target. These orbits lead to similar deflection functions and cross sections as those obtained from a symmetrization procedure approximately accounting for the transfer of angular momentum and energy. (Auth.)

  15. Preliminary analysis of the MER magnetic properties experiment using a computational fluid dynamics model

    DEFF Research Database (Denmark)

    Kinch, K.M.; Merrison, J.P.; Gunnlaugsson, H.P.

    2006-01-01

    Motivated by questions raised by the magnetic properties experiments on the NASA Mars Pathfinder and Mars Exploration Rover (MER) missions, we have studied in detail the capture of airborne magnetic dust by permanent magnets using a computational fluid dynamics (CFD) model supported by laboratory simulations. The magnets studied are identical to the capture magnet and filter magnet on MER, though results are more generally applicable. The dust capture process is found to be dependent upon wind speed, dust magnetization, dust grain size and dust grain mass density. Here we develop an understanding of how these parameters affect dust capture rates and patterns on the magnets and set bounds for these parameters based on MER data and results from the numerical model. This results in a consistent picture of the dust as containing varying amounts of at least two separate components with different...

  16. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    International Nuclear Information System (INIS)

    Nash, T.

    1989-05-01

    This paper describes the present and expected future development of distributed memory parallel computers for high energy physics experiments. It covers the use of event parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXES. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC systems, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point to point, rather than bussed, communication will be required. Developments in this direction are described. 6 figs

  17. The ground support computer and in-orbit survey data analysis program for the SEEP experiment

    International Nuclear Information System (INIS)

    Voss, H.D.; Datlowe, D.W.; Mobilia, J.; Roselle, S.N.

    1985-01-01

    The ground support computer equipment (GSE) and production survey plot and analysis software are described for the Stimulated Emissions of Energetic Particles (SEEP) experiment on the S81-1 satellite. A general-purpose satellite data acquisition circuit was developed based on a Z-80 portable microcomputer. By simply changing instrument control software and electrical connectors, automatic testing and control of the various SEEP instruments were accomplished. A new feature incorporated into the SEEP data analysis phase was the development of a correlative database for all of the SEEP instruments. A CPU-efficient survey plot program (with ephemeris) was developed to display the approximately 3100 hours of data, with a time resolution of 0.5 sec, from the ten instrument sensors. The details of the general-purpose multigraph algorithms and plot formats are presented. For the first time, new associations among simultaneous particle, X-ray, optical and plasma density satellite measurements are being investigated.

  18. A summary of computational experience at GE Aircraft Engines for complex turbulent flows in gas turbines

    Science.gov (United States)

    Zerkle, Ronald D.; Prakash, Chander

    1995-01-01

    This viewgraph presentation summarizes some CFD experience at GE Aircraft Engines for flows in the primary gaspath of a gas turbine engine and in turbine blade cooling passages. It is concluded that application of the standard k-epsilon turbulence model with wall functions is not adequate for accurate CFD simulation of aerodynamic performance and heat transfer in the primary gas path of a gas turbine engine. New models are required in the near-wall region which include more physics than wall functions. The two-layer modeling approach appears attractive because of its computational complexity. In addition, improved CFD simulation of film cooling and turbine blade internal cooling passages will require anisotropic turbulence models. New turbulence models must be practical in order to have a significant impact on the engine design process. A coordinated turbulence modeling effort between NASA centers would be beneficial to the gas turbine industry.

  19. The COSIMA-experiments, a data base for validation of two-phase flow computer codes

    International Nuclear Information System (INIS)

    Class, G.; Meyder, R.; Stratmanns, E.

    1985-12-01

    The report presents an overview of the large database generated with COSIMA. The database is to be used to validate and develop computer codes for two-phase flow. In terms of fuel rod behavior, it was found that during blowdown under realistic conditions only small strains are reached; for clad rupture, extremely high rod internal pressure is necessary. Additionally, important results were found on the behavior of a fuel rod simulator and on the effect of thermocouples attached to the cladding outer surface. Post-test calculations performed with the codes RELAP and DRUFAN show good agreement with the experiments. This, however, could be improved if the phase separation models in the codes were updated. (orig./HP) [de

  20. Assessing computational genomics skills: Our experience in the H3ABioNet African bioinformatics network.

    Science.gov (United States)

    Jongeneel, C Victor; Achinike-Oduaran, Ovokeraye; Adebiyi, Ezekiel; Adebiyi, Marion; Adeyemi, Seun; Akanle, Bola; Aron, Shaun; Ashano, Efejiro; Bendou, Hocine; Botha, Gerrit; Chimusa, Emile; Choudhury, Ananyo; Donthu, Ravikiran; Drnevich, Jenny; Falola, Oluwadamila; Fields, Christopher J; Hazelhurst, Scott; Hendry, Liesl; Isewon, Itunuoluwa; Khetani, Radhika S; Kumuthini, Judit; Kimuda, Magambo Phillip; Magosi, Lerato; Mainzer, Liudmila Sergeevna; Maslamoney, Suresh; Mbiyavanga, Mamana; Meintjes, Ayton; Mugutso, Danny; Mpangase, Phelelani; Munthali, Richard; Nembaware, Victoria; Ndhlovu, Andrew; Odia, Trust; Okafor, Adaobi; Oladipo, Olaleye; Panji, Sumir; Pillay, Venesa; Rendon, Gloria; Sengupta, Dhriti; Mulder, Nicola

    2017-06-01

    The H3ABioNet pan-African bioinformatics network, which is funded to support the Human Heredity and Health in Africa (H3Africa) program, has developed node-assessment exercises to gauge the ability of its participating research and service groups to analyze typical genome-wide datasets being generated by H3Africa research groups. We describe a framework for the assessment of computational genomics analysis skills, which includes standard operating procedures, training and test datasets, and a process for administering the exercise. We present the experiences of 3 research groups that have taken the exercise and the impact on their ability to manage complex projects. Finally, we discuss the reasons why many H3ABioNet nodes have declined so far to participate and potential strategies to encourage them to do so.

  1. Computational experiences with variable modulus, elastic-plastic, and viscoelastic concrete models

    International Nuclear Information System (INIS)

    Anderson, C.A.

    1981-01-01

    Six years ago the Reactor Safety Research Division of the Nuclear Regulatory Commission (NRC) approached the Los Alamos National Laboratory to develop a comprehensive concrete structural analysis code to predict the static and dynamic behavior of Prestressed Concrete Reactor Vessels (PCRVs) that serve as the containment structure of a High-Temperature Gas-Cooled Reactor. The PCRV is a complex concrete structure that must be modeled in three dimensions and possesses other complicating features such as a steel liner for the reactor cavity and woven cables embedded vertically in the PCRV and wound circumferentially on the outside of the PCRV. The cables, or tendons, are used for prestressing the reactor vessel. In addition to developing the computational capability to predict inelastic three-dimensional concrete structural behavior, the code response was verified against documented experiments on concrete structural behavior. This code development/verification effort is described.

  2. Experiments and computations on coaxial swirling jets with centerbody in an axisymmetric combustor

    International Nuclear Information System (INIS)

    Chao, Y.C.; Ho, W.C.; Lin, S.K.

    1987-01-01

    Experiments and computations of turbulent, confined, coannular swirling flows have been performed in a model combustor. Numerical results are obtained by means of a revised two-equation model of turbulence. The combustor consists of two confined, concentric, swirling jets and a centerbody at the center of the inlet. Results are reported for cold flow conditions under co- and counter-swirl. The numerical results agree with the experimental data under both conditions. The size of the central recirculation zone is dominated by the strength of the outer swirl. A two-cell recirculation zone may be formed due to the presence of the swirler hub. The mechanism of interaction between the separation bubble at the hub of the swirler and the central recirculation zone due to vortex breakdown is also investigated. 18 references
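    For context, a standard high-Reynolds-number k–ε model (the generic form, not the specific revised two-equation model used by the authors) transports the turbulent kinetic energy k and its dissipation rate ε with an eddy-viscosity closure:

    \[
    \mu_t = \rho\,C_\mu \frac{k^2}{\varepsilon},\qquad
    \frac{\partial(\rho k)}{\partial t} + \frac{\partial(\rho k u_j)}{\partial x_j}
    = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right] + P_k - \rho\varepsilon,
    \]
    \[
    \frac{\partial(\rho\varepsilon)}{\partial t} + \frac{\partial(\rho\varepsilon u_j)}{\partial x_j}
    = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right]
    + C_{1\varepsilon}\frac{\varepsilon}{k}P_k - C_{2\varepsilon}\,\rho\frac{\varepsilon^2}{k},
    \]

    with the usual constants C_mu = 0.09, C_1e = 1.44, C_2e = 1.92, sigma_k = 1.0, sigma_e = 1.3, and P_k the production of turbulent kinetic energy. Revised variants of the kind mentioned in the abstract typically adjust these constants or add correction terms for streamline curvature and swirl.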

  3. Inequality measures perform differently in global and local assessments: An exploratory computational experiment

    Science.gov (United States)

    Chiang, Yen-Sheng

    2015-11-01

    Inequality measures are widely used in both academia and the public media to help us understand how incomes and wealth are distributed. They can be used to assess the distribution of a whole society (global inequality) as well as the inequality of actors' referent networks (local inequality). How different is local inequality from global inequality? Formalizing the structure of reference groups as a network, the paper conducted a computational experiment to see how the structure of complex networks influences the difference between global and local inequality as assessed by a selection of inequality measures. It was found that local inequality tends to be higher than global inequality when the population is large, the network is dense and heterophilously assorted, and the income distribution is less dispersed. The implications of the simulation findings are discussed.
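    As a hedged illustration of the global-versus-local distinction (a toy sketch, not the paper's model; the income distribution, the Erdős–Rényi graph and all parameters are assumptions), the snippet below computes a global Gini coefficient for a synthetic population and the mean Gini over each actor's ego network:

    ```python
    import numpy as np
    import networkx as nx

    def gini(x):
        """Gini coefficient of a 1-D array of non-negative incomes."""
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        cum = np.cumsum(x)
        return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

    rng = np.random.default_rng(0)
    n = 500
    incomes = rng.lognormal(mean=0.0, sigma=0.8, size=n)   # assumed income distribution
    g = nx.erdos_renyi_graph(n, p=0.05, seed=0)            # assumed reference-group network

    global_gini = gini(incomes)
    # Local inequality: Gini of each actor's ego network (the actor plus its neighbours)
    local_ginis = [gini(incomes[[i] + list(g.neighbors(i))])
                   for i in g.nodes if g.degree(i) > 0]

    print(f"global Gini:     {global_gini:.3f}")
    print(f"mean local Gini: {np.mean(local_ginis):.3f}")
    ```

    Varying the network density, its assortativity, or the dispersion of the income distribution reproduces qualitatively the kind of global/local divergence the paper investigates.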

  4. Experiments Using Cell Phones in Physics Classroom Education: The Computer-Aided g Determination

    Science.gov (United States)

    Vogt, Patrik; Kuhn, Jochen; Müller, Sebastian

    2011-09-01

    This paper continues the collection of experiments that describe the use of cell phones as experimental tools in physics classroom education.1-4 We describe a computer-aided determination of the free-fall acceleration g using the acoustical Doppler effect. The Doppler shift is a function of the speed of the source. Since a free-falling object's speed changes linearly with time, the Doppler shift also changes with time. It is possible to measure this shift using software that is both easy to use and readily available. Students use the time dependence of the Doppler shift to experimentally determine the acceleration due to gravity by using a cell phone as a freely falling object emitting a sound of constant frequency.
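    A minimal sketch of the idea (all numbers assumed, not the authors' data): a phone falling away from a stationary microphone while emitting a constant tone f0 produces a receding-source Doppler shift f(t) = f0·c/(c + g·t); inverting this relation and fitting a straight line to c·(f0/f(t) − 1) versus t gives g as the slope.

    ```python
    import numpy as np

    c = 343.0     # speed of sound in air (m/s), assumed constant
    f0 = 4000.0   # assumed emitted tone of the falling phone (Hz)
    g_true = 9.81

    # Synthetic "measurement" of the received frequency during the fall
    t = np.linspace(0.05, 0.60, 40)                             # s
    f_obs = f0 * c / (c + g_true * t)                           # receding-source Doppler shift
    f_obs += np.random.default_rng(1).normal(0.0, 0.5, t.size)  # measurement noise

    # Invert the Doppler relation: c * (f0 / f_obs - 1) = g * t
    y = c * (f0 / f_obs - 1.0)
    g_est = np.polyfit(t, y, 1)[0]   # slope of the fitted line is g
    print(f"estimated g = {g_est:.2f} m/s^2")
    ```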

  5. Application of local computer networks in nuclear-physical experiments and technology

    International Nuclear Information System (INIS)

    Foteev, V.A.

    1986-01-01

    The bases of construction, comparative performance and potential of local computer networks with respect to their application in physical experiments are considered. The principle of operation of local networks is illustrated with the Ethernet network, and the results of an analysis of their operating performance are given. Examples of operating local networks in the area of nuclear physics research and nuclear technology are presented, including the networks of the Japan Atomic Energy Research Institute, California University and the Los Alamos National Laboratory; network implementations based on the DECnet and Fast-bus programs; and domestic network configurations of the USSR Academy of Sciences and the JINR Neutron Physics Laboratory, among others. It is shown that local networks allow productivity in the sphere of data processing to be raised significantly.

  6. Structure and dynamics of gas phase ions: Interplay between experiments and computations in IRMPD spectroscopy

    Science.gov (United States)

    Coletti, Cecilia; Corinti, Davide; Paciotti, Roberto; Re, Nazzareno; Crestoni, Maria Elisa; Fornarini, Simonetta

    2017-11-01

    The investigation of the molecular structure and dynamics of ions in the gas phase is a topic of increasing interest, due to the role such species play in many areas of chemistry and physics, not to mention that they often represent elusive intermediates in more complex reaction mechanisms. Infrared Multiple Photon Dissociation spectroscopy is today one of the most advanced techniques for this purpose, because of its high sensitivity to even small structural changes. The interpretation of IRMPD spectra relies strongly on high-level quantum mechanical computations, so that a close interplay between the two is needed for a detailed understanding of the structural and kinetic properties that can be gathered from the many applications of this powerful technique. Recent advances in experiment and theory in this field are illustrated here, with emphasis on recent progress in the elucidation of the mechanism of action of cisplatin, one of the most widely used anticancer drugs.

  7. Experiment and computation: a combined approach to study the van der Waals complexes

    Directory of Open Access Journals (Sweden)

    Surin L.A.

    2017-01-01

    Full Text Available A review of recent results on the millimetre-wave spectroscopy of weakly bound van der Waals complexes, mostly those which contain H2 and He, is presented. In our work, we compared the experimental spectra to theoretical bound-state results, thus providing a critical test of the quality of the M–H2 and M–He potential energy surfaces (PESs), which are a key issue for reliable computations of the collisional excitation and de-excitation of molecules (M = CO, NH3, H2O) in the dense interstellar medium. The intermolecular interactions with He and H2 also play an important role in the high-resolution spectroscopy of helium or para-hydrogen clusters doped by a probe molecule (CO, HCN). Such experiments are directed at detecting the superfluid response of molecular rotation in He and p-H2 clusters.

  8. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    International Nuclear Information System (INIS)

    Nash, T.

    1989-01-01

    This paper describes the present and expected future development of distributed memory parallel computers for high energy physics experiments. It covers the use of event parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXES. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC systems, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point to point, rather than bussed, communication will be required. Developments in this direction are described. (orig.)

  9. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    Science.gov (United States)

    Nash, Thomas

    1989-12-01

    This paper describes the present and expected future development of distributed memory parallel computers for high energy physics experiments. It covers the use of event parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXES. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC systems, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point to point, rather than bussed, communication will be required. Developments in this direction are described.

  10. Comparing Experiment and Computation of Hypersonic Laminar Boundary Layers with Isolated Roughness

    Science.gov (United States)

    Bathel, Brett F.; Iyer, Prahladh S.; Mahesh, Krishnan; Danehy, Paul M.; Inman, Jennifer A.; Jones, Stephen B.; Johansen, Craig T.

    2014-01-01

    Streamwise velocity profile behavior in a hypersonic laminar boundary layer in the presence of an isolated roughness element is presented for an edge Mach number of 8.2. Two different roughness element types are considered: a 2-mm tall, 4-mm diameter cylinder, and a 2-mm radius hemisphere. Measurements of the streamwise velocity behavior using nitric oxide (NO) planar laser-induced fluorescence (PLIF) molecular tagging velocimetry (MTV) have been performed on a 20-degree wedge model. The top surface of this model acts as a flat-plate and is oriented at 5 degrees with respect to the freestream flow. Computations using direct numerical simulation (DNS) of these flows have been performed and are compared to the measured velocity profiles. Particular attention is given to the characteristics of velocity profiles immediately upstream and downstream of the roughness elements. In these regions, the streamwise flow can experience strong deceleration or acceleration. An analysis in which experimentally measured MTV profile displacements are compared with DNS particle displacements is performed to determine if the assumption of constant velocity over the duration of the MTV measurement is valid. This assumption is typically made when reporting MTV-measured velocity profiles, and may result in significant errors when comparing MTV measurements to computations in regions with strong deceleration or acceleration. The DNS computations with the cylindrical roughness element presented in this paper were performed with and without air injection from a rectangular slot upstream of the cylinder. This was done to determine the extent to which gas seeding in the MTV measurements perturbs the boundary layer flowfield.
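    As a first-order illustration of why the constant-velocity assumption matters (a sketch, not the authors' analysis): if the streamwise velocity changes at a rate a during the MTV interrogation interval Δt, the displacement-based velocity differs from the instantaneous value u0 by roughly aΔt/2,

    \[
    \Delta x = \int_{0}^{\Delta t} u(t)\,\mathrm{d}t \approx u_0\,\Delta t + \tfrac{1}{2}a\,\Delta t^{2}
    \quad\Longrightarrow\quad
    \bar{u} = \frac{\Delta x}{\Delta t} = u_0 + \tfrac{1}{2}a\,\Delta t,
    \]

    so the bias is negligible where the flow is nearly uniform but can become significant immediately upstream and downstream of the roughness element, where deceleration and acceleration are strong.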

  11. Quantum Information, computation and cryptography. An introductory survey of theory, technology and experiments

    International Nuclear Information System (INIS)

    Benatti, Fabio; Fannes, Mark; Floreanini, Roberto; Petritis, Dimitri

    2010-01-01

    This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation. (orig.)

  12. Successful experiences in the application of Concept Maps in Engineering in Computing, Mexico

    Directory of Open Access Journals (Sweden)

    Beatriz Guardian Soto

    2013-02-01

    Full Text Available Today there is an enormous amount of work related to new models and styles of learning and instruction in the field of engineering. In the case of the engineering degree in computing taught at the Mexico National Polytechnic Institute (IPN), there is a working group led by an expert of international standing, whose work and results are reflected in this text through the experiences gained over the last 8 years with students and teachers, thus generating the requirements and tools for the globalised world and the knowledge society in which we find ourselves. Lessons learned come from subjects such as theory of automata (TA), compilers (Cs), analysis of algorithms (AA), networks (R), artificial intelligence (AI), computer programming (P), degree project (PT) and strategic planning (PE), mainly, among others, to facilitate the understanding of concepts and their application by the student. We believe that, through the teaching strategy using the concept maps developed by J. Novak, results have been favourable in dynamism, understanding and the generation of meaningful long-term learning, providing solid elements for professional practice. Proposals obtained from teachers and exercises developed by teachers and students are listed.

  13. Sequential designs for sensitivity analysis of functional inputs in computer experiments

    International Nuclear Information System (INIS)

    Fruth, J.; Roustant, O.; Kuhnt, S.

    2015-01-01

    Computer experiments are nowadays commonly used to analyze industrial processes aiming at achieving a desired outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time. - Highlights: • Sensitivity analysis method for functional and scalar inputs is presented. • We focus on the discovery of the most influential parts of the functional domain. • We investigate an economical sequential methodology based on piecewise constant functions. • Normalized sensitivity indices are introduced and investigated theoretically. • Successful application to sheet metal forming on two functional inputs
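    To make the piecewise-constant idea concrete, here is a hedged sketch (a plain one-at-a-time screening on a toy integrand, not the authors' sequential bifurcation procedure or their metal forming code): a functional input on [0, 1] is represented by one level per interval, each interval is perturbed in turn, and the resulting output changes indicate which parts of the domain influence the scalar output most.

    ```python
    import numpy as np

    # Toy "computer code": scalar output of a functional input x(t) on [0, 1],
    # discretized on a fine grid; it stands in for an expensive simulator.
    t = np.linspace(0.0, 1.0, 200)

    def simulator(x_values):
        weight = np.exp(-5.0 * t)          # output mostly driven by early times
        return np.trapz(weight * x_values, t)

    def piecewise_constant(levels, t):
        """Expand one level per interval into a function on the fine grid."""
        edges = np.linspace(0.0, 1.0, len(levels) + 1)
        idx = np.clip(np.searchsorted(edges, t, side="right") - 1, 0, len(levels) - 1)
        return np.asarray(levels)[idx]

    n_intervals = 8
    base = np.zeros(n_intervals)
    for i in range(n_intervals):
        hi = base.copy()
        hi[i] = 1.0                        # perturb one interval at a time
        effect = simulator(piecewise_constant(hi, t)) - simulator(piecewise_constant(base, t))
        print(f"interval {i}: effect = {effect:.4f}")
    ```

    A sequential refinement would then split only the most influential intervals, which is the economy the abstract refers to.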

  14. Reduction of community alcohol problems: computer simulation experiments in three counties.

    Science.gov (United States)

    Holder, H D; Blose, J O

    1987-03-01

    A series of alcohol abuse prevention strategies was evaluated using computer simulation for three counties in the United States: Wake County, North Carolina; Washington County, Vermont; and Alameda County, California. A system dynamics model composed of a network of interacting variables was developed for the pattern of alcoholic beverage consumption in a community. The relationship of community drinking patterns to various stimulus factors was specified in the model based on available empirical research. Stimulus factors included disposable income, alcoholic beverage prices, advertising exposure, minimum drinking age and changes in cultural norms. After a generic model was developed and validated on the national level, a computer-based system dynamics model was developed for each county, and a series of experiments was conducted to project the potential impact of specific prevention strategies. The project concluded that prevention efforts can both lower current levels of alcohol abuse and reduce projected increases in alcohol-related problems. Without such efforts, already high levels of alcohol-related family disruptions in the three counties could be expected to rise an additional 6% and drinking-related work problems 1-5%, over the next 10 years after controlling for population growth. Of the strategies tested, indexing the price of alcoholic beverages to the consumer price index in conjunction with the implementation of a community educational program with well-defined target audiences has the best potential for significant problem reduction in all three counties.
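    The flavour of such a simulation can be conveyed with a deliberately minimal stock-and-flow sketch (the structure, elasticities and rates are all assumptions for illustration, not the authors' county models): per-capita consumption adjusts toward a demand level driven by real price and income, and indexing the price to the consumer price index prevents the erosion of real price that would otherwise push consumption up.

    ```python
    import numpy as np

    def simulate(years=10, index_price=False, inflation=0.03, income_growth=0.02,
                 e_price=-0.5, e_income=0.4, adjust=0.5, c0=1.0):
        """Toy system-dynamics loop: consumption c adjusts toward a demand level."""
        c, real_price, income = c0, 1.0, 1.0
        for _ in range(years):
            income *= 1.0 + income_growth
            if not index_price:
                real_price /= 1.0 + inflation   # nominal price fixed, real price erodes
            demand = c0 * real_price ** e_price * income ** e_income
            c += adjust * (demand - c)          # gradual adjustment of drinking habits
        return c

    for label, flag in [("no price indexing", False), ("CPI-indexed price", True)]:
        change = 100.0 * (simulate(index_price=flag) - 1.0)
        print(f"10-year consumption change, {label}: {change:+.1f}%")
    ```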

  15. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    Science.gov (United States)

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and used it effectively for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility of assessing brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  16. The Effects of Video Game Experience on Computer-Based Air Traffic Controller Specialist, Air Traffic Scenario Test Scores.

    Science.gov (United States)

    1997-02-01

    application with a strong resemblance to a video game, concern has been raised that prior video game experience might have a moderating effect on scores. Much...such as spatial ability. The effects of computer or video game experience on work sample scores have not been systematically investigated. The purpose...of this study was to evaluate the incremental validity of prior video game experience over that of general aptitude as a predictor of work sample test

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  18. Ion bombardment induced smoothing of amorphous metallic surfaces: Experiments versus computer simulations

    International Nuclear Information System (INIS)

    Vauth, Sebastian; Mayr, S. G.

    2008-01-01

    Smoothing of rough amorphous metallic surfaces by bombardment with heavy ions in the low-keV regime is investigated by a combined experimental-simulational study. Vapor-deposited rough amorphous Zr65Al7.5Cu27.5 films are the basis for systematic in situ scanning tunneling microscopy measurements on the smoothing reaction due to 3 keV Kr+ ion bombardment. The experimental results are directly compared to the predictions of a multiscale simulation approach, which incorporates stochastic rate equations of the Langevin type in combination with previously reported classical molecular dynamics simulations [Phys. Rev. B 75, 224107 (2007)] to model surface smoothing across length and time scales. The combined approach of experiments and simulations clearly corroborates a key role of ion-induced viscous flow and ballistic effects in low-keV heavy-ion-induced smoothing of amorphous metallic surfaces at ambient temperatures

  19. When STAR meets the Clouds-Virtualization and Cloud Computing Experiences

    International Nuclear Information System (INIS)

    Lauret, J; Hajdu, L; Walker, M; Balewski, J; Goasguen, S; Stout, L; Fenn, M; Keahey, K

    2011-01-01

    In recent years, Cloud computing has become a very attractive paradigm and popular model for accessing distributed resources. The Cloud has emerged as the next big trend. The burst of platforms and projects providing Cloud resources and interfaces at the very same time that Grid projects are entering a production phase in their life cycle has however raised the question of the best approach to handling distributed resources. Especially, are Cloud resources scaling at the levels shown by Grids? Are they performing at the same level? What is their overhead on the IT teams and infrastructure? Rather than seeing the two as orthogonal, the STAR experiment has viewed them as complementary and has studied merging the best of the two worlds, with Grid middleware providing the aggregation of both Cloud and traditional resources. Since its first use of Cloud resources on Amazon EC2 in 2008/2009 using a Nimbus/EC2 interface, the STAR software team has tested and experimented with many novel approaches: from a traditional, native EC2 approach to the Virtual Organization Cluster (VOC) at Clemson University and Condor/VM on the GLOW resources at the University of Wisconsin. The STAR team is also planning to run as part of the DOE/Magellan project. In this paper, we will present an overview of our findings from using truly opportunistic resources and scaling out two orders of magnitude in both tests and practical usage.

  20. A FPGA-based Network Interface Card with GPUDirect enabling realtime GPU computing in HEP experiments

    CERN Document Server

    Lonardo, Alessandro; Ammendola, Roberto; Biagioni, Andrea; Cotta Ramusino, Angelo; Fiorini, Massimiliano; Frezza, Ottorino; Lamanna, Gianluca; Lo Cicero, Francesca; Martinelli, Michele; Neri, Ilaria; Paolucci, Pier Stanislao; Pastorelli, Elena; Pontisso, Luca; Rossetti, Davide; Simeone, Francesco; Simula, Francesco; Sozzi, Marco; Tosoratto, Laura; Vicini, Piero

    2015-01-01

    The capability of processing high bandwidth data streams in real time is a computational requirement common to many High Energy Physics experiments. Keeping the latency of the data transport tasks under control is essential in order to meet this requirement. We present NaNet, an FPGA-based PCIe Network Interface Card design featuring Remote Direct Memory Access towards CPU and GPU memories, plus a transport protocol offload module characterized by cycle-accurate upper-bound handling. The combination of these two features relieves the OS and the application almost entirely of data transfer management, minimizing the unavoidable jitter effects associated with OS process scheduling. The design currently supports one GbE (1000Base-T) and three custom 34 Gbps APElink I/O channels, but four-channel 10GbE (10Base-R) and 2.5 Gbps deterministic-latency KM3link versions are being implemented. Two use cases of NaNet will be discussed: the GPU-based low-level trigger for the RICH detector in the NA62 experiment an...

  1. Single-polymer dynamics under constraints: scaling theory and computer experiment

    International Nuclear Information System (INIS)

    Milchev, Andrey

    2011-01-01

    The relaxation, diffusion and translocation dynamics of single linear polymer chains in confinement are briefly reviewed, with emphasis on the comparison between theoretical scaling predictions and observations from experiment or, most frequently, from computer simulations. Besides cylindrical, spherical and slit-like constraints, related problems such as the chain dynamics in a random medium and the translocation dynamics through a nanopore are also considered. Another particular kind of confinement is imposed by polymer adsorption on attractive surfaces or selective interfaces; a short overview of single-chain dynamics in this case is also contained in this survey. While both theory and numerical experiments consider predominantly coarse-grained models of self-avoiding linear chain molecules with typically Rouse dynamics, we also note some recent studies which examine the impact of hydrodynamic interactions on polymer dynamics in confinement. In all of the aforementioned cases we focus mainly on the consequences of imposed geometric restrictions on single-chain dynamics and try to check our degree of understanding by assessing the agreement between theoretical predictions and observations. (topical review)

  2. Computational Experiment Approach to Controlled Evolution of Procurement Pattern in Cluster Supply Chain

    Directory of Open Access Journals (Sweden)

    Xiao Xue

    2015-01-01

    Full Text Available Companies have been aware of the benefits of developing Cluster Supply Chains (CSCs), and they are spending a great deal of time and money attempting to develop the new business pattern. Yet the traditional techniques for identifying CSCs have strong theoretical antecedents but seem to have little traction in the field. We believe this is because the standard techniques fail to capture evolution over time and do not provide useful intervention measures to reach goals. To address these problems, we introduce an agent-based modeling approach to evaluate CSCs. Taking collaborative procurement as the research object, our approach is composed of three parts: model construction, model instantiation, and computational experiment. We use the approach to explore the service charging policy problem in collaborative procurement. Three kinds of service charging policies are compared in the same experiment environment. Finally, "Fixed Cost" is identified as the optimal policy under a stable market environment. The case study can help us to understand the workflow of applying the approach and provide valuable decision support applications to industry.

  3. Computer simulation of void formation in residual gas atom free metals by dual beam irradiation experiments

    International Nuclear Information System (INIS)

    Shimomura, Y.; Nishiguchi, R.; La Rubia, T.D. de; Guinan, M.W.

    1992-01-01

    In our recent experiments (1), we found that voids nucleate at vacancy clusters which trap gas atoms such as hydrogen and helium in ion- and neutron-irradiated copper. A molecular dynamics computer simulation, which implements an empirical embedded atom method to calculate the forces that act on atoms in metals, suggests that void nucleation occurs in pure copper at six- and seven-vacancy clusters. The structure of six- and seven-vacancy clusters in copper fluctuates between a stacking fault tetrahedron and a void. When a hydrogen atom is trapped at a six- or seven-vacancy void, the void can keep its structure for an appreciably long time; that is, the void does not relax to a stacking fault tetrahedron and grows into a large void. In order to explore the detailed atomistics of void formation, it is emphasized that dual-beam irradiation experiments that utilize beams of gas atoms and self-ions should be carried out with residual-gas-atom-free metal specimens. (author)

  4. Infragravity wave generation and dynamics over a mild slope beach : Experiments and numerical computations

    Science.gov (United States)

    Cienfuegos, R.; Duarte, L.; Hernandez, E.

    2008-12-01

    Characteristic frequencies of gravity waves generated by wind and propagating towards the coast usually lie between 0.05 Hz and 1 Hz. Nevertheless, lower-frequency waves, in the range of 0.001 Hz to 0.05 Hz, have been observed in the nearshore zone. These long waves, termed infragravity waves, are generated by complex nonlinear mechanisms affecting the propagation of irregular waves up to the coast. The groupiness of an incident random wave field may be responsible for producing a slow modulation of the mean water surface, thus generating bound long waves travelling at the group speed. Similarly, a quasi-periodic oscillation of the break-point location will be accompanied by a slow modulation of set-up/set-down in the surf zone and the generation and release of long waves. If the primary structure of the carrying incident gravity waves is destroyed (e.g. by breaking), forced long waves can be freely released and even reflected at the coast. Infragravity waves can affect port operation through resonance, or strongly affect sediment transport and beach morphodynamics. In the present study we investigate infragravity wave generation mechanisms from both experiments and numerical computations. Measurements were conducted in the 70-meter-long wave tank located at the Instituto Nacional de Hidraulica (Chile), prepared with a beach of very mild slope (1/80) in order to produce large surf zone extensions. A random JONSWAP-type wave field (h0=0.52 m, fp=0.25 Hz, Hm0=0.17 m) was generated by a piston wave-maker, and measurements of the free surface displacements were performed all along its length at high spatial resolution (0.2 m to 1 m). Velocity profiles were also measured at four verticals inside the surf zone using an ADV. Correlation maps of wave group envelopes and infragravity waves are computed in order to identify long wave generation and dynamics in the experimental set-up. It appears that both mechanisms (groupiness and break-point oscillation) are

  5. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Considering those errors explicitly, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out, where radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from radar data. Comparing the quality of areal rainfall estimation by RCs with rain gauges and reference data helps to investigate the benefit of the RCs. The value of this additional source of data is assessed not only for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from laboratory experiments, the result shows that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, by testing larger uncertainties for RCs, they are observed to be useful up to a certain level for areal rainfall estimation and discharge simulation.
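    A compact computer-experiment sketch in the same spirit (a toy field and toy sensors, not the study's radar data or error model) compares a few accurate gauges with many noisy car sensors when estimating the areal mean by inverse-distance weighting:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic "reference" rainfall field on a 50 km x 50 km grid (stands in for radar)
    x, y = np.meshgrid(np.linspace(0, 50, 101), np.linspace(0, 50, 101))
    field = 5.0 + 3.0 * np.sin(x / 8.0) * np.cos(y / 11.0)   # mm/h, smooth toy field
    true_areal = field.mean()

    def areal_estimate(n_sensors, noise_std):
        """Sample the field at random locations, add sensor noise, and estimate
        the areal mean by inverse-distance weighting on the grid."""
        sx, sy = rng.uniform(0, 50, n_sensors), rng.uniform(0, 50, n_sensors)
        obs = 5.0 + 3.0 * np.sin(sx / 8.0) * np.cos(sy / 11.0)
        obs += rng.normal(0.0, noise_std, n_sensors)
        d = np.hypot(x[..., None] - sx, y[..., None] - sy) + 1e-6
        w = 1.0 / d**2
        return ((w * obs).sum(axis=-1) / w.sum(axis=-1)).mean()

    gauge_err = abs(areal_estimate(n_sensors=5, noise_std=0.1) - true_areal)    # few, accurate
    car_err = abs(areal_estimate(n_sensors=200, noise_std=1.0) - true_areal)    # many, noisy
    print(f"areal-mean error, 5 gauges: {gauge_err:.2f} mm/h")
    print(f"areal-mean error, 200 RCs:  {car_err:.2f} mm/h")
    ```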

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  7. Computational design of auxotrophy-dependent microbial biosensors for combinatorial metabolic engineering experiments.

    Science.gov (United States)

    Tepper, Naama; Shlomi, Tomer

    2011-01-21

    Combinatorial approaches in metabolic engineering work by generating genetic diversity in a microbial population followed by screening for strains with improved phenotypes. One of the most common goals in this field is the generation of a high-rate chemical-producing strain. A major hurdle with this approach is that many chemicals do not have easy-to-recognize attributes, making their screening expensive and time consuming. To address this problem, it was previously suggested to use microbial biosensors to facilitate the detection and quantification of chemicals of interest. Here, we present novel computational methods to: (i) rationally design microbial biosensors for chemicals of interest based on substrate auxotrophy that would enable their high-throughput screening; (ii) predict engineering strategies for coupling the synthesis of a chemical of interest with the production of a proxy metabolite for which high-throughput screening is possible via a designed biosensor. The biosensor design method is validated based on known genetic modifications in an array of E. coli strains auxotrophic to various amino-acids. Predicted chemical production rates achievable via the biosensor-based approach are shown to potentially improve upon those predicted by current rational strain design approaches. (A Matlab implementation of the biosensor design method is available via http://www.cs.technion.ac.il/~tomersh/tools).

  8. Initial experience with computed tomographic colonography applied for noncolorectal cancerous conditions

    International Nuclear Information System (INIS)

    Ichikawa, Tamaki; Kawada, Shuichi; Hirata, Satoru; Ikeda, Shu; Sato, Yuuki; Imai, Yutaka

    2011-01-01

    The aim of this study was to assess retrospectively the performance of computed tomography colonography (CTC) for noncolorectal cancerous conditions. A total of 44 patients with noncolorectal cancerous conditions underwent CTC. We researched the indications for CTC or present illness and evaluated the CTC imaging findings. We assessed whether diagnosis by CTC reduced conventional colonoscopic examinations. A total of 47 examinations were performed in 44 patients. The indications for CTC or a present illness were as follows: 15 patients with impossible or incomplete colonoscopy, 7 with diverticular disease, 6 with malignancy (noncolorectal cancer), 6 with Crohn's disease, 4 suspected to have a submucosal tumor on colonoscopy, 2 with ischemic colitis, and 4 with various other diseases. Colonic findings were diagnosed on CTC in 36 examinations, and extracolonic findings were identified in 35 of 44 patients. In all, 17 patients had undergone colonoscopy previously, 9 (52.9%) of whom did not require further colonoscopy after CTC. Five patients underwent colonoscopy after CTC. The indications for CTC were varied for patients with noncolorectal cancerous conditions. CTC examinations could be performed safely. Unlike colonoscopy or CT without preparation, CTC revealed colonic and extracolonic findings and may reduce the need for colonoscopy in patients with noncolorectal cancerous conditions. (author)

  9. Using the genome aggregation database, computational pathogenicity prediction tools, and patch clamp heterologous expression studies to demote previously published long QT syndrome type 1 mutations from pathogenic to benign.

    Science.gov (United States)

    Clemens, Daniel J; Lentino, Anne R; Kapplinger, Jamie D; Ye, Dan; Zhou, Wei; Tester, David J; Ackerman, Michael J

    2018-04-01

    Mutations in the KCNQ1-encoded Kv7.1 potassium channel cause long QT syndrome (LQTS) type 1 (LQT1). It has been suggested that ∼10%-20% of rare LQTS case-derived variants in the literature may have been published erroneously as LQT1-causative mutations and may be "false positives." The purpose of this study was to determine which previously published KCNQ1 case variants are likely false positives. A list of all published, case-derived KCNQ1 missense variants (MVs) was compiled. The occurrence of each MV within the Genome Aggregation Database (gnomAD) was assessed. Eight in silico tools were used to predict each variant's pathogenicity. Case-derived variants that were either (1) too frequently found in gnomAD or (2) absent in gnomAD but predicted to be pathogenic by ≤2 tools were considered potential false positives. Three of these variants were characterized functionally using whole-cell patch clamp technique. Overall, there were 244 KCNQ1 case-derived MVs. Of these, 29 (12%) were seen in ≥10 individuals in gnomAD and are demotable. However, 157 of 244 MVs (64%) were absent in gnomAD. Of these, 7 (4%) were predicted to be pathogenic by ≤2 tools, 3 of which we characterized functionally. There was no significant difference in current density between heterozygous KCNQ1-F127L, -P477L, or -L619M variant-containing channels compared to KCNQ1-WT. This study offers preliminary evidence for the demotion of 32 (13%) previously published LQT1 MVs. Of these, 29 were demoted because of their frequent sighting in gnomAD. Additionally, in silico analysis and in vitro functional studies have facilitated the demotion of 3 ultra-rare MVs (F127L, P477L, L619M). Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
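
    The demotion rule described above reduces to a simple filter. The sketch below uses the thresholds stated in the abstract (seen in at least 10 gnomAD individuals, or absent from gnomAD with at most 2 of the 8 in silico tools calling the variant pathogenic); apart from F127L's qualitative profile, the variant entries and counts are hypothetical placeholders:

```python
# Hedged sketch of the demotion filter; thresholds follow the abstract, but the
# variant records (other than F127L's qualitative profile) are invented.
from dataclasses import dataclass

@dataclass
class Variant:
    name: str
    gnomad_individuals: int   # carriers observed in gnomAD
    pathogenic_calls: int     # of the 8 in silico tools, how many call it pathogenic

def is_demotable(v: Variant) -> bool:
    if v.gnomad_individuals >= 10:
        return True           # too common in the general population
    if v.gnomad_individuals == 0 and v.pathogenic_calls <= 2:
        return True           # ultra-rare but poorly supported in silico
    return False

variants = [
    Variant("F127L", 0, 1),        # absent from gnomAD, weak in silico support
    Variant("VAR_COMMON", 25, 3),  # hypothetical: frequently seen in gnomAD
    Variant("VAR_RARE", 0, 7),     # hypothetical: ultra-rare, strong in silico support
]
for v in variants:
    print(v.name, "demotable" if is_demotable(v) else "retained")
```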

  10. Non-invasive coronary angiography with multislice computed tomography. Technology, methods, preliminary experience and prospects.

    Science.gov (United States)

    Traversi, Egidio; Bertoli, Giuseppe; Barazzoni, Giancarlo; Baldi, Maurizia; Tramarin, Roberto

    2004-02-01

    The recent technical developments in multislice computed tomography (MSCT), with ECG retro-gated image reconstruction, have elicited great interest in the possibility of accurate non-invasive imaging of the coronary arteries. The latest generation of MSCT systems with 8-16 rows of detectors permits acquisition of the whole cardiac volume during a single 15-20 s breath-hold with a submillimetric definition of the images and an outstanding signal-to-noise ratio. Thus the race between MSCT, electron beam computed tomography and cardiac magnetic resonance imaging over which can best provide routine and reliable imaging of the coronary arteries in clinical practice has recommenced. Currently available MSCT systems offer different options for both cardiac image acquisition and reconstruction, including multiplanar and curved multiplanar reconstruction, three-dimensional volume rendering, maximum intensity projection, and virtual angioscopy. In our preliminary experience including 176 patients suffering from known or suspected coronary artery disease, MSCT was feasible in 161 (91.5%) and showed a sensitivity of 80.4% and a specificity of 80.3%, with respect to standard coronary angiography, in detecting critical stenosis in coronary arteries and artery or venous bypass grafts. These results correspond to a positive predictive value of 58.6% and a negative predictive value of 92.2%. The true role that MSCT is likely to play in the future in non-invasive coronary imaging is still to be defined. Nevertheless, the huge amount of data obtainable by MSCT along with the rapid technological advances, shorter acquisition times and reconstruction algorithm developments will make the technique stronger, and possible applications are expected not only for non-invasive coronary angiography, but also for cardiac function and myocardial perfusion evaluation, as an all-in-one examination.
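
    As a quick consistency check of the figures above: positive and negative predictive values follow from sensitivity, specificity and the prevalence of critical stenosis in the cohort via Bayes' rule. The sketch below assumes a per-lesion prevalence of about 26%, a value inferred here to reproduce the reported predictive values rather than one stated in the paper:

```python
# PPV and NPV from sensitivity, specificity and an assumed prevalence.
def predictive_values(sens, spec, prev):
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

ppv, npv = predictive_values(sens=0.804, spec=0.803, prev=0.26)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")
# ~58.9% and ~92.1%, close to the reported 58.6% and 92.2%
```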

  11. Highlights from the previous volumes

    Science.gov (United States)

    Vergini, Eduardo G.; Pan, Y.; Vardi, R. et al.; Akkermans, Eric et al.; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time; Superconductivity and magnetic order in the half-Heusler compound ErPdBi; An experimental evidence-based computational paradigm for new logic-gates in neuronal activity; Universality in the symmetric exclusion process and diffusive systems

  12. FCJ-133 The Scripted Spaces of Urban Ubiquitous Computing: The experience, poetics, and politics of public scripted space

    Directory of Open Access Journals (Sweden)

    Christian Ulrik Andersen

    2011-12-01

    Full Text Available This article proposes and introduces the concept of ‘scripted space’ as a new perspective on ubiquitous computing in urban environments. Drawing on urban history, computer games, and a workshop study of the city of Lund, the article discusses the experience of digitally scripted spaces, and their relation to the history of public spaces. In conclusion, the article discusses the potential for employing scripted spaces as a reinvigoration of urban public space.

  13. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristic properties of the petroleum-bearing collectors are presented. Recommendations are given for their detailed study and for using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  14. PanDA: A New Paradigm for Distributed Computing in HEP Through the Lens of ATLAS and other Experiments

    CERN Document Server

    De, K; The ATLAS collaboration; Maeno, T; Nilsson, P; Wenaus, T

    2014-01-01

    Experiments at the Large Hadron Collider (LHC) face unprecedented computing challenges. Heterogeneous resources are distributed worldwide, thousands of physicists analyzing the data need remote access to hundreds of computing sites, the volume of processed data is beyond the exabyte scale, and data processing requires more than a billion hours of computing usage per year. The PanDA (Production and Distributed Analysis) system was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. In the process, the old batch job paradigm of computing in HEP was discarded in favor of a far more flexible and scalable model. The success of PanDA in ATLAS is leading to widespread adoption and testing by other experiments. PanDA is the first exascale workload management system in HEP, already operating at a million computing jobs per day, and processing over an exabyte of data in 2013. We will describe the design and implementation of PanDA, present data on the performance of PanDA a...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. Development of a real-time monitoring system and integration of different computer system in LHD experiments using IP multicast

    International Nuclear Information System (INIS)

    Emoto, Masahiko; Nakamura, Yukio; Teramachi, Yasuaki; Okumura, Haruhiko; Yamaguchi, Satarou

    2002-01-01

    There are several different computer systems in the LHD (Large Helical Device) experiment, and therefore coordination of these computers is key to performing the experiment. A real-time monitoring system is also important because long discharges are needed in the LHD experiment. In order to achieve these two requirements, the technique of IP multicast is adopted. The authors have developed three new systems: the first is the real-time monitoring system, the second is the delivery system for the shot number, and the third is the real-time notification system for plasma data registration. The first system can deliver the real-time monitoring data to the LHD experimental LAN through the firewall of the LHD control LAN in NIFS. The other two systems are used to achieve close coordination of the different computers in the LHD plasma experiment. From these various experiences we can conclude that IP multicast is very useful both in the LHD experiment and in future large plasma experiments. (author)
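
    A minimal sketch of the IP multicast pattern described above, assuming a single sender that periodically multicasts monitoring samples and any number of listeners that join the group; the group address, port and JSON payload are illustrative choices, not the LHD system's actual protocol:

```python
# One sender multicasts monitoring samples; any number of receivers join the
# multicast group and get every datagram, with no per-receiver setup on the sender.
import json, socket, struct, sys, time

GROUP, PORT = "239.1.1.1", 5007   # administratively scoped multicast group (assumed)

def sender():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay on the local subnet
    shot = 0
    while True:
        payload = json.dumps({"shot": shot, "t": time.time(), "signal": 0.0})
        s.sendto(payload.encode(), (GROUP, PORT))
        shot += 1
        time.sleep(1.0)

def receiver():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)  # join the group
    while True:
        data, addr = s.recvfrom(4096)
        print(addr, json.loads(data))

if __name__ == "__main__":
    receiver() if "--listen" in sys.argv else sender()
```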

  20. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  1. Computers in medical education 2. Use of a computer package to supplement the clinical experience in a surgical clerkship: an objective evaluation.

    Science.gov (United States)

    Devitt, P; Cehic, D; Palmer, E

    1998-06-01

    Student teaching of surgery has been devolved from the university in an effort to increase and broaden undergraduate clinical experience. In order to ensure uniformity of learning we have defined learning objectives and provided a computer-based package to supplement clinical teaching. A study was undertaken to evaluate the place of computer-based learning in a clinical environment. Twelve modules were provided for study during a 6-week attachment. These covered clinical problems related to cardiology, neurosurgery and gastrointestinal haemorrhage. Eighty-four fourth-year students undertook a pre- and post-test assessment on these three topics as well as acute abdominal pain. No extra learning material on the latter topic was provided during the attachment. While all students showed significant improvement in performance in the post-test assessment, those who had access to the computer material performed significantly better than did the controls. Within the topics, students in both groups performed equally well on the post-test assessment of acute abdominal pain but the control group's performance was significantly lacking on the topic of gastrointestinal haemorrhage, suggesting that the bulk of learning on this subject came from the computer material and little from the clinical attachment. This type of learning resource can be used to supplement the student's clinical experience and at the same time monitor what they learn during clinical clerkships and identify areas of weakness.

  2. The growth of language: Universal Grammar, experience, and principles of computation.

    Science.gov (United States)

    Yang, Charles; Crain, Stephen; Berwick, Robert C; Chomsky, Noam; Bolhuis, Johan J

    2017-10-01

    Human infants develop language remarkably rapidly and without overt instruction. We argue that the distinctive ontogenesis of child language arises from the interplay of three factors: domain-specific principles of language (Universal Grammar), external experience, and properties of non-linguistic domains of cognition including general learning mechanisms and principles of efficient computation. We review developmental evidence that children make use of hierarchically composed structures ('Merge') from the earliest stages and at all levels of linguistic organization. At the same time, longitudinal trajectories of development show sensitivity to the quantity of specific patterns in the input, which suggests the use of probabilistic processes as well as inductive learning mechanisms that are suitable for the psychological constraints on language acquisition. By considering the place of language in human biology and evolution, we propose an approach that integrates principles from Universal Grammar and constraints from other domains of cognition. We outline some initial results of this approach as well as challenges for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Computational modeling of direct-drive fusion pellets and KrF-driven foil experiments

    International Nuclear Information System (INIS)

    Gardner, J.H.; Schmitt, A.J.; Dahlburg, J.P.; Pawley, C.J.; Bodner, S.E.; Obenschain, S.P.; Serlin, V.; Aglitskiy, Y.

    1998-01-01

    FAST is a radiation transport hydrodynamics code that simulates laser matter interactions of relevance to direct-drive laser fusion target design. FAST solves the Euler equations of compressible flow using the Flux-Corrected Transport finite volume method. The advection algorithm provides accurate computation of flows from nearly incompressible vortical flows to those that are highly compressible and dominated by strong pressure and density gradients. In this paper we describe the numerical techniques and physics packages. FAST has also been benchmarked with Nike laser facility experiments in which linearly perturbed, low adiabat planar plastic targets are ablatively accelerated to velocities approaching 10⁷ cm/s. Over a range of perturbation wavelengths, the code results agree with the measured Rayleigh–Taylor growth from the linear through the deeply nonlinear regimes. FAST has been applied to the two-dimensional spherical simulation design to provide surface finish and laser bandwidth tolerances for a promising new direct-drive pellet that uses a foam ablator
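
    A much-reduced illustration of the flux-corrected transport idea named above, applied to one-dimensional linear advection with periodic boundaries rather than the multidimensional Euler equations solved by FAST; the grid size, CFL number and square-pulse initial condition are arbitrary choices:

```python
# Boris-Book style FCT step: low-order upwind fluxes plus a limited antidiffusive
# correction toward Lax-Wendroff, so a sharp pulse is advected without new extrema.
import numpy as np

nx, c, cfl = 200, 1.0, 0.5
dx = 1.0 / nx
dt = cfl * dx / c
x = (np.arange(nx) + 0.5) * dx
u = np.where((x > 0.3) & (x < 0.5), 1.0, 0.0)   # square pulse

lam = dt / dx
for _ in range(int(0.3 / dt)):                   # advect the pulse a distance of 0.3
    up1 = np.roll(u, -1)                          # u_{i+1}
    # low-order (upwind) and high-order (Lax-Wendroff) fluxes at i+1/2, scaled by dt/dx
    f_low = lam * c * u
    f_high = lam * (0.5 * c * (u + up1) - 0.5 * c ** 2 * dt / dx * (up1 - u))
    a = f_high - f_low                            # scaled antidiffusive flux at i+1/2
    # transported-diffused solution from the low-order step
    utd = u - (f_low - np.roll(f_low, 1))
    # limiter: never let the antidiffusion create new local extrema
    d_up = np.roll(utd, -2) - np.roll(utd, -1)    # utd_{i+2} - utd_{i+1}
    d_dn = utd - np.roll(utd, 1)                  # utd_i   - utd_{i-1}
    s = np.sign(a)
    a_c = s * np.maximum(0.0, np.minimum(np.abs(a), np.minimum(s * d_up, s * d_dn)))
    u = utd - (a_c - np.roll(a_c, 1))

print("min %.3f max %.3f (no new over/undershoots)" % (u.min(), u.max()))
```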

  4. Comparisons of LES and RANS Computations with PIV Experiments on a Cylindrical Cavity Flow

    Directory of Open Access Journals (Sweden)

    Wen-Tao Su

    2013-01-01

    Full Text Available A comparison of numerical computations by large eddy simulation (LES) and Reynolds-averaged Navier-Stokes (RANS) methods with experiments on a cylindrical cavity flow was conducted in this paper. Numerical simulations and particle image velocimetry (PIV) measurement were performed for two Reynolds numbers of the flow at a constant aspect ratio of H/R = 2.4 (R is the radius of the cylindrical cavity, and H is the liquid level). The three components of velocity were extracted from 100 sequential PIV-measured velocity frames with averaging, in order to illustrate the axial jet flow evolution and circulation distribution in the radial direction. The results show that LES can reproduce well the fine structure inside the swirling motions in both the meridional and the horizontal planes, as well as the distributions of velocity components and the circulation, in good agreement with experimental results, while the RANS method only provided a rough trend of the inside vortex structure. The analysis of velocity profiles at various locations indicates that LES is more suitable for predicting the complex flow characteristics inside complicated three-dimensional geometries.
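
    The frame-averaging and circulation profile mentioned above amount to a few lines of post-processing. The sketch below uses a synthetic Lamb-Oseen-like swirl in place of the measured PIV frames; the grid size, noise level and vortex parameters are assumptions, not the cavity-flow data:

```python
# Average a stack of velocity frames and evaluate Gamma(r) = 2*pi*r*u_theta(r)
# on thin annuli of increasing radius.
import numpy as np

rng = np.random.default_rng(2)
n, frames = 128, 100
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
R = np.hypot(X, Y)
theta = np.arctan2(Y, X)

# tangential velocity of a decaying vortex, plus per-frame measurement noise
u_theta_true = (1.0 - np.exp(-(R / 0.4) ** 2)) / np.maximum(R, 1e-6)
stack_u = np.empty((frames, n, n))
stack_v = np.empty((frames, n, n))
for k in range(frames):
    ut = u_theta_true * (1 + 0.05 * rng.standard_normal((n, n)))
    stack_u[k] = -ut * np.sin(theta)     # u = -u_theta * sin(theta)
    stack_v[k] = ut * np.cos(theta)      # v =  u_theta * cos(theta)

u_mean, v_mean = stack_u.mean(axis=0), stack_v.mean(axis=0)
u_theta_mean = -u_mean * np.sin(theta) + v_mean * np.cos(theta)

# bin-average u_theta over thin annuli and form the circulation profile
for r in np.linspace(0.1, 0.9, 9):
    ring = (R > r - 0.02) & (R < r + 0.02)
    gamma = 2 * np.pi * r * u_theta_mean[ring].mean()
    print(f"r = {r:.2f}  Gamma = {gamma:.3f}")
```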

  5. Experiment on a novel user input for computer interface utilizing tongue input for the severely disabled.

    Science.gov (United States)

    Kencana, Andy Prima; Heng, John

    2008-11-01

    This paper introduces a novel passive tongue control and tracking device. The device is intended to be used by the severely disabled or quadriplegic person. The main focus of this device, when compared to the other existing tongue tracking devices, is that the sensor employed is passive, which means it requires no powered electrical sensor to be inserted into the user's mouth and hence no trailing wires. This haptic interface device employs inductive sensors to track the position of the user's tongue. The device is able to perform two main PC functions: those of the keyboard and the mouse. The results show that this device allows the severely disabled person to have some control over his or her environment, such as turning daily electrical devices or appliances on and off, or to use it as a viable PC Human Computer Interface (HCI) by tongue control. The operating principle and set-up of such a novel passive tongue HCI have been established with successful laboratory trials and experiments. Further clinical trials will be required to test out the device on disabled persons before it is ready for future commercial development.

  6. Estimating the Diffusion Coefficients of Sugars Using Diffusion Experiments in Agar-Gel and Computer Simulations.

    Science.gov (United States)

    Miyamoto, Shuichi; Atsuyama, Kenji; Ekino, Keisuke; Shin, Takashi

    2018-01-01

    The isolation of useful microbes is one of the traditional approaches for lead generation in drug discovery. As an effective technique for microbe isolation, we recently developed a multidimensional diffusion-based gradient culture system of microbes. In order to enhance the utility of the system, it is favorable to have diffusion coefficients of nutrients such as sugars in the culture medium beforehand. We have, therefore, built a simple and convenient experimental system that uses agar-gel to observe diffusion. Next, we performed computer simulations, based on random-walk concepts, of the experimental diffusion system and derived correlation formulas that relate observable diffusion data to diffusion coefficients. Finally, we applied these correlation formulas to our experimentally-determined diffusion data to estimate the diffusion coefficients of sugars. Our values for these coefficients agree reasonably well with values published in the literature. The effectiveness of our simple technique, which has elucidated the diffusion coefficients of some molecules which are rarely reported (e.g., galactose, trehalose, and glycerol), is demonstrated by the strong correspondence between the literature values and those obtained in our experiments.
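
    A minimal sketch of the random-walk simulation idea mentioned above: walkers take Gaussian steps and the diffusion coefficient is recovered from the slope of the mean squared displacement. The time step, walker count and target D are illustrative values, not the parameters used for the agar-gel system:

```python
# Recover D from MSD(t) = 2*dim*D*t of simulated random walkers.
import numpy as np

rng = np.random.default_rng(3)
D_true = 6.7e-10                  # m^2/s, a typical order of magnitude for a sugar in water
dt, n_steps, n_walkers, dim = 0.1, 1000, 2000, 2
sigma = np.sqrt(2 * D_true * dt)              # per-axis step standard deviation

steps = sigma * rng.standard_normal((n_steps, n_walkers, dim))
positions = np.cumsum(steps, axis=0)           # trajectories of all walkers
msd = (positions ** 2).sum(axis=2).mean(axis=1)

t = dt * np.arange(1, n_steps + 1)
D_est = np.polyfit(t, msd, 1)[0] / (2 * dim)   # slope of MSD vs t equals 2*dim*D
print(f"estimated D = {D_est:.2e} m^2/s  (input {D_true:.1e} m^2/s)")
```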

  7. Coronary computed tomography angiography with 320-row detector and using the AIDR-3D: initial experience

    International Nuclear Information System (INIS)

    Sasdelli Neto, Roberto; Nomura, Cesar Higa; Macedo, Ana Carolina Sandoval; Bianco, Danilo Perussi; Kay, Fernando Uliana; Szarf, Gilberto; Teles, Gustavo Borges da Silva; Shoji, Hamilton; Santana Netto, Pedro Vieira; Passos, Rodrigo Bastos Duarte; Chate, Rodrigo Caruso; Ishikawa, Walther Yoshiharu; Lima, Joao Paulo Bacellar Costa; Rocha, Marcelo Assis; Marcos, Vinicius Neves; Funari, Marcelo Buarque de Gusmao; Failla, Bruna Bonaventura

    2013-01-01

    Coronary computed tomography angiography (coronary CTA) is a powerful non-invasive imaging method to evaluate coronary artery disease. Nowadays, the estimated effective radiation dose of coronary CTA can be dramatically reduced using state-of-the-art scanners, such as 320-row detector CT (320-CT), without changing coronary CTA diagnostic accuracy. To optimize and further reduce the radiation dose, new iterative reconstruction algorithms were released recently by several CT manufacturers, and now they are used routinely in coronary CTA. This paper presents our first experience using coronary CTA with 320-CT and the Adaptive Iterative Dose Reduction 3D (AIDR-3D). In addition, we describe the current indications for coronary CTA in our practice as well as the standard acquisition protocols and protocols related to CT application for radiation dose reduction. In conclusion, coronary CTA radiation dose can be dramatically reduced following the 'as low as reasonably achievable' principle by a combination of exam indication and well-documented techniques for radiation dose reduction, such as beta blockers, low-kV, and also the newest iterative dose reduction software such as AIDR-3D. (author)

  8. Movement-to-music computer technology: a developmental play experience for children with severe physical disabilities.

    Science.gov (United States)

    Tam, Cynthia; Schwellnus, Heidi; Eaton, Ceilidh; Hamdani, Yani; Lamont, Andrea; Chau, Tom

    2007-01-01

    Children with severe physical disabilities often lack the physical skills to explore their environment independently, and to play with toys or musical instruments. The movement-to-music (MTM) system is an affordable computer system that allows children with limited movements to play and create music. The present study explored parents' experiences of using the MTM system with their children. A qualitative methodology employing in-depth interview techniques was used with six mothers and their children. The themes extracted from the data were organized under two main concepts of the International Classification of Functioning, Disability, and Health (ICF) (WHO, 2001) framework. The results showed that the MTM expanded horizons for the child along the ICF health dimensions and the MTM had a positive impact on ICF environmental determinants of health. The small sample size should be noted as a limitation of this study. Further research should be carried out with a larger sample of children with restricted mobility to obtain a better understanding of the impact of MTM technology on children's psychosocial development.

  9. Computer Game-Based Learning: Perceptions and Experiences of Senior Chinese Adults

    Science.gov (United States)

    Wang, Feihong; Lockee, Barbara B.; Burton, John K.

    2012-01-01

    The purpose of this study was to investigate senior Chinese adults' potential acceptance of computer game-based learning (CGBL) by probing their perceptions of computer game play and their perceived impacts of game play on their learning of computer skills and life satisfaction. A total of 60 senior adults from a local senior adult learning center…

  10. Computer-aided digitization of graphical mass flow data from the 1/5-scale Mark I BWR pressure suppression experiment

    International Nuclear Information System (INIS)

    Holman, G.S.; McCauley, E.W.

    1979-01-01

    Periodically in the analysis of engineering data, it becomes necessary to use graphical output as the solitary source of accurate numerical data for use in subsequent calculations. Such was our experience in the extended analysis of data from the 1/5-scale Mark I boiling water reactor pressure suppression experiment (PSE). The original numerical results of extensive computer calculations performed at the time of the actual PSE tests and required for the later extended analysis program had not been retained as archival records. We were, therefore, required to recover the previously calculated data, either by a complete recalculation or from available computer graphics records. Time constraints suggested recovery from the graphics records as the more viable approach. This report describes two different approaches to recovery of digital data from graphics records. One, combining hardware and software techniques immediately available to us at LLL, proved to be inadequate for our purposes. The other approach required the development of pure software techniques that interfaced with LLL computer graphics to unpack digital coordinate information directly from graphics files. As a result of this effort, we were able to recover the required data with no significant loss in the accuracy of the original calculations

  11. Elucidating reactivity regimes in cyclopentane oxidation: Jet stirred reactor experiments, computational chemistry, and kinetic modeling

    KAUST Repository

    Rachidi, Mariam El; Thion, Sébastien; Togbé, Casimir; Dayma, Guillaume; Mehl, Marco; Dagaut, Philippe; Pitz, William J.; Zádor, Judit; Sarathy, Mani

    2016-01-01

    This study is concerned with the identification and quantification of species generated during the combustion of cyclopentane in a jet stirred reactor (JSR). Experiments were carried out for temperatures between 740 and 1250K, equivalence ratios from 0.5 to 3.0, and at an operating pressure of 10atm. The fuel concentration was kept at 0.1% and the residence time of the fuel/O2/N2 mixture was maintained at 0.7s. The reactant, product, and intermediate species concentration profiles were measured using gas chromatography and Fourier transform infrared spectroscopy. The concentration profiles of cyclopentane indicate inhibition of reactivity between 850-1000K for ϕ = 2.0 and ϕ = 3.0. This behavior is interesting, as it has not been observed previously for other fuel molecules, cyclic or non-cyclic. A kinetic model including both low- and high-temperature reaction pathways was developed and used to simulate the JSR experiments. The pressure-dependent rate coefficients of all relevant reactions lying on the PES of cyclopentyl + O2, as well as the C-C and C-H scission reactions of the cyclopentyl radical were calculated at the UCCSD(T)-F12b/cc-pVTZ-F12//M06-2X/6-311++G(d,p) level of theory. The simulations reproduced the unique reactivity trend of cyclopentane and the measured concentration profiles of intermediate and product species. Sensitivity and reaction path analyses indicate that this reactivity trend may be attributed to differences in the reactivity of allyl radical at different conditions, and it is highly sensitive to the C-C/C-H scission branching ratio of the cyclopentyl radical decomposition.

  12. Elucidating reactivity regimes in cyclopentane oxidation: Jet stirred reactor experiments, computational chemistry, and kinetic modeling

    KAUST Repository

    Rachidi, Mariam El

    2016-06-23

    This study is concerned with the identification and quantification of species generated during the combustion of cyclopentane in a jet stirred reactor (JSR). Experiments were carried out for temperatures between 740 and 1250K, equivalence ratios from 0.5 to 3.0, and at an operating pressure of 10atm. The fuel concentration was kept at 0.1% and the residence time of the fuel/O2/N2 mixture was maintained at 0.7s. The reactant, product, and intermediate species concentration profiles were measured using gas chromatography and Fourier transform infrared spectroscopy. The concentration profiles of cyclopentane indicate inhibition of reactivity between 850-1000K for ϕ = 2.0 and ϕ = 3.0. This behavior is interesting, as it has not been observed previously for other fuel molecules, cyclic or non-cyclic. A kinetic model including both low- and high-temperature reaction pathways was developed and used to simulate the JSR experiments. The pressure-dependent rate coefficients of all relevant reactions lying on the PES of cyclopentyl + O2, as well as the C-C and C-H scission reactions of the cyclopentyl radical were calculated at the UCCSD(T)-F12b/cc-pVTZ-F12//M06-2X/6-311++G(d,p) level of theory. The simulations reproduced the unique reactivity trend of cyclopentane and the measured concentration profiles of intermediate and product species. Sensitivity and reaction path analyses indicate that this reactivity trend may be attributed to differences in the reactivity of allyl radical at different conditions, and it is highly sensitive to the C-C/C-H scission branching ratio of the cyclopentyl radical decomposition.

  13. A dataflow meta-computing framework for event processing in the H1 experiment

    International Nuclear Information System (INIS)

    Campbell, A.; Gerhards, R.; Mkrtchyan, T.; Levonian, S.; Grab, C.; Martyniak, J.; Nowak, J.

    2001-01-01

    Linux-based networked PC clusters are replacing both the VME non-uniform direct memory access systems and SMP shared-memory systems used previously for the online event filtering and reconstruction. To allow optimal use of the distributed resources of PC clusters, an open software framework is presently being developed based on a dataflow paradigm for event processing. This framework allows for the distribution of the data of physics events and associated calibration data to multiple computers from multiple input sources for processing and the subsequent collection of the processed events at multiple outputs. The basis of the system is the event repository, basically a first-in first-out event store which may be read and written in a manner similar to sequential file access. Events are stored in and transferred between repositories as suitably large sequences to enable high throughput. Multiple readers can read simultaneously from a single repository to receive event sequences and multiple writers can insert event sequences into a repository. Hence repositories are used for event distribution and collection. To support synchronisation of the event flow, the repository implements barriers. A barrier must be written by all the writers of a repository before any reader can read the barrier. A reader must read a barrier before it may receive data from behind it. Only after all readers have read the barrier is the barrier removed from the repository. A barrier may also have attached data. In this way calibration data can be distributed to all processing units. The repositories are implemented as multi-threaded CORBA objects in C++ and CORBA is used for all data transfers. Job setup scripts are written in Python and interactive status and histogram display is provided by a Java program. Jobs run under the PBS batch system providing shared use of resources for online triggering, offline mass reprocessing and user analysis jobs
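
    A much-simplified, single-process sketch of the repository and barrier semantics described above: event sequences are handed to exactly one reader each, every reader sees each barrier (and its attached calibration payload) once, and a barrier is retired only after all readers have read it. The real framework is distributed and CORBA-based; the class and method names here are invented for illustration:

```python
# Toy event repository: a FIFO of event sequences interleaved with barriers.
import threading
from collections import deque

class Repository:
    def __init__(self, n_readers):
        self.items = deque()          # ("events", seq) or ("barrier", payload, seen_set)
        self.n_readers = n_readers
        self.cv = threading.Condition()

    def write_events(self, seq):
        with self.cv:
            self.items.append(("events", list(seq)))
            self.cv.notify_all()

    def write_barrier(self, payload=None):
        with self.cv:
            self.items.append(("barrier", payload, set()))
            self.cv.notify_all()

    def read(self, reader_id):
        """Block until an item is available for this reader and return it."""
        with self.cv:
            while True:
                # retire leading barriers that every reader has already consumed
                while (self.items and self.items[0][0] == "barrier"
                       and len(self.items[0][2]) == self.n_readers):
                    self.items.popleft()
                for item in list(self.items):
                    if item[0] == "barrier":
                        if reader_id in item[2]:
                            continue              # already read: may look behind it
                        item[2].add(reader_id)
                        return ("barrier", item[1])
                    self.items.remove(item)       # event data goes to one reader only
                    return ("events", item[1])
                self.cv.wait()

repo = Repository(n_readers=2)
repo.write_events([1, 2, 3])
repo.write_barrier({"calibration": "v1"})
print(repo.read(0))   # ('events', [1, 2, 3]) -- events go to whichever reader asks first
print(repo.read(0))   # ('barrier', {'calibration': 'v1'})
print(repo.read(1))   # ('barrier', {'calibration': 'v1'}) -- broadcast to every reader
```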

  14. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  15. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  16. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  17. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  20. Using the computer-driven VR environment to promote experiences of natural world immersion

    Science.gov (United States)

    Frank, Lisa A.

    2013-03-01

    In December, 2011, over 800 people experienced the exhibit, :"der"//pattern for a virtual environment, created for the fully immersive CAVE™ at the University of Wisconsin-Madison. This exhibition took my nature-based photographic work and reinterpreted it for virtual reality (VR). Varied responses such as: "It's like a moment of joy," or "I had to see it twice," or "I'm still thinking about it weeks later" were common. Although an implied goal of my 2D artwork is to create a connection that makes viewers more aware of what it means to be a part of the natural world, these six VR environments opened up an unexpected area of inquiry that my 2D work has not. Even as the experience was mediated by machines, there was a softening at the interface between technology and human sensibility. Somehow, for some people, through the unlikely auspices of a computer-driven environment, the project spoke to a human essence that they connected with in a way that went beyond all expectations and felt completely out of my hands. Other interesting behaviors were noted: in some scenarios some spoke of intense anxiety, acrophobia, claustrophobia, and even fear of death when the scene took them underground. These environments were believable enough to cause extreme responses and disorientation for some people; were fun, pleasant and wonder-filled for most; and were liberating, poetic and meditative for many others. The exhibition seemed to promote imaginative skills, creativity, emotional insight, and environmental sensitivity. It also revealed the CAVE™ to be a powerful tool that can encourage uniquely productive experiences. Quite by accident, I watched as these nature-based environments revealed and articulated an essential relationship between the human spirit and the physical world. The CAVE™ is certainly not a natural space, but there is clear potential to explore virtual environments as a path to better and deeper connections between people and nature. We've long associated contact

  1. Golimumab in patients with active rheumatoid arthritis who have previous experience with tumour necrosis factor inhibitors: results of a long-term extension of the randomised, double-blind, placebo-controlled GO-AFTER study through week 160

    NARCIS (Netherlands)

    Smolen, Josef S.; Kay, Jonathan; Landewé, Robert B. M.; Matteson, Eric L.; Gaylis, Norman; Wollenhaupt, Jurgen; Murphy, Frederick T.; Zhou, Yiying; Hsia, Elizabeth C.; Doyle, Mittie K.

    2012-01-01

    The aim of this study was to assess long-term golimumab therapy in patients with rheumatoid arthritis (RA) who discontinued previous tumour necrosis factor alpha (TNFα) inhibitor(s) for any reason. Results through week 24 of this multicentre, randomised, double-blind, placebo-controlled study of

  2. Gaining Efficiency of Computational Experiments in Modeling the Flight Vehicle Movement

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2017-01-01

    Full Text Available The paper considers one of the important aspects of gaining efficiency in computational experiments, namely grid optimization. Solving this problem will ultimately produce a better system, because multivariate simulation is the basis for applying optimization methods by the specified criteria and for identifying problems in the functioning of technical systems. The paper discusses a class of moving objects, representing a body of revolution, which, for one reason or another, undergoes deformation of the casing. Analyses using the author's techniques have shown that there are complex functional dependencies of the aerodynamic characteristics of the studied class of deformed objects. A literature review is presented on new ways of organizing the calculations, data storage and transfer. Methods of forming grids are analysed, including those used in initial calculations and in the visualization of information. In addition to regular grids, unstructured grids are considered, including those for dynamic spatial-temporal information. Attention is drawn to the problem of efficient retrieval of information. The paper notes relevant capabilities for working with large data volumes, including OLAP technology, multidimensional cubes (Data Cube), and, finally, an integrated Data Mining approach. Despite the huge number of successful modern approaches to the problems of forming, storing and processing multidimensional data, it should be noted that these tools are computationally quite expensive. The expenditure of using such special tools often exceeds the cost of the directly conducted computational experiments themselves. In this regard, it was recognized that there is no need to abandon traditional tools; instead, the focus should be on directly increasing their efficiency. Within the framework of the applied problem under consideration, such a tool was the formation of optimal grids. The optimal grid was understood to be a grid in the N

  3. Individualized computer-aided education in mammography based on user modeling: concept and preliminary experiments.

    Science.gov (United States)

    Mazurowski, Maciej A; Baker, Jay A; Barnhart, Huiman X; Tourassi, Georgia D

    2010-03-01

    The authors propose the framework for an individualized adaptive computer-aided educational system in mammography that is based on user modeling. The underlying hypothesis is that user models can be developed to capture the individual error-making patterns of radiologists-in-training. In this pilot study, the authors test the above hypothesis for the task of breast cancer diagnosis in mammograms. The concept of a user model was formalized as the function that relates image features to the likelihood/extent of the diagnostic error made by a radiologist-in-training and therefore to the level of difficulty that a case will pose to the radiologist-in-training (or "user"). Then, machine learning algorithms were implemented to build such user models. Specifically, the authors explored k-nearest neighbor, artificial neural networks, and multiple regression for the task of building the model using observer data collected from ten Radiology residents at Duke University Medical Center for the problem of breast mass diagnosis in mammograms. For each resident, a user-specific model was constructed that predicts the user's expected level of difficulty for each presented case based on two BI-RADS image features. In the experiments, a leave-one-out data handling scheme was applied to assign each case to a low-predicted-difficulty or a high-predicted-difficulty group for each resident based on each of the three user models. To evaluate whether the user model is useful in predicting difficulty, the authors performed statistical tests using the generalized estimating equations approach to determine whether the mean actual error is the same or not between the low-predicted-difficulty group and the high-predicted-difficulty group. When the results for all observers were pulled together, the actual errors made by residents were statistically significantly higher for cases in the high-predicted-difficulty group than for cases in the low-predicted-difficulty group for all modeling
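
    A small sketch of the user-model idea above: a k-nearest-neighbour regressor maps two image features to the error a particular trainee made on past cases, and new cases are then ranked by predicted difficulty. The synthetic feature values and errors below stand in for the observer data; they are not the Duke study data:

```python
# Per-trainee difficulty model: predict expected error for a new case as the
# mean error over the k most similar previously seen cases.
import numpy as np

rng = np.random.default_rng(4)
# two BI-RADS-style categorical features per past case (placeholder values)
features = rng.integers(1, 6, size=(60, 2)).astype(float)
# synthetic "diagnostic error" for this trainee, loosely tied to one feature
errors = 0.2 * features[:, 0] + 0.1 * rng.standard_normal(60)

def knn_predict(train_x, train_y, x, k=5):
    """Predicted error for case x: mean error of the k nearest past cases."""
    d = np.linalg.norm(train_x - x, axis=1)
    return train_y[np.argsort(d)[:k]].mean()

new_cases = rng.integers(1, 6, size=(5, 2)).astype(float)
pred = [knn_predict(features, errors, c) for c in new_cases]
for i in np.argsort(pred)[::-1]:            # hardest predicted cases first
    print(f"case features {new_cases[i]} -> predicted difficulty {pred[i]:.2f}")
```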

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  5. COMPUTING EXPERIMENT FOR ASSESSMENT OF AERODYNAMIC CHARACTERISTICS OF SEPARATE ELEMENTS IN THE STRUCTURE OF THE FUSELAGE OF A HELICOPTER

    Directory of Open Access Journals (Sweden)

    V. A. Ivchin

    2015-01-01

    Full Text Available The present publication describes the calculation of the aerodynamic characteristics of a helicopter fuselage and its separate elements by computational experiment. On the basis of the commercial package ANSYS CFX, the technique has been established, and the longitudinal and lateral characteristics of the helicopter fuselage in various flight modes have been calculated.

  6. Computed tomography guided needle biopsy: experience from 1,300 procedures

    Energy Technology Data Exchange (ETDEWEB)

    Chojniak, Rubens; Isberner, Rony Klaus; Viana, Luciana Marinho; Yu, Liao Shin; Aita, Alessandro Amorim; Soares, Fernando Augusto [Hospital do Cancer A.C. Camargo, Sao Paulo, SP (Brazil). Dept. de Radiologia e Patologia]

    2006-01-15

    Context and objective: computed tomography (CT) guided biopsy is widely accepted as effective and safe for diagnosis in many settings. Accuracy depends on target organ and needle type. Cutting needles present advantages over fine needles. This study presents experience from CT guided biopsies performed at an oncology center. Design and setting: retrospective study at Hospital do Cancer A. C. Camargo, Sao Paulo. Methods: 1,300 consecutive CT guided biopsies performed between July 1994 and February 2000 were analyzed. Nodules or masses were suspected of being a primary malignancy in 845 cases (65%) or a metastatic lesion in 455 (35%). 628 lesions were thoracic, 281 abdominal, 208 retroperitoneal, 134 musculoskeletal and 49 head/neck. All biopsies were performed by one radiologist or under his supervision: 765 (59%) with 22-gauge fine-needle/aspiration technique and 535 (41%) with automated 16 or 18-gauge cutting-needle biopsy. Results: adequate samples were obtained in 70-92% of fine-needle and 93-100% of cutting-needle biopsies. The specific diagnosis rates were 54-67% for fine-needle and 82-100% for cutting-needle biopsies, according to biopsy site. For any site, sample adequacy and specific diagnosis rate were always better for cutting-needle biopsy. Among 530 lung biopsies, there were 84 pneumothorax (16%) and two hemothorax (0.3%) cases, with thoracic drainage in 24 (4.9%). Among abdominal and retroperitoneal biopsies, there were two cases of major bleeding and one of peritonitis. Conclusion: both types of needle showed satisfactory results, but cutting-needle biopsy should be used when a specific diagnosis is desired, without a greater incidence of complications. (author)

  7. Computation and experiment results of the grounding model of Three Gorges Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xishan; Zhang Yuanfang; Yu Jianhui; Chen Cixuan [Wuhan University of Hydraulic and Electrical Engineering (China)]; Qin Liming; Xu Jun; Shu Lianfu [Yangtze River Water Resources Commission, Wuhan (China)]

    1999-07-01

    A model for the computation of the grounding parameters of the grids of Three Gorges Power Plant (TGPP) on the Yangtze River is presented in this paper. Using this model, computation and analysis of the grounding grids are carried out. The results show that the reinforcing grid of the dam is the main body for current dissipation. It must be reliably welded to form a good grounding grid. The experimental results show that the method and program of the computations are correct. (UK)

  8. Impact of Online/Internet Marketing in Enhancing Consumer Experience on Computer Industry (Case of Malaysia)

    OpenAIRE

    Ramin Azadavar, Solmohammad Bastam, Hassan Dehghan Dehnavi, Hamed Armesh, Mojgan Sharifi Rayeni

    2011-01-01

    As far as businesses are concerned, the internet has been subject to a variety of experimentations that seek to determine the viability of using the internet to improve business practices in various industries, especially the computer industry in Malaysia. One particular aspect of business is that internet marketing has a great impact on the computer industry in Malaysia. This research paper is concerned with making a critical examination of the impact of internet/online marketing on computer i...

  9. Static Computer Memory Integrity Testing (SCMIT): An experiment flown on STS-40 as part of GAS payload G-616

    Science.gov (United States)

    Hancock, Thomas

    1993-01-01

    This experiment investigated the integrity of static computer memory (floppy disk media) when exposed to the environment of low earth orbit. The experiment attempted to record soft-event upsets (bit-flips) in static computer memory. Typical conditions that exist in low earth orbit that may cause soft-event upsets include: cosmic rays, low level background radiation, charged fields, static charges, and the earth's magnetic field. Over the years several spacecraft have been affected by soft-event upsets (bit-flips), and these events have caused a loss of data or affected spacecraft guidance and control. This paper describes a commercial spin-off that is being developed from the experiment.
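
    The kind of check implied above, comparing flown media against the pattern written before flight, reduces to an XOR and a bit count. The sketch below uses an in-memory byte array with artificially injected upsets as a placeholder for the floppy-disk images:

```python
# Count single-bit upsets between a reference pattern and a read-back copy.
import numpy as np

rng = np.random.default_rng(5)
reference = rng.integers(0, 256, size=1_440_000, dtype=np.uint8)  # ~1.44 MB image

# simulate a read-back copy with a couple of single-bit upsets injected
readback = reference.copy()
for idx, bit in [(12_345, 3), (987_654, 6)]:
    readback[idx] ^= np.uint8(1 << bit)

diff = np.bitwise_xor(reference, readback)
flipped_bits = int(np.unpackbits(diff).sum())
flipped_bytes = int(np.count_nonzero(diff))
print(f"{flipped_bits} flipped bit(s) in {flipped_bytes} byte(s)")
```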

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the availability of more sites so that they can participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission all data transfer routes in the CMS PhEDEx topology and to keep them exercised. Since mid-February, a transfer volume of about 12 P...

  11. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care.

  12. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  13. Enhancing the Programming Experience for First-Year Engineering Students through Hands-On Integrated Computer Experiences

    Science.gov (United States)

    Canfield, Stephen L.; Ghafoor, Sheikh; Abdelrahman, Mohamed

    2012-01-01

    This paper describes the redesign and implementation of the course, "Introduction to Programming for Engineers" using microcontroller (MCU) hardware as the programming target. The objective of this effort is to improve the programming competency for engineering students by more closely relating the initial programming experience to the student's…

  14. Using Tablet PCs in Classroom for Teaching Human-Computer Interaction: An Experience in High Education

    Science.gov (United States)

    da Silva, André Constantino; Marques, Daniela; de Oliveira, Rodolfo Francisco; Noda, Edgar

    2014-01-01

    The use of computers in the teaching and learning process is investigated by many researchers and, nowadays, due to the available diversity of computing devices, tablets are becoming popular in classrooms too. So what are the advantages and disadvantages of using tablets in the classroom? How can we shape the teaching and learning activities to get the best of…

  15. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results.

  16. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    Science.gov (United States)

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  17. Our experience in the diagnosis of aortic dissection by multislice computed tomography

    International Nuclear Information System (INIS)

    Llerena Rojas, Luis R; Mendoza Rodriguez, Vladimir; Olivares Aquiles, Eddy

    2011-01-01

    Aortic dissection (AD) is the most frequent and life-threatening acute aortic syndrome. Currently, the most widely used method for studying the aorta is multislice computed tomography. The purpose of this paper is to present the most relevant findings in 22 patients with AD studied consecutively by multislice computed tomography.

  18. An Analysis of Creative Process Learning in Computer Game Activities through Player Experiences

    Science.gov (United States)

    Inchamnan, Wilawan

    2016-01-01

    This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning…

  19. Mind the Sheep! User Experience Evaluation & Brain-Computer Interface Games

    NARCIS (Netherlands)

    Gürkök, Hayrettin

    2012-01-01

    A brain-computer interface (BCI) infers our actions (e.g. a movement), intentions (e.g. preparation for a movement) and psychological states (e.g. emotion, attention) by interpreting our brain signals. It uses the inferences it makes to manipulate a computer. Although BCIs have long been used

  20. Experiences Using an Open Source Software Library to Teach Computer Vision Subjects

    Science.gov (United States)

    Cazorla, Miguel; Viejo, Diego

    2015-01-01

    Machine vision is an important subject in computer science and engineering degrees. For laboratory experimentation, it is desirable to have a complete and easy-to-use tool. In this work we present a Java library oriented to teaching computer vision. We have designed and built the library from scratch with emphasis on readability and…

  1. Technological Metaphors and Moral Education: The Hacker Ethic and the Computational Experience

    Science.gov (United States)

    Warnick, Bryan R.

    2004-01-01

    This essay is an attempt to understand how technological metaphors, particularly computer metaphors, are relevant to moral education. After discussing various types of technological metaphors, it is argued that technological metaphors enter moral thought through their "functional descriptions." The computer metaphor is then explored by turning to…

  2. Computational analysis of modern HTGR fuel performance and fission product release during the HFR-EU1 irradiation experiment

    Energy Technology Data Exchange (ETDEWEB)

    Verfondern, Karl, E-mail: k.verfondern@fz-juelich.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); Xhonneux, André, E-mail: xhonneux@lrst.rwth-aachen.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); Nabielek, Heinz, E-mail: heinznabielek@me.com [Research Center Jülich, Monschauerstrasse 61, 52355 Düren (Germany); Allelein, Hans-Josef, E-mail: h.j.allelein@fz-juelich.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); RWTH Aachen, Chair for Reactor Safety and Reactor Technology, 52072 Aachen (Germany)

    2014-07-01

    Highlights: • HFR-EU1 irradiation test demonstrates high quality of HTGR spherical fuel elements. • Irradiation performance is in good agreement with German fuel performance modeling. • International benchmark exercise expected first particle to fail at ∼13–17% FIMA. • EOL silver release is predicted to be in the percentage range. • EOL cesium and strontium are expected to remain at a low level. - Abstract: Various countries engaged in the development and fabrication of modern HTGR fuel have initiated activities of modeling the fuel and fission product release behavior with the aim of predicting the fuel performance under HTGR operating and accident conditions. Verification and validation studies are conducted by code-to-code benchmarking and code-to-experiment comparisons as part of international exercises. The methodology developed in Germany since the 1980s represents valuable and efficient tools to describe fission product release from spherical fuel elements and TRISO fuel performance, respectively, under given conditions. Continued application to new results of irradiation and accident simulation testing demonstrates the appropriateness of the models in terms of a conservative estimation of the source term as part of interactions with HTGR licensing authorities. Within the European irradiation testing program for HTGR fuel and as part of the former EU RAPHAEL project, the HFR-EU1 irradiation experiment explores the potential for high performance of the presently existing German and newly produced Chinese fuel spheres under defined conditions up to high burnups. The fuel irradiation was completed in 2010. Test samples are prepared for further postirradiation examinations (PIE) including heatup simulation testing in the KÜFA-II furnace at the JRC-ITU, Karlsruhe, to be conducted within the on-going ARCHER Project of the European Commission. The paper will describe the application of the German computer models to the HFR-EU1 irradiation test and

  3. Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment

    CERN Document Server

    Grigoryeva, Maria; The ATLAS collaboration

    2018-01-01

    Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as LHC, RHIC, KEK. As the accelerators are being modernized (energy and luminosity are increased), data volumes are rapidly growing and have reached the exabyte scale, which also increases the number of analysis and data processing tasks that compete continuously for computational resources. This growth in processing tasks is accommodated by raising the capacity of the computing environment through the involvement of high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors occur while executing data analysis and processing tasks, caused by software and hardware failures. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...

  4. Working with previously anonymous gamete donors and donor-conceived adults: recent practice experiences of running the DNA-based voluntary information exchange and contact register, UK DonorLink.

    Science.gov (United States)

    Crawshaw, Marilyn; Gunter, Christine; Tidy, Christine; Atherton, Freda

    2013-03-01

    This article describes recent practice experiences with donor-conceived adults, donors, and non-donor-conceived adult children of donors using the voluntary DNA-based register, UK DonorLink. It highlights additional complexities faced when using DNA rather than paper records for searching, in particular the risk of false positives, low chances of success, and the potential inclusion of biological parents' DNA. Professionals' experiences in supporting those being "linked" suggest challenges as well as rewards. Registration carries the potential to be therapeutic for donor-conceived adults and donors and to enhance their political awareness, regardless of whether links are made. Registrants value both peer and professional support, provided the latter can respond flexibly and is delivered by staff experienced in intermediary work. Given that the majority of those affected by donor conception internationally come from anonymous donation systems, these findings are highly pertinent and argue the need for political and moral debate about such service provision.

  5. Computer models of dipole magnets of a series 'VULCAN' for the ALICE experiment

    International Nuclear Information System (INIS)

    Vodop'yanov, A.S.; Shishov, Yu.A.; Yuldasheva, M.B.; Yuldashev, O.I.

    1998-01-01

    The paper is devoted to the construction of computer models for three magnets of the 'VULCAN' series in the framework of a differential approach with two scalar potentials. The distinctive property of these magnets is that they are 'warm' and their coils have a conic saddle shape. An algorithm for creating a computer model of the coils is suggested. The coil field is computed by the Biot-Savart law, and part of the integrals is calculated with the help of analytical formulas. To compute three-dimensional magnetic fields by the finite element method with local accuracy control, two new algorithms are suggested. The former is based on a comparison of the fields computed by means of linear and quadratic shape functions. The latter is based on a comparison of the field computed with the help of linear shape functions and a local classical solution. The distributions of the local accuracy control characteristics within the working part of the third magnet and the other results of the computations are presented.
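
    Numerically, computing the coil field by the Biot-Savart law means summing the contributions of short straight current segments along the winding. A minimal sketch of that discretized sum is given below; the circular demonstration loop and the 1000 A current are assumptions for illustration and bear no relation to the actual conic saddle-shaped 'VULCAN' coils.

        # Discretized Biot-Savart sum over a closed filamentary coil (illustrative only).
        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability [T*m/A]

        def biot_savart(points, current, field_point):
            """B = mu0*I/(4*pi) * sum(dl x r / |r|^3) over segments of a closed polyline."""
            b = np.zeros(3)
            for a, c in zip(points, np.roll(points, -1, axis=0)):
                dl = c - a                          # segment vector
                r = field_point - 0.5 * (a + c)     # from segment midpoint to field point
                b += np.cross(dl, r) / np.linalg.norm(r) ** 3
            return MU0 * current / (4 * np.pi) * b

        # Demonstration: 1 m radius loop, 1000 A, field at the centre (~6.3e-4 T along z).
        theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
        loop = np.column_stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)])
        print(biot_savart(loop, 1000.0, np.array([0.0, 0.0, 0.0])))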

  6. Dual-energy computed tomographic virtual noncalcium algorithm for detection of bone marrow edema in acute fractures: early experiences.

    Science.gov (United States)

    Reagan, Adrian C; Mallinson, Paul I; O'Connell, Timothy; McLaughlin, Patrick D; Krauss, Bernhard; Munk, Peter L; Nicolaou, Savvas; Ouellette, Hugue A

    2014-01-01

    Computed tomography (CT) is often used to assess the presence of occult fractures when plain radiographs are equivocal in the acute traumatic setting. While providing increased spatial resolution, conventional computed tomography is limited in the assessment of bone marrow edema, a finding that is readily detectable on magnetic resonance imaging (MRI). Dual-energy CT has recently been shown to demonstrate patterns of bone marrow edema similar to corresponding MRI studies. Dual-energy CT may therefore provide a convenient modality for further characterizing acute bony injury when MRI is not readily available. We report our initial experience with 4 cases, with imaging and clinical correlation.

  7. ATLAS distributed computing operation shift teams experience during the discovery year and beginning of the long shutdown 1

    International Nuclear Information System (INIS)

    Sedov, Alexey; Girolamo, Alessandro Di; Negri, Guidone; Sakamoto, Hiroshi; Schovancová, Jaroslava; Smirnov, Iouri; Vartapetian, Armen; Yu, Jaehoon

    2014-01-01

    ATLAS Distributed Computing Operation Shifts evolve to meet new requirements. New monitoring tools as well as operational changes have led to modifications in the organization of shifts. In this paper we describe the structure of the shifts, the roles of different shifts in ATLAS computing grid operation, the influence of the Higgs-like particle discovery on shift operation, the achievements in monitoring and automation that allowed extra focus on the experiment's priority tasks, and the influence of Long Shutdown 1 and operational changes related to the no-beam period.

  8. Tensor Arithmetic, Geometric and Mathematic Principles of Fluid Mechanics in Implementation of Direct Computational Experiments

    Directory of Open Access Journals (Sweden)

    Bogdanov Alexander

    2016-01-01

    The architecture of a digital computing system determines the technical foundation of a unified mathematical language for the exact arithmetic-logical description of phenomena and laws of continuum mechanics, with applications in fluid mechanics and theoretical physics. Deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.

  9. EXPERIENCE OF USING CLOUD COMPUTING IN NETWORK PRODUCTS FOR SCHOOL EDUCATION

    Directory of Open Access Journals (Sweden)

    L. Sokolova

    2011-05-01

    We study data on the use of websites in the middle and secondary school grades and their influence on the formation of students' information culture and level of training. The sites use Google's "cloud computing" technology, are accessible from any internet-connected computer, and do not require the resources of the computer itself. The sites are free of advertising and do not require periodic backup, protection or the general attention of a system administrator. This simplifies their use in the educational process for schools of different levels. A statistical analysis of site use was carried out, and the main trends in their use were identified.

  10. Experience in programming Assembly language of CDC CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Caldeira, A.D.

    1987-10-01

    Aiming to optimize the processing time of the BCG computer code on the CDC CYBER 170/750 computer, the INTERP subroutine was converted from FORTRAN-V to Assembly language. The BCG code was developed for solving the neutron transport equation by an iterative method, and the INTERP subroutine is the innermost loop of the code, carrying out 5 types of interpolation. The central processor unit Assembly language of the CDC CYBER 170/750 computer and its application in implementing the interpolation subroutine of the BCG code are described. (M.C.K.)

  11. Stereotactic biopsy aided by a computer graphics workstation: experience with 200 consecutive cases.

    Science.gov (United States)

    Ulm, A J; Bova, F J; Friedman, W A

    2001-12-01

    The advent of modern computer technology has made it possible to examine not just the target point, but the entire trajectory in planning for stereotactic biopsies. Two hundred consecutive biopsies were performed by one surgeon, utilizing a computer graphics workstation. The target point, entry point, and complete trajectory were carefully scrutinized and adjusted to minimize potential complications. Pathologically abnormal tissue was obtained in 197 cases (98.5%). There was no mortality in this series. Symptomatic hemorrhages occurred in 4 cases (2%). Computer graphics workstations facilitate safe and effective biopsies in virtually any brain area.
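
    The trajectory check that such planning relies on is, at its simplest, geometric: sample the straight line from the entry point to the target in image coordinates and reject the path if any sample falls inside a segmented critical structure. The sketch below illustrates only that idea; the voxel mask and coordinates are assumptions, and this is not the workstation software used in the series.

        # Generic straight-line trajectory check against a voxel mask of critical structures.
        import numpy as np

        def trajectory_is_safe(entry, target, critical_mask, n_samples=200):
            """Sample the segment entry->target (voxel coordinates); False if it crosses the mask."""
            entry, target = np.asarray(entry, float), np.asarray(target, float)
            for t in np.linspace(0.0, 1.0, n_samples):
                i, j, k = np.round(entry + t * (target - entry)).astype(int)
                if critical_mask[i, j, k]:
                    return False
            return True

        # Toy example: a 64^3 volume with one 'critical' block in the middle.
        mask = np.zeros((64, 64, 64), dtype=bool)
        mask[30:34, 30:34, 30:34] = True
        print(trajectory_is_safe((5, 5, 5), (60, 60, 60), mask))    # crosses the block -> False
        print(trajectory_is_safe((5, 5, 60), (60, 5, 60), mask))    # avoids it -> True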

  12. Learning from Experience: Creating Leadership Capabilities through Computer Simulated Leadership Challenges

    Science.gov (United States)

    Stewart, Alice C.; Black, Sylvia Sloan; Smith-Gratto, Karen; Williams, Jacqueline A.

    2007-01-01

    Leadership is often described as something that is learned from experience. However, experiences do not often occur within a controlled environment where learning and its impact can be evaluated. In this paper, we investigate the efficacy of two types of learning experiences. University students received leadership training of equal length through…

  13. Laminar Boundary-Layer Instabilities on Hypersonic Cones: Computations for Benchmark Experiments

    National Research Council Canada - National Science Library

    Robarge, Tyler W; Schneider, Steven P

    2005-01-01

    .... The STABL code package and its PSE-Chem stability solver are used to compute first and second mode instabilities for both sharp and blunt cones at wind tunnel conditions, with laminar mean flows...

  14. Integration of computer technology into the medical curriculum: the King's experience

    Directory of Open Access Journals (Sweden)

    Vickie Aitken

    1997-12-01

    Recently, there have been major changes in the requirements of medical education which have set the scene for the revision of medical curricula (Towle, 1991; GMC, 1993). As part of the new curriculum at King's, the opportunity has been taken to integrate computer technology into the course through Computer-Assisted Learning (CAL), and to train graduates in core IT skills. Although the use of computers in the medical curriculum has up to now been limited, recent studies have shown encouraging steps forward (see Boelen, 1995). One area where there has been particular interest is the use of notebook computers to allow students increased access to IT facilities (Maulitz et al, 1996).

  15. Assessment of medical communication skills by computer: assessment method and student experiences

    NARCIS (Netherlands)

    Hulsman, R. L.; Mollema, E. D.; Hoos, A. M.; de Haes, J. C. J. M.; Donnison-Speijer, J. D.

    2004-01-01

    BACKGROUND A computer-assisted assessment (CAA) program for communication skills designated ACT was developed using the objective structured video examination (OSVE) format. This method features assessment of cognitive scripts underlying communication behaviour, a broad range of communication

  16. New Chicago-Indiana computer network will handle dataflow from world's largest scientific experiment

    CERN Multimedia

    2006-01-01

    "Massive quantities of data will soon begin flowing from the largest scientific instrument ever built into an international netword of computer centers, including one operated jointly by the University of Chicago and Indiana University." (1,5 page)

  17. Foreign Experience in the Use of Computer Games in Teaching Children

    Directory of Open Access Journals (Sweden)

    Grigoryev I.S.,

    2017-01-01

    Computer games, as one of the most interesting phenomena related to computerization, are the subject of many foreign and domestic psychological studies. The article presents the characteristics of the following international directions of research on computer (video) games: first, the use of computer games in education; second, the influence of computer games on the cognitive domain of children, as well as on the formation of different skills. Such studies, however, do not consider computer games as an object in themselves, and stop only at specific areas of attention or perception. We discuss the question of a common conceptual and methodological basis for the construction of research that would allow the individual studies in this area to be classified and interpreted. The article lists the various (both positive and negative) effects of computer games on the mental development of the player, and their significant developmental and educational potential.

  18. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    Energy Technology Data Exchange (ETDEWEB)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens [A. C. Camargo Cancer Center, Sao Paulo, SP (Brazil); Universidade Federal do Espirito Santo (HUCAM/UFES), Vitoria, ES (Brazil). Hospital Universitario Cassiano Antonio de Morais. Radiologia e Diagnostico por Imagem

    2017-03-15

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  19. An Analysis of Creative Process Learning in Computer Game Activities Through Player Experiences

    OpenAIRE

    Wilawan Inchamnan

    2016-01-01

    This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning outcomes is described. Creative components were measured by examining task motivation and domain-relevant and creativity-relevant skill factors. The r...

  20. Optical conoscopy of distorted uniaxial liquid crystals: computer simulation and experiment

    OpenAIRE

    Yu.A.Nastishin; O.B.Dovgyi; O.G.Vlokh

    2001-01-01

    We propose an algorithm to compute the conoscopic pattern for distorted uniaxial liquid crystal cells. The computed conoscopic figures for several cells (homeotropic, planar, twist, hybrid, hybrid under an external field) are compared to the corresponding experimental conoscopic patterns. We demonstrate that conoscopy can be used for the characterization of distorted nematic cells with director deformations which cannot be detected and unambiguously characterized by direct microscopy ...

  1. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    International Nuclear Information System (INIS)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens; Universidade Federal do Espirito Santo

    2017-01-01

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  2. Computer systems experiences of users with and without disabilities an evaluation guide for professionals

    CERN Document Server

    Borsci, Simone; Federici, Stefano; Mele, Maria Laura

    2013-01-01

    This book provides the necessary tools for the evaluation of the interaction between the user who is disabled and the computer system that was designed to assist that person. The book creates an evaluation process that is able to assess the user's satisfaction with a developed system. Presenting a new theoretical perspective in the human computer interaction evaluation of disabled persons, it takes into account all of the individuals involved in the evaluation process.

  3. More than 2 years' experience with computer-aided irradiation planning in clinical routine

    International Nuclear Information System (INIS)

    Heller, H.; Rathje, J.

    1976-01-01

    This is a report on an irradiation planning system which has been used for about 2 years in the department of radiotherapy of the general hospital in Altona. The hardware and software, as well as the mathematical model for the description of the dose distribution, are described. The compromise between the required accuracy of the irradiation plan and the investment in computer-technical effort and computer time is discussed. (orig./LN)

  4. Foreign Experience in the Use of Computer Games in Teaching Children

    OpenAIRE

    Grigoryev I.S.,

    2017-01-01

    Computer games, as one of the most interesting phenomena related to computerization, are the subject of many foreign and domestic psychological studies. The article presents the characteristics of the following international directions of research on computer (video) games: first, the use of computer games in education; second, the influence of computer games on the cognitive domain of children, as well as on the formation of different skills. Such studies, however, do not consider com...

  5. Evaluation of Real-World Experience with Tofacitinib Compared with Adalimumab, Etanercept, and Abatacept in RA Patients with 1 Previous Biologic DMARD: Data from a U.S. Administrative Claims Database.

    Science.gov (United States)

    Harnett, James; Gerber, Robert; Gruben, David; Koenig, Andrew S; Chen, Connie

    2016-12-01

    Real-world data comparing tofacitinib with biologic disease-modifying antirheumatic drugs (bDMARDs) are limited. To compare characteristics, treatment patterns, and costs of patients with rheumatoid arthritis (RA) receiving tofacitinib versus the most common bDMARDs (adalimumab [ADA], etanercept [ETN], and abatacept [ABA]) following a single bDMARD in a U.S. administrative claims database. This study was a retrospective cohort analysis of patients aged ≥ 18 years with an RA diagnosis (ICD-9-CM codes 714.0x-714.4x; 714.81) and 1 previous bDMARD filling ≥ 1 tofacitinib or bDMARD claim in the Truven MarketScan Commercial and Medicare Supplemental claims databases (November 1, 2012-October 31, 2014). Monotherapy was defined as absence of conventional synthetic DMARDs within 90 days post-index. Persistence was evaluated using a 60-day gap. Adherence was assessed using proportion of days covered (PDC). RA-related total, pharmacy, and medical costs were evaluated in the 12-month pre- and post-index periods. Treatment patterns and costs were adjusted using linear models including a common set of clinically relevant variables of interest (e.g., previous RA treatments), which were assessed separately using t-tests and chi-squared tests. Overall, 392 patients initiated tofacitinib; 178 patients initiated ADA; 118 patients initiated ETN; and 191 patients initiated ABA. Tofacitinib patients were older versus ADA patients (P = 0.0153) and had a lower proportion of Medicare supplemental patients versus ABA patients (P = 0.0095). Twelve-month pre-index bDMARD use was greater in tofacitinib patients (77.6%) versus bDMARD cohorts (47.6%-59.6%). Tofacitinib patients had greater 12-month pre-index RA-related total costs versus bDMARD cohorts (all P 0.10) proportion of patients were persistent with tofacitinib (42.6%) versus ADA (37.6%), ETN (42.4%), and ABA (43.5%). Mean PDC was 0.55 for tofacitinib versus 0.57 (ADA), 0.59 (ETN), and 0.44 (ABA; P = 0.0003). Adjusted analyses
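
    The two utilization measures used here, persistence with a 60-day gap and adherence as proportion of days covered (PDC), can both be derived from fill dates and days supplied. The sketch below shows one generic way to compute them over a fixed follow-up window; the example fills, the 360-day window and the simple gap rule are assumptions for illustration, not the study's algorithm or data.

        # Generic PDC and gap-based persistence from pharmacy fills (illustrative only).
        from datetime import date, timedelta

        def pdc_and_persistence(fills, follow_up_days=360, allowed_gap=60):
            """fills: list of (fill_date, days_supply); returns (PDC, persistent?)."""
            start = fills[0][0]
            end = start + timedelta(days=follow_up_days)
            covered = set()
            for fill_date, supply in fills:
                for d in range(supply):
                    day = fill_date + timedelta(days=d)
                    if start <= day < end:
                        covered.add(day)
            pdc = len(covered) / follow_up_days
            persistent, previous = True, start
            for day in sorted(covered) + [end]:     # discontinuation = gap > allowed_gap days
                if (day - previous).days > allowed_gap:
                    persistent = False
                    break
                previous = day
            return pdc, persistent

        fills = [(date(2013, 1, 1), 30), (date(2013, 2, 5), 30), (date(2013, 5, 1), 90)]
        print(pdc_and_persistence(fills))           # e.g. (0.42, False) for this toy history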

  6. Experience gained in using a computer-aided teaching system in Azov maritime institute

    Directory of Open Access Journals (Sweden)

    Олександр Миколайович Зиновченко

    2017-06-01

    A brief analysis of known computer-based teaching methods is given. The computer-aided teaching system includes an interactive lecture, laboratory works, an application for online testing and evaluation of the assimilation of new knowledge, and software used by the teacher. The virtual lecture presents information as sound-tracked dynamic pictures accompanied by continuous practical work that fixes the acquired knowledge in the student's mind. Each teaching step in the virtual lecture is followed by practical work evaluated by the computer. Virtual labs make it possible to consolidate the new knowledge through practice; they provide for the individual activity of the student, monitor his progress and automatically evaluate his knowledge. These applications are installed on the student's computer. The teacher's computer applications include a generator of tests for evaluating the assimilation of new knowledge, a base of typical problems, a generator of personal information files for each student, and an application that forms the student's final mark. The results of testing this teaching system show that it is efficient, making it possible to organize a flexible schedule of the educational process and to reduce the teacher's working hours.

  7. Modeling Warm Dense Matter Experiments using the 3D ALE-AMR Code and the Move Toward Exascale Computing

    International Nuclear Information System (INIS)

    Koniges, A.; Eder, D.; Liu, W.; Barnard, J.; Friedman, A.; Logan, G.; Fisher, A.; Masters, N.; Bertozzi, A.

    2011-01-01

    The Neutralized Drift Compression Experiment II (NDCX II) is an induction accelerator planned for initial commissioning in 2012. The final design calls for a 3 MeV, Li+ ion beam, delivered in a bunch with characteristic pulse duration of 1 ns, and transverse dimension of order 1 mm. The NDCX II will be used in studies of material in the warm dense matter (WDM) regime, and ion beam/hydrodynamic coupling experiments relevant to heavy ion based inertial fusion energy. We discuss recent efforts to adapt the 3D ALE-AMR code to model WDM experiments on NDCX II. The code, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR), has physics models that include ion deposition, radiation hydrodynamics, thermal diffusion, anisotropic material strength with material time history, and advanced models for fragmentation. Experiments at NDCX-II will explore the process of bubble and droplet formation (two-phase expansion) of superheated metal solids using ion beams. Experiments at higher temperatures will explore equation of state and heavy ion fusion beam-to-target energy coupling efficiency. Ion beams allow precise control of local beam energy deposition providing uniform volumetric heating on a timescale shorter than that of hydrodynamic expansion. The ALE-AMR code does not have any export control restrictions and is currently running at the National Energy Research Scientific Computing Center (NERSC) at LBNL and has been shown to scale well to thousands of CPUs. New surface tension models are being implemented and applied to WDM experiments. Some of the approaches use a diffuse interface surface tension model based on the advective Cahn-Hilliard equations, which allows for droplet breakup in divergent velocity fields without the need for imposed perturbations. Other approaches require seeding or other mechanisms for droplet breakup. We also briefly discuss the effects of the move to exascale computing and related
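
    The diffuse-interface surface tension approach mentioned above replaces a sharp interface with a smoothly varying order parameter governed by a Cahn-Hilliard equation, dc/dt = Laplacian(c^3 - c - gamma*Laplacian(c)). The sketch below is a minimal 1D explicit finite-difference version of that equation, included only to show the structure of the model; the grid, parameters and initial condition are assumptions and are unrelated to the ALE-AMR implementation.

        # Minimal 1D Cahn-Hilliard sketch on a periodic grid (illustrative parameters).
        import numpy as np

        n, dx, gamma, dt, steps = 256, 1.0, 1.0, 0.01, 20000

        def laplacian(f):
            return (np.roll(f, -1) - 2.0 * f + np.roll(f, 1)) / dx ** 2

        c = 0.01 * np.random.default_rng(0).standard_normal(n)  # small perturbation around c = 0
        for _ in range(steps):
            mu = c ** 3 - c - gamma * laplacian(c)   # chemical potential
            c += dt * laplacian(mu)                  # dc/dt = Laplacian(mu)

        print("fraction of each phase:", (c > 0).mean(), (c < 0).mean())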

  8. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    International Nuclear Information System (INIS)

    Virtanen, E.; Haapalehto, T.; Kouhia, J.

    1997-01-01

    Three experiments were conducted to study the behaviour of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modeled and the rest of the facility was given as a boundary condition. The results show that both codes calculate well the behaviour of the primary side of the steam generator. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than was measured in the experiments. (orig.)

  9. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.; Haapalehto, T. [Lappeenranta Univ. of Technology, Lappeenranta (Finland); Kouhia, J. [VTT Energy, Nuclear Energy, Lappeenranta (Finland)

    1995-09-01

    Three experiments were conducted to study the behavior of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modelled and the rest of the facility was given as a boundary condition. The results show that both codes calculate well the behaviour of the primary side of the steam generator. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than was measured in the experiments.

  10. Reconstruction and identification of electrons in the Atlas experiment. Setup of a Tier 2 of the computing grid

    International Nuclear Information System (INIS)

    Derue, F.

    2008-03-01

    The origin of the mass of elementary particles is linked to the electroweak symmetry breaking mechanism. Its study will be one of the main efforts of the Atlas experiment at the Large Hadron Collider at CERN, starting in 2008. In most cases, studies will be limited by our knowledge of the detector performance, such as the precision of the energy reconstruction or the efficiency of particle identification. This manuscript presents work dedicated to the reconstruction of electrons in the Atlas experiment, using simulated data and data taken during the combined test beam of 2004. The analysis of Atlas data requires a huge amount of computing and storage resources, which led to the development of a worldwide computing grid. (author)

  11. Modeling warm dense matter experiments using the 3D ALE-AMR code and the move toward exascale computing

    International Nuclear Information System (INIS)

    Koniges, A.; Liu, W.; Barnard, J.; Friedman, A.; Logan, G.; Eder, D.; Fisher, A.; Masters, N.; Bertozzi, A.

    2013-01-01

    The Neutralized Drift Compression Experiment II (NDCX II) is an induction accelerator planned for initial commissioning in 2012. The final design calls for a 3 MeV, Li+ ion beam, delivered in a bunch with characteristic pulse duration of 1 ns, and transverse dimension of order 1 mm. The NDCX II will be used in studies of material in the warm dense matter (WDM) regime, and ion beam/hydrodynamic coupling experiments relevant to heavy ion based inertial fusion energy. We discuss recent efforts to adapt the 3D ALE-AMR code to model WDM experiments on NDCX II. The code, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR), has physics models that include ion deposition, radiation hydrodynamics, thermal diffusion, anisotropic material strength with material time history, and advanced models for fragmentation. Experiments at NDCX-II will explore the process of bubble and droplet formation (two-phase expansion) of superheated metal solids using ion beams. Experiments at higher temperatures will explore equation of state and heavy ion fusion beam-to-target energy coupling efficiency. Ion beams allow precise control of local beam energy deposition providing uniform volumetric heating on a timescale shorter than that of hydrodynamic expansion. We also briefly discuss the effects of the move to exascale computing and related computational changes on general modeling codes in fusion. (authors)

  12. Production of proteinase A by Saccharomyces cerevisiae in a cell-recycling fermentation system: Experiments and computer simulations

    DEFF Research Database (Denmark)

    Grøn, S.; Biedermann, K.; Emborg, Claus

    1996-01-01

    experimentally and by computer simulations. Experiments and simulations showed that cell mass and product concentration were enhanced by high ratios of recycling. Additional simulations showed that the proteinase A concentration decreased drastically at high dilution rates and the optimal volumetric...... productivities were at high dilution rates just below washout and at high ratios of recycling. Cell-recycling fermentation gave much higher volumetric productivities and stable product concentrations in contrast to simple continuous fermentation....
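
    A common way to set up such simulations is a chemostat mass balance with a cell-recycle term: with dilution rate D and recycle ratio R, the effective biomass removal rate drops to (1-R)*D, so washout shifts to higher dilution rates and biomass and product concentrations rise with R. The sketch below integrates a minimal Monod-type model of that form; the kinetic parameters and the simple growth-associated product term are generic assumptions, not the proteinase A kinetics identified in the study.

        # Minimal continuous fermentation model with cell recycling (illustrative parameters).
        mu_max, Ks, Y, q_p = 0.4, 0.5, 0.5, 0.05   # 1/h, g/L, gX/gS, gP/(gX*h)
        S_in = 20.0                                 # feed substrate concentration, g/L

        def simulate(D, R, t_end=200.0, dt=0.01):
            """Explicit Euler integration of biomass X, substrate S, product P."""
            X, S, P = 0.1, S_in, 0.0
            for _ in range(int(t_end / dt)):
                mu = mu_max * S / (Ks + S)                  # Monod growth rate
                dX = (mu - (1.0 - R) * D) * X               # recycle retains a fraction R of cells
                dS = D * (S_in - S) - mu * X / Y
                dP = q_p * X - D * P
                X, S, P = max(X + dt * dX, 0.0), max(S + dt * dS, 0.0), P + dt * dP
            return X, S, P

        for R in (0.0, 0.5, 0.8):
            X, S, P = simulate(D=0.3, R=R)
            print(f"R={R:.1f}: X={X:.1f} g/L, S={S:.2f} g/L, P={P:.1f} g/L")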

  13. ISLAM PROJECT: Interface between the signals from various experiments of a Van Graaff accelerator and PDP 11/44 computer

    International Nuclear Information System (INIS)

    Martinez Piquer, T. A.; Yuste Santos, C.

    1986-01-01

    This paper describes an interface between the signals from an in-beam experiment at a Van de Graaff accelerator and a PDP 11/44 computer. The information corresponding to one spectrum is taken from a digital voltammeter and is processed by means of equipment controlled by an M6809 microprocessor. The software package has been developed in assembly language and has a size of 1/2 K. (Author) 12 refs

  14. Films, Affective Computing and Aesthetic Experience: Identifying Emotional and Aesthetic Highlights from Multimodal Signals in a Social Setting

    OpenAIRE

    Kostoulas, Theodoros; Chanel, Guillaume; Muszynski, Michal; Lombardo, Patrizia; Pun, Thierry

    2017-01-01

    Over the last years, affective computing has been strengthening its ties with the humanities, exploring and building understanding of people’s responses to specific artistic multimedia stimuli. “Aesthetic experience” is acknowledged to be the subjective part of some artistic exposure, namely, the inner affective state of a person exposed to some artistic object. In this work, we describe ongoing research activities for studying the aesthetic experience of people when exposed to movie artistic...

  15. A versatile data handling system for nuclear physics experiments based on PDP 11/03 micro-computers

    International Nuclear Information System (INIS)

    Raaf, A.J. de

    1979-01-01

    A reliable and low cost data handling system for nuclear physics experiments is described. It is based on two PDP 11/03 micro-computers together with Gec-Elliott CAMAC equipment. For the acquisition of the experimental data a fast system has been designed. It consists of a controller for four ADCs together with an intelligent 38k MOS memory with a word size of 24 bits. (Auth.)

  16. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Science.gov (United States)

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  17. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  18. Integrating psychoeducation in a basic computer skills course for people suffering from social anxiety: participants' experiences

    Directory of Open Access Journals (Sweden)

    Löhr HD

    2011-08-01

    We describe a psychoeducational program integrated in a basic computer skills course for participants suffering from social anxiety. The two main aims of the course were: that the participants learn basic computer skills, and that the participants learn to cope better with social anxiety. Computer skills were taught by a qualified teacher. Psychoeducation and cognitive therapy skills, including topics such as anxiety coping, self-acceptance, and self-regulation, were taught by a clinical psychologist. Thirteen of 16 participants completed the course, which lasted 11 weeks. A qualitative analysis was performed, drawing on observations during the course and on interviews with the participants. The participants were positive about the integration of psychoeducation sessions in the computer course, and described positive outcomes for both elements, including improved computer skills, improved self-esteem, and reduced social anxiety. Most participants were motivated to undertake further occupational rehabilitation after the course. Keywords: cognitive therapy, information technology, occupational rehabilitation, psychoeducation, self-help, social anxiety

  19. Computational methods for fracture mechanics analysis of pressurized-thermal-shock experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1984-01-01

    Extensive computational analyses are required to determine material parameters and optimum pressure-temperature transients compatible with proposed pressurized-thermal-shock (PTS) test scenarios and with the capabilities of the PTS test facility at the Oak Ridge National Laboratory (ORNL). Computational economy has led to the application of techniques suitable for parametric studies involving the analysis of a large number of transients. These techniques, which include analysis capability for two- and three-dimensional (2-D and 3-D) superposition, inelastic ligament stability, and upper-shelf arrest, have been incorporated into the OCA/USA computer program. Features of the OCA/USA program are discussed, including applications to the PTS test configuration

  20. Computational methods for fracture mechanics analysis of pressurized-thermal-shock experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1984-01-01

    Extensive computational analyses are required to determine material parameters and optimum pressure-temperature transients compatible with proposed pressurized-thermal-shock (PTS) test scenarios and with the capabilities of the PTS test facility at the Oak Ridge National Laboratory (ORNL). Computational economy has led to the application of techniques suitable for parametric studies involving the analysis of a large number of transients. These techniques, which include analysis capability for two- and three-dimensional (2-D and 3-D) superposition, inelastic ligament stability, and upper-shelf arrest, have been incorporated into the OCA/USA computer program. Features of the OCA/USA program are discussed, including applications to the PTS test configuration. (author)