WorldWideScience

Sample records for previous computational studies

  1. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean were studied: 100 had antenatal CT pelvimetry for assessment of the pelvis, while 119 did not and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section: 23 (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 (28%) underwent emergency cesarean section after trial of labor. In the group who did not have CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight or Apgar scores in either group. There was no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)

  2. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that, despite many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potential of cloud computing and its implementation on the market. The purpose of this research was to identify the individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify differences in opinion regarding the importance of business model factors for cloud computing adoption according to companies’ previous experiences with cloud computing services.

  3. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (i.e., to deliver less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise, and the noise will propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a previously acquired normal-dose CT image of high diagnostic quality may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to guide signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, the nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM algorithm to utilize the redundancy of information in the previous normal-dose scan and further explores ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of the nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties.
The gain by the use
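The prior-guided restoration idea described above can be sketched in a few lines: for each pixel of the low-dose image, similar patches are searched in the previous normal-dose image and averaged with weights that decay with patch dissimilarity. This is a minimal illustrative sketch, not the authors' optimized ndiNLM weight calculation; all names and parameter values are invented.

```python
import numpy as np

def ndi_nlm(low_dose, normal_dose, patch=3, search=5, h=0.05):
    """Toy prior-induced nonlocal means: restore `low_dose` by averaging
    pixels of `normal_dose` whose surrounding patches resemble the patch
    around the current low-dose pixel (assumes roughly aligned images)."""
    pr = patch // 2
    pad = pr + search
    pad_ld = np.pad(low_dose, pad, mode="reflect")
    pad_nd = np.pad(normal_dose, pad, mode="reflect")
    out = np.zeros_like(low_dose, dtype=float)
    H, W = low_dose.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad, j + pad
            ref = pad_ld[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            wsum, acc = 0.0, 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = pad_nd[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    d2 = np.mean((ref - cand) ** 2)  # patch dissimilarity
                    w = np.exp(-d2 / (h * h))        # exponential weight
                    wsum += w
                    acc += w * pad_nd[ni, nj]
            out[i, j] = acc / wsum
    return out
```

Because each output pixel is a convex combination of normal-dose pixels, the restored image inherits the low-noise statistics of the prior scan while the weights keep it faithful to the current low-dose structures.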

  4. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing, 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China. Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been well studied. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had a more severe (p=0.045) and longer (p=0.008) history of dyspnea, more exacerbations in the previous year (p=0.011), and more positive cultures of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, a higher severity score (p=0.028), a higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  5. Hispanic women overcoming deterrents to computer science: A phenomenological study

    Science.gov (United States)

    Herling, Lourdes

The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. Overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: the computing disciplines considered specifically rather than embedded within the STEM disciplines, what attracts women and minorities to computer science, and race/ethnicity and gender considered in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study examined whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the Twenty

  6. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristic properties of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  7. Student Engagement with Computer-Generated Feedback: A Case Study

    Science.gov (United States)

    Zhang, Zhe

    2017-01-01

    In order to benefit from feedback on their writing, students need to engage effectively with it. This article reports a case study on student engagement with computer-generated feedback, known as automated writing evaluation (AWE) feedback, in an EFL context. Differing from previous studies that explored commercially available AWE programs, this…

  8. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies, and more significantly, higher ASP values in the heads of wing bones than in the shafts. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than that found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far, supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
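The conventional diameter-based estimate mentioned above reduces, for an idealized circular bone with a single central air cavity, to the squared ratio of inner to outer diameter. The sketch below is a geometric illustration of that estimate under this assumption, not the paper's CT-based measurement:

```python
def asp_from_diameters(inner_diameter, outer_diameter):
    """Air Space Proportion for an idealized tubular cross-section:
    air area / total area = (pi/4 * d_i^2) / (pi/4 * d_o^2) = (d_i/d_o)^2.
    Assumes a circular section with one concentric air cavity."""
    if not 0 <= inner_diameter <= outer_diameter:
        raise ValueError("require 0 <= inner_diameter <= outer_diameter")
    return (inner_diameter / outer_diameter) ** 2
```

For example, a bone with an 8 mm internal and 10 mm external diameter gives ASP = 0.64; the paper's point is that such shaft-based values can understate the pneumaticity seen in the expanded bone heads.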

  9. Implementation of an electronic medical record system in previously computer-naïve primary care centres: a pilot study from Cyprus.

    Science.gov (United States)

    Samoutis, George; Soteriades, Elpidoforos S; Kounalakis, Dimitris K; Zachariadou, Theodora; Philalithis, Anastasios; Lionis, Christos

    2007-01-01

The computer-based electronic medical record (EMR) is an essential new technology in health care, contributing to high-quality patient care and efficient patient management. The majority of southern European countries, however, have not yet implemented universal EMR systems and many efforts are still ongoing. We describe the development of an EMR system and its pilot implementation and evaluation in two previously computer-naïve public primary care centres in Cyprus. One urban and one rural primary care centre along with their personnel (physicians and nurses) were selected to participate. Both qualitative and quantitative evaluation tools were used during the implementation phase. Qualitative data analysis was based on the framework approach, whereas quantitative assessment was based on a nine-item questionnaire and EMR usage parameters. Two public primary care centres participated, and a total of ten health professionals served as EMR system evaluators. Physicians and nurses rated the EMR relatively highly, while patients were the most enthusiastic supporters of the new information system. Major implementation impediments were the physicians' perceptions that EMR usage negatively affected their workflow, physicians' legal concerns, lack of incentives, system breakdowns, software design problems, transition difficulties and lack of familiarity with electronic equipment. The importance of combining qualitative and quantitative evaluation tools is highlighted. More efforts are needed for the universal adoption and routine use of EMR in the primary care system of Cyprus, as several barriers to adoption exist; however, none is insurmountable. Computerised systems could improve efficiency and quality of care in Cyprus, benefiting the entire population.

  10. Algebraic computing program for studying the gauge theory

    International Nuclear Information System (INIS)

    Zet, G.

    2005-01-01

An algebraic computing program running on the Maple V platform is presented. The program is devoted to the study of gauge theory with an internal Lie group as local symmetry. The physical quantities (gauge potentials, strength tensors, dual tensors, etc.) are introduced either as equations in terms of previously defined quantities (tensors) or by manual entry of the component values. The components of the strength tensor and of its dual are obtained with respect to a given metric of the space-time used for describing the gauge theory. We choose a Minkowski space-time endowed with spherical symmetry and give some examples of algebraic computing that are adequate for studying electroweak or gravitational interactions. The field equations are also obtained and their solutions are determined using the DEtools facilities of the Maple V computing program. (author)
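The kind of computation the abstract describes, deriving strength-tensor components from a given gauge potential, can be sketched symbolically in Python with SymPy. The Coulomb-like abelian (U(1)) potential below is an assumed toy example for illustration, not the program's non-abelian setup, which would add a commutator term to the strength tensor:

```python
import sympy as sp

# Spherical space-time coordinates and an assumed abelian gauge potential A_mu.
t, r, theta, phi = sp.symbols("t r theta phi", positive=True)
x = (t, r, theta, phi)
Q = sp.Symbol("Q", positive=True)
A = [Q / r, 0, 0, 0]  # Coulomb-like potential: only A_t is nonzero

# Abelian strength tensor F_mn = d_m A_n - d_n A_m (no commutator term)
F = sp.Matrix(4, 4, lambda m, n: sp.diff(A[n], x[m]) - sp.diff(A[m], x[n]))
```

Here `F[1, 0]` evaluates to `-Q/r**2`, the expected radial electric-field component, and the matrix is antisymmetric by construction; a non-abelian version would add `[A_m, A_n]` with matrix-valued potentials.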

  11. A Computer Game-Based Method for Studying Bullying and Cyberbullying

    Science.gov (United States)

    Mancilla-Caceres, Juan F.; Espelage, Dorothy; Amir, Eyal

    2015-01-01

    Even though previous studies have addressed the relation between face-to-face bullying and cyberbullying, none have studied both phenomena simultaneously. In this article, we present a computer game-based method to study both types of peer aggression among youth. Study participants included fifth graders (N = 93) in two U.S. Midwestern middle…

  12. A Reflective Study into Children's Cognition When Making Computer Games

    Science.gov (United States)

    Allsop, Yasemin

    2016-01-01

    In this paper, children's mental activities when making digital games are explored. Where previous studies have mainly focused on children's learning, this study aimed to unfold the children's thinking process for learning when making computer games. As part of an ongoing larger scale study, which adopts an ethnographic approach, this research…

  13. A randomised clinical trial of intrapartum fetal monitoring with computer analysis and alerts versus previously available monitoring

    Directory of Open Access Journals (Sweden)

    Santos Cristina

    2010-10-01

Abstract. Background: Intrapartum fetal hypoxia remains an important cause of death and permanent handicap, and in a significant proportion of cases there is evidence of suboptimal care related to fetal surveillance. Cardiotocographic (CTG) monitoring remains the basis of intrapartum surveillance, but its interpretation by healthcare professionals lacks reproducibility, and the technology has not been shown to improve clinically important outcomes. The addition of fetal electrocardiogram analysis has increased the potential to avoid adverse outcomes, but CTG interpretation remains its main weakness. A program for computerised analysis of intrapartum fetal signals, incorporating real-time alerts for healthcare professionals, has recently been developed. There is a need to determine whether this technology can result in better perinatal outcomes. Methods/design: This is a multicentre randomised clinical trial. Inclusion criteria are: women aged ≥16 years, able to provide written informed consent, singleton pregnancies ≥36 weeks, cephalic presentation, no known major fetal malformations, in labour but excluding active second stage, planned for continuous CTG monitoring, and no known contra-indication for vaginal delivery. Eligible women will be randomised using a computer-generated randomisation sequence to one of the two arms: continuous computer analysis of fetal monitoring signals with real-time alerts (intervention arm) or continuous CTG monitoring as previously performed (control arm). Electrocardiographic monitoring and fetal scalp blood sampling will be available in both arms. The primary outcome measure is the incidence of fetal metabolic acidosis (umbilical artery pH and BDecf > 12 mmol/L). Secondary outcome measures are: caesarean section and instrumental vaginal delivery rates, use of fetal blood sampling, and 5-minute Apgar score. Discussion: This study will provide evidence of the impact of intrapartum monitoring with computer analysis and real

  14. Computed tomography study of otitis media

    International Nuclear Information System (INIS)

    Bahia, Paulo Roberto Valle; Marchiori, Edson

    1997-01-01

The computed tomography (CT) findings of 89 patients clinically suspected of having otitis media were studied in this work. The results were compared with the clinical diagnosis, otoscopy, surgical findings, and previous data. In our analysis we studied seven patients with acute otitis media and 83 patients with chronic otitis media. The patients with acute otitis media underwent CT examinations to evaluate possible spread to the central nervous system. The diagnosis of cholesteatoma, its extension, and its complications were the main indications for CT study in chronic otitis media. The main findings of cholesteatomatous otitis were occupation of the epitympanum, bony wall destruction, and ossicular chain erosion. CT demonstrated high sensitivity in diagnosing cholesteatoma. (author)

  15. 40 CFR 152.93 - Citation of a previously submitted valid study.

    Science.gov (United States)

    2010-07-01

    ... Data Submitters' Rights § 152.93 Citation of a previously submitted valid study. An applicant may demonstrate compliance for a data requirement by citing a valid study previously submitted to the Agency. The... the original data submitter, the applicant may cite the study only in accordance with paragraphs (b...

  16. Computed tomographic study of hormone-secreting microadenomas

    International Nuclear Information System (INIS)

    Hemminghytt, S.; Kalkhoff, R.K.; Daniels, D.L.; Williams, A.L.; Grogan, J.P.; Haughton, V.M.

    1983-01-01

A review was made of the computed tomographic (CT) studies of 33 patients with hormone-secreting microadenomas that had been verified by transsphenoidal surgery and endocrinologic evaluation. In previous studies of small series of patients, the CT appearance of pituitary microadenomas has been reported as hypodense, isodense, and hyperdense. In this study, CT showed a region of diminished enhancement and usually an enlarged pituitary gland in cases of prolactin-secreting adenomas. HGH- or ACTH-secreting adenomas were less consistently hypodense. It is concluded that hypodensity and enlargement of the pituitary gland are the most useful criteria for identification of microadenomas. Some technical factors that may affect the CT appearance of microadenomas and lead to conflicting reports are discussed

  17. Study on GPU Computing for SCOPE2 with CUDA

    International Nuclear Information System (INIS)

    Kodama, Yasuhiro; Tatsumi, Masahiro; Ohoka, Yasunori

    2011-01-01

For improving the safety and cost effectiveness of nuclear power plants, the core calculation code SCOPE2 has been developed; it adopts detailed calculation models, such as the multi-group nodal SP3 transport method in three-dimensional pin-by-pin geometry, to achieve high predictability. However, it is difficult to apply the code to loading-pattern optimizations, since it requires much longer computation time than codes based on the nodal diffusion method, which is widely used in core design calculations. In this study, we examined the possibility of accelerating SCOPE2 with GPU computing, which has been recognized as one of the most promising directions of high-performance computing. In a previous study with an experimental programming framework, converting the algorithms into forms suited to GPU computation proved tremendously difficult because of the complexity of the algorithms and restrictions in implementation. In this study, to overcome this complexity, we utilized the CUDA programming environment provided by NVIDIA, a versatile and flexible extension of the C/C++ languages. Through test implementation of GPU kernels for the neutron diffusion/simplified P3 equation solvers, it was confirmed that we could achieve high performance without degradation of maintainability. (author)

  18. Study of basic computer competence among public health nurses in Taiwan.

    Science.gov (United States)

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and to explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
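Stepwise multiple regression of the kind reported above can be illustrated with a toy forward-selection loop that adds the predictor giving the largest R² gain until the gain falls below an entry threshold. Everything here is an invented sketch: variable names, the 0.01 threshold, and the data are assumptions, not the study's analysis.

```python
import numpy as np

def forward_stepwise(X, y, names, max_vars=8):
    """Forward stepwise selection by R^2 gain using ordinary least squares.
    Returns the selected variable names and the final R^2."""
    n = len(y)
    tss = (y - y.mean()) @ (y - y.mean())  # total sum of squares
    selected, remaining, best_r2 = [], list(range(X.shape[1])), 0.0
    while remaining and len(selected) < max_vars:
        gains = []
        for j in remaining:
            A = np.column_stack([np.ones(n), X[:, selected + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            gains.append((1 - resid @ resid / tss, j))
        r2, j = max(gains)
        if r2 - best_r2 < 0.01:  # assumed entry threshold
            break
        best_r2 = r2
        selected.append(j)
        remaining.remove(j)
    return [names[j] for j in selected], best_r2
```

On synthetic data where only two of five predictors matter, the loop picks out exactly those two and stops, mirroring how the study's eight retained variables jointly account for a reported share of variance.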

  19. Experimental and computational development of a natural breast phantom for dosimetry studies

    International Nuclear Information System (INIS)

    Nogueira, Luciana B.; Campos, Tarcisio P.R.

    2013-01-01

This paper describes the experimental and computational development of a natural breast phantom, anthropomorphic and anthropometric, for dosimetry studies in breast brachytherapy and teletherapy. The natural breast phantom developed corresponds to the fibroadipose breasts of women aged 30 to 50 years, presenting medium radiographic density. The experimental breast phantom was constituted of three tissue equivalents (TEs): glandular TE, adipose TE, and skin TE. These TEs were developed according to the chemical composition of the human breast and present the corresponding radiological response to exposure. Once construction was completed, the experimental breast phantom was mounted on a thorax phantom previously developed by the NRI/UFMG research group. The computational breast phantom was then constructed by performing computed tomography (CT) in axial slices of the thorax phantom. From the CT images, a voxel model of the thorax phantom was developed with the SISCODES computational program, the computational breast phantom being represented by the same TEs as the experimental breast phantom. The CT images also allowed evaluation of the radiological equivalence of the tissues. The breast phantom is being used in experimental dosimetry studies in both brachytherapy and teletherapy of the breast. Dosimetry studies with the MCNP-5 code using the computational model of the breast phantom are in progress. (author)

  20. A study of Computing doctorates in South Africa from 1978 to 2014

    Directory of Open Access Journals (Sweden)

    Ian D Sanders

    2015-12-01

This paper studies the output of South African universities in terms of computing-related doctorates in order to determine trends in the numbers of doctorates awarded and to identify strong doctoral research areas. Data collected from a variety of sources relating to Computing doctorates conferred since the late 1970s were used to compare the situation in Computing with that of all doctorates. The number of Computing doctorates awarded has increased considerably over the period of study. Nearly three times as many doctorates were awarded in the period 2010–2014 as in 2000–2004. The universities producing the most Computing doctorates were either previously “traditional” universities or comprehensive universities formed by amalgamating a traditional research university with a technikon. Universities of technology have not yet produced many doctorates, as they do not have a strong research tradition. The analysis of topic keywords using ACM Computing classifications is preliminary, but shows that professional issues are dominant in Information Systems, that models are often built in Computer Science, and that several topics, including computing in education, are evident in both IS and CS. The relevant data are in the public domain, but access is difficult because record keeping was generally inconsistent and incomplete. In addition, electronic databases at universities are not easily searchable, and access to HEMIS data is limited. The database built for this paper is more inclusive in terms of discipline-related data than others.

  1. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

Background: Previous studies have shown that there is a complex relationship between students’ computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex relations by identifying different patterns of students’ school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from…, raising important questions about differences in contexts. Keywords: ICILS, computer use, latent class analysis (LCA), computer and information literacy.
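Latent class analysis groups respondents by their patterns of item responses. As a hedged illustration of the technique named above, here is a toy EM estimator for binary questionnaire items; ICILS analyses use dedicated LCA software with model-selection statistics, so this is only a sketch of the underlying idea.

```python
import numpy as np

def lca_em(X, K=2, iters=200, seed=0):
    """Toy latent class analysis for binary items via EM.
    Returns class priors pi (K,), item-endorsement probabilities
    theta (K, d), and per-respondent class responsibilities (n, K)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)
    theta = rng.uniform(0.3, 0.7, size=(K, d))
    for _ in range(iters):
        # E-step: responsibilities from Bernoulli log-likelihoods
        log_p = (X[:, None, :] * np.log(theta) +
                 (1 - X[:, None, :]) * np.log(1 - theta)).sum(axis=2)
        log_p += np.log(pi)
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update priors and item probabilities
        pi = resp.mean(axis=0)
        theta = (resp.T @ X) / resp.sum(axis=0)[:, None]
        theta = theta.clip(1e-6, 1 - 1e-6)
    return pi, theta, resp
```

On well-separated synthetic data (e.g. one class endorsing most items, another few), the estimator recovers the two response patterns; real LCA work adds likelihood-based comparison of different numbers of classes.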

  2. Charged-particle thermonuclear reaction rates: IV. Comparison to previous work

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.

    2010-01-01

    We compare our Monte Carlo reaction rates (see Paper II of this issue) to previous results that were obtained by using the classical method of computing thermonuclear reaction rates. For each reaction, the comparison is presented using two types of graphs: the first shows the change in reaction rate uncertainties, while the second displays our new results normalized to the previously recommended reaction rate. We find that the rates have changed significantly for almost all reactions considered here. The changes are caused by (i) our new Monte Carlo method of computing reaction rates (see Paper I of this issue), and (ii) newly available nuclear physics information (see Paper III of this issue).
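The contrast drawn above, between classical rates and Monte Carlo rates with sampled inputs, can be sketched with the standard narrow-resonance rate formula (prefactor 1.5399e11 with the reduced mass in amu and the resonance strength in MeV, and 11.605 converting MeV to GK). The resonance parameters and uncertainty assignments below are invented for illustration, not values from these papers:

```python
import numpy as np

def narrow_resonance_rate(T9, Er_MeV, wg_MeV, mu):
    """Classical narrow-resonance thermonuclear rate N_A<sigma*v>
    in cm^3 mol^-1 s^-1 at temperature T9 (GK), for resonance energy
    Er and strength omega*gamma (both MeV) and reduced mass mu (amu)."""
    return 1.5399e11 / (mu * T9) ** 1.5 * wg_MeV * np.exp(-11.605 * Er_MeV / T9)

# Monte Carlo propagation: sample the resonance energy (Gaussian) and
# strength (lognormal) and summarize the rate distribution by percentiles.
rng = np.random.default_rng(42)
n = 10_000
Er = rng.normal(0.214, 0.002, n)                  # assumed MeV
wg = np.exp(rng.normal(np.log(1.3e-8), 0.2, n))   # assumed MeV
rates = narrow_resonance_rate(T9=0.1, Er_MeV=Er, wg_MeV=wg, mu=0.9)
low, med, high = np.percentile(rates, [16, 50, 84])
```

The 16th/50th/84th percentiles play the role of the low, recommended, and high rates, so `high/low` is the uncertainty factor whose change relative to the classical treatment the comparison graphs display.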

  3. Study of functional-performance deficits in athletes with previous ankle sprains

    Directory of Open Access Journals (Sweden)

    hamid Babaee

    2008-04-01

Abstract. Background: Despite the importance of functional-performance deficits in athletes with a history of ankle sprain, few studies have been carried out in this area. The aim of this research was to study the relationship between previous ankle sprains and functional-performance deficits in athletes. Materials and methods: The subjects were 40 professional athletes selected through random sampling among volunteer participants from soccer, basketball, volleyball, and handball teams of Lorestan province. The subjects were divided into two groups: an injured group (athletes with previous ankle sprains) and a healthy group (athletes without previous ankle sprains). In this descriptive study we used functional-performance tests (the figure-8 hop test and the side hop test) to determine ankle deficits and limitations. The figure-8 hop test involved hopping around a figure-8 course 5 meters in length, and the side hop test involved 10 side-hop repetitions over a 30-centimeter course. Times were recorded with a stopwatch. Results: After data gathering and assessment of the distributions, Pearson correlation was used to assess relationships and the independent t-test to assess differences between variables. The results showed a significant relationship between previous ankle sprains and functional-performance deficits in the athletes. Conclusion: The athletes with previous ankle sprains showed greater functional-performance deficits than healthy athletes in completing the functional-performance tests. The figure-8 hop and side hop tests are sensitive and suitable for assessing and detecting functional-performance deficits in athletes, and can therefore be used for prevention, assessment, and rehabilitation of ankle sprains without spending too much money and time.

  4. Personality disorders in previously detained adolescent females: a prospective study

    NARCIS (Netherlands)

    Krabbendam, A.; Colins, O.F.; Doreleijers, T.A.H.; van der Molen, E.; Beekman, A.T.F.; Vermeiren, R.R.J.M.

    2015-01-01

    This longitudinal study investigated the predictive value of trauma and mental health problems for the development of antisocial personality disorder (ASPD) and borderline personality disorder (BPD) in previously detained women. The participants were 229 detained adolescent females who were assessed

  5. Computer games: a double-edged sword?

    Science.gov (United States)

    Sun, De-Lin; Ma, Ning; Bao, Min; Chen, Xang-Chuan; Zhang, Da-Ren

    2008-10-01

    Excessive computer game playing (ECGP) has already become a serious social problem. However, limited data from experimental lab studies are available about the negative consequences of ECGP on players' cognitive characteristics. In the present study, we compared three groups of participants (current ECGP participants, previous ECGP participants, and control participants) on a Multiple Object Tracking (MOT) task. The previous ECGP participants performed significantly better than the control participants, which suggests a facilitation effect of computer games on visuospatial abilities. More importantly, the current ECGP participants performed significantly worse than the previous ECGP participants, which indicates that ECGP may be related to cognitive deficits. Implications of this study are discussed.

  6. Comparison of Swedish and Norwegian Use of Cone-Beam Computed Tomography: a Questionnaire Study

    Directory of Open Access Journals (Sweden)

    Jerker Edén Strindberg

    2015-12-01

    Full Text Available Objectives: Cone-beam computed tomography in dentistry can, in some countries, be used by dentists other than specialists in radiology. The frequency of purchasing cone-beam computed tomography units to examine patients is growing rapidly, so knowledge of how to use the technique is very important. The aim was to compare the outcome of an investigation on the use of cone-beam computed tomography in Sweden with a previous Norwegian study, specifically regarding technical aspects. Material and Methods: The questionnaire contained 45 questions, including 35 comparable to those posed to Norwegian clinics one year previously. Results were based on an inter-comparison of the outcomes of the two questionnaire studies. Results: The response rate was 71% in Sweden. There, most cone-beam computed tomography (CBCT) examinations were performed by dental nurses, while in Norway they were performed by specialists. More than two-thirds of the CBCT units had a scout image function, regularly used in both Sweden (79%) and Norway (75%). In Sweden 4% and in Norway 41% of the respondents did not wait for the report from the radiographic specialist before initiating treatment. Conclusions: The bilateral comparison showed an overall similarity between the two countries. The survey gave explicit and important knowledge of the need for education and training of the whole team, since the radiation dose to the patient could vary considerably for the same kind of radiographic examination. It is essential to establish quality assurance protocols with defined responsibilities within the team in order to maintain high diagnostic accuracy for all examinations when using cone-beam computed tomography for patient examinations.

  7. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD

  8. New accountant job market reform by computer algorithm: an experimental study

    Directory of Open Access Journals (Sweden)

    Hirose Yoshitaka

    2017-01-01

    Full Text Available The purpose of this study is to examine the matching of new accountants with accounting firms in Japan. A notable feature of the present study is that it brings a computer algorithm to the job-hiring task. Job recruitment activities for new accountants in Japan are one-time, short-term struggles, and the rules change every year; accordingly, many have searched for new rules to replace the current ones. This study proposes modifying these job recruitment activities by combining computer and human efforts. Furthermore, the study formulates the job recruitment activities using a model and conducts experiments. As a result, the Deferred Acceptance (DA) algorithm derives a high truth-telling percentage, a high stable-matching percentage, and greater efficiency compared with the previous approach. This suggests the potential of the Deferred Acceptance algorithm as a replacement for current approaches. In terms of accuracy and stability, the DA algorithm is superior to the current methods and should be adopted.
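The applicant-proposing deferred acceptance mechanism evaluated in this record can be sketched as follows. This is a minimal generic Gale-Shapley implementation, not the authors' experimental code; the applicant/firm names and preference lists in the usage example are purely illustrative.

```python
def deferred_acceptance(applicant_prefs, firm_prefs, capacities):
    """Applicant-proposing deferred acceptance (Gale-Shapley).

    applicant_prefs: dict applicant -> ordered list of firms (most preferred first)
    firm_prefs: dict firm -> ordered list of applicants (most preferred first)
    capacities: dict firm -> number of positions
    Returns a stable matching as dict applicant -> firm.
    """
    # Rank tables let each firm compare two applicants in O(1).
    rank = {f: {a: i for i, a in enumerate(prefs)} for f, prefs in firm_prefs.items()}
    next_choice = {a: 0 for a in applicant_prefs}   # next firm index to propose to
    held = {f: [] for f in firm_prefs}              # tentatively accepted applicants
    free = list(applicant_prefs)

    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue                                # list exhausted; stays unmatched
        f = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[f]:
            free.append(a)                          # firm finds applicant unacceptable
            continue
        held[f].append(a)
        held[f].sort(key=lambda x: rank[f][x])
        if len(held[f]) > capacities[f]:
            free.append(held[f].pop())              # reject the least-preferred holder

    return {a: f for f, accepted in held.items() for a in accepted}
```

Because every rejection is deferred until a strictly preferred proposal arrives, no applicant-firm pair can both prefer each other over their final assignment, which is the stability property the study measures.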

  9. Matched cohort study of external cephalic version in women with previous cesarean delivery.

    Science.gov (United States)

    Keepanasseril, Anish; Anand, Keerthana; Soundara Raghavan, Subrahmanian

    2017-07-01

    To evaluate the efficacy and safety of external cephalic version (ECV) among women with previous cesarean delivery. A retrospective study was conducted using data for women with previous cesarean delivery and breech presentation who underwent ECV at or after 36 weeks of pregnancy during 2011-2016. For every case, two multiparous women without previous cesarean delivery who underwent ECV and were matched for age and pregnancy duration were included. Characteristics and outcomes were compared between groups. ECV was successful for 32 (84.2%) of 38 women with previous cesarean delivery and 62 (81.6%) in the control group (P=0.728). Multivariate regression analysis confirmed that previous cesarean was not associated with ECV success (odds ratio 1.89, 95% confidence interval 0.19-18.47; P=0.244). Successful vaginal delivery after successful ECV was reported for 19 (59.4%) women in the previous cesarean delivery group and 52 (83.9%) in the control group (P<0.001). No ECV-associated complications occurred in women with previous cesarean delivery. To avoid a repeat cesarean delivery, ECV can be offered to women with breech presentation and previous cesarean delivery who are otherwise eligible for a trial of labor. © 2017 International Federation of Gynecology and Obstetrics.

  10. Computed tomographic study of 50 patients with hypodense hepatic injuries in childhood

    International Nuclear Information System (INIS)

    Pereira, Ines Minniti Rodrigues; Alvares, Beatriz Regina; Baracat, Jamal; Martins, Daniel Lahan; Pereira, Ricardo Minniti Rodrigues

    2006-01-01

    Objective: To describe the different tomographic findings in hypodense hepatic lesions in children and their differential diagnosis. Materials and methods: Computed tomographic studies were obtained from 50 patients (age range: 0-16 years) with low-density liver lesions previously diagnosed by ultrasound. Images were acquired before and after administration of intravenous contrast medium. Image findings were analyzed and then correlated with the anatomopathological diagnosis. Results: Forty-seven of the 50 cases were confirmed, 30 by anatomopathological diagnosis. Most of them were benign lesions, with hemangioma in 20%. Such lesions presented homogeneous contrast absorption, mainly in the delayed phase, differing from malignant lesions. Metastasis was the most frequently found malignant lesion (18%). Conclusion: Computed tomographic study is of great value in complementing the diagnosis of hypodense hepatic lesions in children, and should follow ultrasound diagnosis as a routine procedure. (author)

  11. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Bae, Jun Ho; Park, Joo Hwan

    2010-01-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although recent progress in CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is still impossible to reflect the detailed shape of the rod bundle in a numerical computation, owing to the large computing mesh and memory capacity required. Hence, previous studies conducted numerical computations for smooth channels without considering spacers and bearing pads. However, it is well known that these components are an important factor in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to handle complex geometry such as a fuel rod bundle. Before applying the method to the problem of the 37-rod bundle, its validity and accuracy are tested by applying it to a simple geometry. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work.

  12. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...

  13. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  14. De novo adamantinomatous craniopharyngioma presenting anew in an elderly patient with previous normal CT and MRI studies: A case report and implications on pathogenesis

    Directory of Open Access Journals (Sweden)

    Amy Walker, B.S.

    2015-09-01

    Full Text Available Adamantinomatous craniopharyngiomas are histologically benign epithelial tumors which arise from embryonic remnants of the craniopharyngeal duct and Rathke's pouch. They are thought to have a congenital origin and are histologically distinct from papillary craniopharyngioma. We describe the case of an elderly male who presented with symptoms related to a large craniopharyngioma despite previously normal brain magnetic resonance and computed tomography imaging studies. These findings dispute the embryogenic theory that craniopharyngiomas observed in adults develop from the persistent slow growth of embryonic remnants.

  15. The study of Kruskal's and Prim's algorithms on the Multiple Instruction and Single Data stream computer system

    Directory of Open Access Journals (Sweden)

    A. Yu. Popov

    2015-01-01

    Full Text Available Bauman Moscow State Technical University is implementing a project to develop the operating principles of a computer system with a radically new architecture. A working model of the system allowed us to evaluate the efficiency of the developed hardware and software. The experimental results presented in previous studies, as well as the analysis of the operating principles of the new computer system, permit conclusions to be drawn regarding its efficiency in solving discrete optimization problems related to the processing of sets. The new architecture is based on direct hardware support for the operations of discrete mathematics, which is reflected in the use of special facilities for processing sets and data structures. Within the framework of the project a special device was designed, a structure processor (SP), which improved performance without limiting the scope of applications of the computer system. Previous works presented the basic principles of the organization of the computational process in the MISD (Multiple Instructions, Single Data) system, and showed the structure and features of the structure processor and the general principles of solving discrete optimization problems on graphs. This paper examines two algorithms for finding the minimum spanning tree, namely Kruskal's and Prim's algorithms. It studies implementations of the algorithms for two SP operation modes: coprocessor mode and MISD mode. The paper presents the results of an experimental comparison of the MISD system's performance in coprocessor mode with mainframes.
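For reference, the conventional (software-only) formulation of Kruskal's algorithm that such hardware experiments are typically benchmarked against reduces to sorting the edge set and maintaining disjoint vertex sets with union-find. This is a generic sketch, not the MISD-system implementation described in the record; the example graph is illustrative.

```python
def kruskal(num_vertices, edges):
    """Minimum spanning tree by Kruskal's algorithm.

    edges: list of (weight, u, v) tuples with 0-based vertex ids.
    Returns (total_weight, list_of_chosen_edges).
    """
    parent = list(range(num_vertices))

    def find(x):
        # Find the set representative, with path halving for near-constant cost.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, tree = 0, []
    for w, u, v in sorted(edges):      # consider edges in nondecreasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                   # accept only edges joining two components
            parent[ru] = rv
            total += w
            tree.append((w, u, v))
    return total, tree
```

The set operations here (membership, union) are exactly the kind of primitives a structure processor would accelerate in hardware.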

  16. Computing Educator Attitudes about Motivation

    OpenAIRE

    Settle, Amber; Sedlak, Brian

    2016-01-01

    While motivation is of great interest to computing educators, relatively little work has been done on understanding faculty attitudes toward student motivation. Two previous qualitative studies of instructor attitudes found results identical to those from other disciplines, but neither study considered whether instructors perceive student motivation to be more important in certain computing classes. In this work we present quantitative results about the perceived importance of student motivat...

  17. A study of computer-related upper limb discomfort and computer vision syndrome.

    Science.gov (United States)

    Sen, A; Richardson, Stanley

    2007-12-01

    Personal computers are one of the commonest office tools in Malaysia today. Their usage, even for three hours per day, leads to a health risk of developing Occupational Overuse Syndrome (OOS), Computer Vision Syndrome (CVS), low back pain, tension headaches and psychosocial stress. The study was conducted to investigate how a multiethnic society in Malaysia is coping with these problems, which are increasing at a phenomenal rate in the west. This study investigated computer usage, awareness of ergonomic modifications of computer furniture and peripherals, symptoms of CVS and risk of developing OOS. A cross-sectional questionnaire study of 136 computer users was conducted on a sample population of university students and office staff. A 'Modified Rapid Upper Limb Assessment (RULA) for office work' technique was used for evaluation of OOS. The prevalence of CVS was surveyed incorporating a 10-point scoring system for each of its various symptoms. It was found that many were using a standard keyboard and mouse without any ergonomic modifications. Around 50% of those with some low back pain did not have an adjustable backrest. Many users had higher RULA scores for the wrist and neck, suggesting an increased risk of developing OOS that needed further intervention. Many (64%) were using refractive corrections and still had high CVS scores, commonly including eye fatigue, headache and burning sensation. The increase in CVS scores (suggesting more subjective symptoms) correlated with an increase in computer usage spells. It was concluded that further onsite studies are needed to follow up this survey and to decrease the risks of developing CVS and OOS amongst young computer users.

  18. Computed tomographic study of aged schizophrenic patients

    International Nuclear Information System (INIS)

    Seno, Haruo; Fujimoto, Akihiko; Ishino, Hiroshi; Shibata, Masahiro; Kuroda, Hiroyuki; Kanno, Hiroshi.

    1997-01-01

    The width of the interhemispheric fissure, lateral ventricles and third ventricle were measured using cranial computed tomography (CT; linear method) in 45 elderly inpatients with chronic schizophrenia and in 28 age-matched control subjects. Twenty-three patients were men and 22 were women. In addition, the Mini-Mental State Examination, Brief Psychiatric Rating Scale (BPRS) and a subclass of the BPRS were administered to all patients. There was a significant enlargement of the maximum width of the interhemispheric fissure (in both males and females) and a significant enlargement of the ventricular system (more severe in men than in women) in the aged schizophrenics, as seen with CT, compared with normal controls. These findings are consistent with previous studies of non-aged schizophrenic patients. Regarding the relation between psychiatric symptoms and CT findings, the most striking finding was a significant negative correlation between third ventricle enlargement and positive and depressive symptoms in all patients. This result suggests that advanced third ventricle enlargement may decrease these symptoms in aged schizophrenics. (author)

  19. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  20. Educational NASA Computational and Scientific Studies (enCOMPASS)

    Science.gov (United States)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using the developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goal of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in the areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches, often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and

  1. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Park, Joo Hwan

    2010-09-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although recent progress in CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is still impossible for numerical computations to reflect the detailed shape of the rod bundle, due to the computing mesh and memory capacity required. Hence, previous studies conducted numerical computations for smooth channels without considering spacers and bearing pads. However, it is well known that these components are an important factor in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to solve complex geometry such as a fuel rod bundle. Before applying the method to the problem of the 37-rod bundle, its validity and accuracy are tested on a simple geometry. The split channel method is proposed with the aim of computing the fully shaped CANDU fuel channel with its detailed components. The validity was tested by applying the method to the single-channel problem. The average temperatures have similar values for the two methods considered, while the local temperature shows a slight difference due to the effect of conduction heat transfer in the solid region of a rod. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work.

  2. Computational Study of Hypersonic Boundary Layer Stability on Cones

    Science.gov (United States)

    Gronvall, Joel Edwin

    Due to the complex nature of boundary layer laminar-turbulent transition in hypersonic flows and the resultant effect on the design of re-entry vehicles, there remains considerable interest in developing a deeper understanding of the underlying physics. To that end, the use of experimental observations and computational analysis in a complementary manner will provide the greatest insights. It is the intent of this work to provide such an analysis for two ongoing experimental investigations. The first focuses on the hypersonic boundary layer transition experiments for a slender cone that are being conducted at JAXA's free-piston shock tunnel HIEST facility. Of particular interest are the measurements of disturbance frequencies associated with transition at high enthalpies. The computational analysis provided for these cases included two-dimensional CFD mean flow solutions for use in boundary layer stability analyses. The disturbances in the boundary layer were calculated using the linear parabolized stability equations. Estimates for transition locations, comparisons of measured disturbance frequencies and computed frequencies, and a determination of the type of disturbances present were made. It was found that for the cases where the disturbances were measured at locations where the flow was still laminar but nearly transitional, that the highly amplified disturbances showed reasonable agreement with the computations. Additionally, an investigation of the effects of finite-rate chemistry and vibrational excitation on flows over cones was conducted for a set of theoretical operational conditions at the HIEST facility. The second study focuses on transition in three-dimensional hypersonic boundary layers, and for this the cone at angle of attack experiments being conducted at the Boeing/AFOSR Mach-6 quiet tunnel at Purdue University were examined. 
Specifically, the effect of surface roughness on the development of the stationary crossflow instability is investigated.

  3. Dissociation in decision bias mechanism between probabilistic information and previous decision

    Directory of Open Access Journals (Sweden)

    Yoshiyuki eKaneko

    2015-05-01

    Full Text Available Target detection performance is known to be influenced by events in the previous trials. It has not been clear, however, whether this bias effect is due to the previous sensory stimulus, motor response, or decision. Also it remains open whether or not the previous trial effect emerges via the same mechanism as the effect of knowledge about the target probability. In the present study, we asked normal human subjects to make a decision about the presence or absence of a visual target. We presented a pre-cue indicating the target probability before the stimulus, and also a decision-response mapping cue after the stimulus so as to tease apart the effect of decision from that of motor response. We found that the target detection performance was significantly affected by the probability cue in the current trial and also by the decision in the previous trial. While the information about the target probability modulated the decision criteria, the previous decision modulated the sensitivity to target-relevant sensory signals (d-prime). Using functional magnetic resonance imaging, we also found that activation in the left intraparietal sulcus was decreased when the probability cue indicated a high probability of the target. By contrast, activation in the right inferior frontal gyrus was increased when the subjects made a target-present decision in the previous trial, but this change was observed specifically when the target was present in the current trial. Activation in these regions was associated with individual differences in the decision computation parameters. We argue that the previous decision biases the target detection performance by modulating the processing of target-selective information, and this mechanism is distinct from modulation of decision criteria due to expectation of a target.

  4. Dissociation in decision bias mechanism between probabilistic information and previous decision

    Science.gov (United States)

    Kaneko, Yoshiyuki; Sakai, Katsuyuki

    2015-01-01

    Target detection performance is known to be influenced by events in the previous trials. It has not been clear, however, whether this bias effect is due to the previous sensory stimulus, motor response, or decision. Also it remains open whether or not the previous trial effect emerges via the same mechanism as the effect of knowledge about the target probability. In the present study, we asked normal human subjects to make a decision about the presence or absence of a visual target. We presented a pre-cue indicating the target probability before the stimulus, and also a decision-response mapping cue after the stimulus so as to tease apart the effect of decision from that of motor response. We found that the target detection performance was significantly affected by the probability cue in the current trial and also by the decision in the previous trial. While the information about the target probability modulated the decision criteria, the previous decision modulated the sensitivity to target-relevant sensory signals (d-prime). Using functional magnetic resonance imaging (fMRI), we also found that activation in the left intraparietal sulcus (IPS) was decreased when the probability cue indicated a high probability of the target. By contrast, activation in the right inferior frontal gyrus (IFG) was increased when the subjects made a target-present decision in the previous trial, but this change was observed specifically when the target was present in the current trial. Activation in these regions was associated with individual differences in the decision computation parameters. We argue that the previous decision biases the target detection performance by modulating the processing of target-selective information, and this mechanism is distinct from modulation of decision criteria due to expectation of a target. PMID:25999844
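The sensitivity index (d-prime) that this study contrasts with decision criteria comes from standard equal-variance signal detection theory. A minimal sketch of the computation follows; it is a generic textbook illustration (with the common log-linear correction for extreme rates), not the authors' analysis code, and the example counts are invented.

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Equal-variance signal detection sensitivity: d' = z(hit rate) - z(FA rate).

    A log-linear correction (add 0.5 to each cell) keeps the z-transform finite
    when a rate would otherwise be exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf          # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)
```

A shift in response criterion moves hit and false-alarm rates together and leaves d' roughly unchanged, which is why the measure can dissociate the probability-cue effect (criterion) from the previous-decision effect (sensitivity).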

  5. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is one of Information and Communication Technology (ICT), and a number of research efforts are ongoing in Cloud Computing and Mobile Cloud Computing, on topics such as security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  6. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    Science.gov (United States)

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  7. Quantitative Study on Computer Self-Efficacy and Computer Anxiety Differences in Academic Major and Residential Status

    Science.gov (United States)

    Binkley, Zachary Wayne McClellan

    2017-01-01

    This study investigates computer self-efficacy and computer anxiety within 61 students across two academic majors, Aviation and Sports and Exercise Science, while investigating the impact residential status, age, and gender has on those two psychological constructs. The purpose of the study is to find if computer self-efficacy and computer anxiety…

  8. Study guide to accompany computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Study Guide to Accompany Computer and Data Processing provides information pertinent to the fundamental aspects of computers and computer technology. This book presents the key benefits of using computers.Organized into five parts encompassing 19 chapters, this book begins with an overview of the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. This text then introduces computer hardware and describes the processor. Other chapters describe how microprocessors are made and describe the physical operation of computers. This book discusses as w

  9. Evaluation of the optimum region for mammographic system using computer simulation to study modulation transfer functions

    International Nuclear Information System (INIS)

    Oliveira, Isaura N. Sombra; Schiable, Homero; Porcel, Naider T.; Frere, Annie F.; Marques, Paulo M.A.

    1996-01-01

    An investigation of the 'optimum region' of the radiation field for mammographic systems is presented. This region was defined in previous works as the range of the field in which the system has its best performance and sharpest images. The study is based on a correlation of two methods for evaluating radiologic imaging systems, both using computer simulation to determine modulation transfer functions (MTFs) due to the X-ray tube focal spot at several field orientations and locations

  10. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two... ...to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show... ...here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...
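The record above concerns verifiable secret sharing (VSS), which strengthens ordinary Shamir-style secret sharing against cheating dealers and players. As background, here is a minimal sketch of plain Shamir secret sharing over a small prime field; this is a toy illustration, not the thesis's protocol, and the field size is far below cryptographic scale:

```python
import random

P = 2_147_483_647  # a Mersenne prime; a toy field, not cryptographic-scale

def share(secret, threshold, n_players):
    """Split `secret` into n_players shares so that any `threshold`
    of them reconstruct it, while fewer reveal nothing (Shamir, 1979).
    The secret is the constant term of a random degree-(threshold-1)
    polynomial; each share is a point on that polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n_players + 1)]

def reconstruct(shares):
    """Recover the secret by Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse, since P is prime.
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(42, threshold=3, n_players=5)
print(reconstruct(shares[:3]))  # → 42
```

VSS adds verification on top of this scheme so that players can check their shares are consistent even when the dealer is corrupted.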

  11. Students' Computing Use and Study: When More is Less

    Directory of Open Access Journals (Sweden)

    Christine A McLachlan

    2016-02-01

    Full Text Available Since the turn of the century there has been a steady decline in enrolments of students in senior secondary computing classes in Australia. A flow-on effect has seen reduced enrolments in tertiary computing courses and subsequent predictions of shortages in skilled computing professionals. This paper investigates the relationship between students’ computing literacy levels, their use of and access to computing tools, and students’ interest in and attitudes to formal computing study. Through the use of secondary data obtained from Australian and international reports, a reverse effect was discovered, indicating that the more students used computing tools, the less interested they became in computing studies.

  12. Study of some physical aspects previous to design of an exponential experiment

    International Nuclear Information System (INIS)

    Caro, R.; Francisco, J. L. de

    1961-01-01

    This report presents a theoretical study of some physical aspects prior to the design of an exponential facility. These are: fast and slow flux distribution in the multiplicative medium and in the thermal column, slowing down in the thermal column, geometrical distribution and minimum required intensity of sources, and access channels and the perturbations produced by possible variations in their position and intensity. (Author) 4 refs

  13. Radon anomalies prior to earthquakes (1). Review of previous studies

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    The relationship between radon anomalies and earthquakes has been studied for more than 30 years. However, most of the studies dealt with radon in soil gas or in groundwater. Before the 1995 Hyogoken-Nanbu earthquake, an anomalous increase of atmospheric radon was observed at Kobe Pharmaceutical University. The increase was well fitted with a mathematical model related to earthquake fault dynamics. This paper reports the significance of this observation, reviewing previous studies on radon anomaly before earthquakes. Groundwater/soil radon measurements for earthquake prediction began in 1970's in Japan as well as foreign countries. One of the most famous studies in Japan is groundwater radon anomaly before the 1978 Izu-Oshima-kinkai earthquake. We have recognized the significance of radon in earthquake prediction research, but recently its limitation was also pointed out. Some researchers are looking for a better indicator for precursors; simultaneous measurements of radon and other gases are new trials in recent studies. Contrary to soil/groundwater radon, we have not paid much attention to atmospheric radon before earthquakes. However, it might be possible to detect precursors in atmospheric radon before a large earthquake. In the next issues, we will discuss the details of the anomalous atmospheric radon data observed before the Hyogoken-Nanbu earthquake. (author)

  14. Improving learning with science and social studies text using computer-based concept maps for students with disabilities.

    Science.gov (United States)

    Ciullo, Stephen; Falcomata, Terry S; Pfannenstiel, Kathleen; Billingsley, Glenna

    2015-01-01

    Concept maps have been used to help students with learning disabilities (LD) improve literacy skills and content learning, predominantly in secondary school. However, despite increased access to classroom technology, no previous studies have examined the efficacy of computer-based concept maps to improve learning from informational text for students with LD in elementary school. In this study, we used a concurrent delayed multiple probe design to evaluate the interactive use of computer-based concept maps on content acquisition with science and social studies texts for Hispanic students with LD in Grades 4 and 5. Findings from this study suggest that students improved content knowledge during intervention relative to a traditional instruction baseline condition. Learning outcomes and social validity information are considered to inform recommendations for future research and the feasibility of classroom implementation. © The Author(s) 2014.

  15. Effect of computer game playing on baseline laparoscopic simulator skills.

    Science.gov (United States)

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. Setting: a local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  16. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and for a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry also has application in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  17. Fire Risk Scoping Study: Investigation of nuclear power plant fire risk, including previously unaddressed issues

    International Nuclear Information System (INIS)

    Lambright, J.A.; Nowlen, S.P.; Nicolette, V.F.; Bohn, M.P.

    1989-01-01

    An investigation of nuclear power plant fire risk issues raised as a result of the USNRC-sponsored Fire Protection Research Program at Sandia National Laboratories has been performed. The specific objectives of this study were (1) to review and requantify fire risk scenarios from four fire probabilistic risk assessments (PRAs) in light of updated databases made available as a result of the USNRC-sponsored Fire Protection Research Program and updated computer fire modeling capabilities, (2) to identify potentially significant fire risk issues that have not previously been addressed in a fire risk context and to quantify the potential impact of those identified issues where possible, and (3) to review current fire regulations and plant implementation practices for relevance to the identified unaddressed fire risk issues. In performing the fire risk scenario requantifications, several important insights were gained. It was found that utilization of a more extensive operational experience base resulted in both fire occurrence frequencies and fire duration times (i.e., the time required for fire suppression) increasing significantly over those assumed in the original works. Additionally, some thermal damage threshold limits assumed in the original works were identified as nonconservative based on more recent experimental data. Finally, application of the COMPBRN III fire growth model resulted in the calculation of considerably longer fire damage times than those calculated in the original works using COMPBRN I. 14 refs., 2 figs., 16 tabs

  18. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

    A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing Computed Tomography (CT) data files for the geometry and radiographic density information. The geometry, together with mechanical properties and boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program. The output of the program is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone thanks to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool towards improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones, such as bones with implants or with osteoporosis. As a second stage, this processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading, by utilizing a CT data file of the specific bone as an input to the processor with the FE program
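The processor described above derives local mechanical properties from CT radiographic densities before building the finite element model. One common way this mapping is done in the literature is a linear Hounsfield-unit-to-density calibration followed by an empirical power-law density-to-modulus relation; the sketch below uses illustrative coefficients, not the values used in this study:

```python
def hounsfield_to_density(hu):
    """Linear calibration from CT Hounsfield units to apparent bone
    density in g/cm^3. Coefficients are illustrative; real scanners
    are calibrated against a density phantom."""
    return max(0.01, 0.001 * hu + 1.0)

def density_to_modulus(rho):
    """Empirical power law E = a * rho**b giving Young's modulus in
    MPa. The values a = 6850, b = 1.49 are illustrative literature
    numbers, not those used in this study."""
    return 6850.0 * rho ** 1.49

# Assign an element-wise Young's modulus for a few CT voxel values (HU):
for hu in (200, 800, 1200, 1600):
    E = density_to_modulus(hounsfield_to_density(hu))
    print(hu, round(E))
```

Each FE element would then receive the modulus computed from the CT voxels it covers, which is what lets the model capture local material variation.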

  19. [Usage patterns of internet and computer games : Results of an observational study of Tyrolean adolescents].

    Science.gov (United States)

    Riedl, David; Stöckl, Andrea; Nussbaumer, Charlotte; Rumpold, Gerhard; Sevecke, Kathrin; Fuchs, Martin

    2016-12-01

    The use of digital media such as the internet and computer games has greatly increased. In the western world, almost all young people regularly use these technologies. Against this background, forms of use with possible negative consequences for young people have been recognized and scientifically examined. The aim of our study was therefore to investigate the prevalence of pathological use of these technologies in a sample of young Tyrolean people. 398 students (average age 15.2 years, SD ± 2.3 years, 34.2% female) were interviewed by means of the structured questionnaires CIUS (Internet), CSV-S (Computer games) and SWE (Self efficacy). Additionally, sociodemographic data were collected. In line with previous studies, 7.7% of the adolescents in our sample met criteria for problematic internet use and 3.3% for pathological internet use. 5.4% of the sample reported pathological computer game usage. The factor that most influenced our results was the gender of the subjects. Intensive users of the internet and computer games were more often young men; young women, in contrast, showed significantly fewer signs of pathological computer game use. A significant percentage of Tyrolean adolescents showed difficulties in the development of competent media use, indicating the growing significance of prevention measures such as media education. In a follow-up project, a sample of adolescents with mental disorders will be examined concerning their media use and compared with our school sample.

  20. Tingling/numbness in the hands of computer users: neurophysiological findings from the NUDATA study

    DEFF Research Database (Denmark)

    Overgaard, E.; Brandt, L. P.; Ellemann, K.

    2004-01-01

    OBJECTIVES: To investigate whether tingling/numbness of the hands and fingers among computer users is associated with elevated vibration threshold as a sign of early nerve compression. METHODS: Within the Danish NUDATA study, vibratory sensory testing with monitoring of the digital vibration... once a week or daily within the last 3 months. Participants with more than slight muscular pain or disorders of the neck and upper extremities, excessive alcohol consumption, previous injuries of the upper extremities, or concurrent medical diseases were excluded. The two groups had a similar amount of work with mouse, keyboard, and computer. RESULTS: Seven of the 20 cases (35%) had elevated vibration thresholds, compared with 3 of the 20 controls (15%); this difference was not statistically significant (chi2=2.13, P=0.14). Compared with controls, cases had increased perception thresholds for all...

  1. Computer use and stress, sleep disturbances, and symptoms of depression among young adults--a prospective cohort study.

    Science.gov (United States)

    Thomée, Sara; Härenstam, Annika; Hagberg, Mats

    2012-10-22

    We have previously studied prospective associations between computer use and mental health symptoms in a selected young adult population. The purpose of this study was to investigate if high computer use is a prospective risk factor for developing mental health symptoms in a population-based sample of young adults. The study group was a cohort of young adults (n = 4163), 20-24 years old, who responded to a questionnaire at baseline and 1-year follow-up. Exposure variables included time spent on computer use (CU) in general, email/chat use, computer gaming, CU without breaks, and CU at night causing lost sleep. Mental health outcomes included perceived stress, sleep disturbances, symptoms of depression, and reduced performance due to stress, depressed mood, or tiredness. Prevalence ratios (PRs) were calculated for prospective associations between exposure variables at baseline and mental health outcomes (new cases) at 1-year follow-up for the men and women separately. Both high and medium computer use compared to low computer use at baseline were associated with sleep disturbances in the men at follow-up. High email/chat use was negatively associated with perceived stress, but positively associated with reported sleep disturbances for the men. For the women, high email/chat use was (positively) associated with several mental health outcomes, while medium computer gaming was associated with symptoms of depression, and CU without breaks with most mental health outcomes. CU causing lost sleep was associated with mental health outcomes for both men and women. Time spent on general computer use was prospectively associated with sleep disturbances and reduced performance for the men. For the women, using the computer without breaks was a risk factor for several mental health outcomes. Some associations were enhanced in interaction with mobile phone use. Using the computer at night and consequently losing sleep was associated with most mental health outcomes for both men
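The prevalence ratios (PRs) this study reports are simply the outcome prevalence in an exposed group divided by that in an unexposed group. A minimal sketch with hypothetical counts (illustrative numbers, not the study's data):

```python
def prevalence_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Prevalence ratio: outcome prevalence among the exposed divided
    by outcome prevalence among the unexposed."""
    p_exposed = exposed_cases / exposed_total
    p_unexposed = unexposed_cases / unexposed_total
    return p_exposed / p_unexposed

# Hypothetical counts: 30 of 200 high computer users vs. 20 of 400
# low users develop sleep disturbances at follow-up.
pr = prevalence_ratio(30, 200, 20, 400)
print(round(pr, 2))  # → 3.0
```

A PR of 3.0 here would mean new cases were three times as prevalent among the exposed; published analyses would additionally report confidence intervals, which this sketch omits.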

  2. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To fully utilize digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. Current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as an integral part of the solving system to process large amounts of data, to implement a control law and even to produce decisions. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy databases and a big computer are used; why the economic process-computer will be allocated to nuclear plants in the future; why the supercomputer should be demonstrated at once. (Mori, K.)

  3. Ground-glass opacity: High-resolution computed tomography and 64-multi-slice computed tomography findings comparison

    International Nuclear Information System (INIS)

    Sergiacomi, Gianluigi; Ciccio, Carmelo; Boi, Luca; Velari, Luca; Crusco, Sonia; Orlacchio, Antonio; Simonetti, Giovanni

    2010-01-01

    Objective: Comparative evaluation of ground-glass opacity using the conventional high-resolution computed tomography technique and volumetric computed tomography with a 64-row multi-slice scanner, verifying the advantages of the volumetric acquisition and post-processing techniques allowed by the 64-row CT scanner. Methods: Thirty-four patients, in whom a ground-glass opacity pattern had been assessed by previous high-resolution computed tomography during clinical-radiological follow-up for their lung disease, were studied by means of 64-row multi-slice computed tomography. A comparative evaluation of image quality was done for both CT modalities. Results: Good inter-observer agreement (k value 0.78-0.90) was found in the detection of ground-glass opacity with both the high-resolution computed tomography technique and volumetric computed tomography acquisition, with a moderate increase of intra-observer agreement (k value 0.46) using volumetric computed tomography compared with high-resolution computed tomography. Conclusions: In our experience, volumetric computed tomography with a 64-row scanner shows good accuracy in the detection of ground-glass opacity, providing better spatial and temporal resolution and more advanced post-processing techniques than high-resolution computed tomography.
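The inter- and intra-observer agreement above is expressed as kappa values; Cohen's kappa corrects observed agreement between two raters for the agreement expected by chance. A minimal sketch with illustrative binary ratings (not the study's data):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same items:
    (observed agreement - chance agreement) / (1 - chance agreement),
    where chance agreement is the sum over categories of the product
    of each rater's marginal proportions."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_chance = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n)
        for c in categories
    )
    return (p_obs - p_chance) / (1 - p_chance)

# Two readers rating the same 8 scans (1 = opacity present, 0 = absent):
a = [1, 1, 0, 1, 0, 1, 1, 0]
b = [1, 1, 0, 1, 1, 1, 0, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.47
```

By common rules of thumb, values around 0.4-0.6 indicate moderate agreement, while the 0.78-0.90 reported above would be substantial to almost perfect.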

  4. Summary of Previous Chamber or Controlled Anthrax Studies and Recommendations for Possible Additional Studies

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Morrow, Jayne B.

    2010-12-29

    This report and an associated Excel file(a) summarize the investigations and results of previous chamber and controlled studies(b) to characterize the performance of methods for collecting, storing and/or transporting, extracting, and analyzing samples from surfaces contaminated by Bacillus anthracis (BA) or related simulants. The report and the Excel file are the joint work of the Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) for the Department of Homeland Security, Science and Technology Directorate. The report was originally released as PNNL-SA-69338, Rev. 0 in November 2009 with limited distribution, but was subsequently cleared for release with unlimited distribution in this Rev. 1. Only minor changes were made to Rev. 0 to yield Rev. 1. A more substantial update (including summarizing data from other studies and more condensed summary tables of data) is underway

  5. An fMRI study of neuronal activation in schizophrenia patients with and without previous cannabis use

    Directory of Open Access Journals (Sweden)

    Else-Marie eLøberg

    2012-10-01

    Full Text Available Previous studies have mostly shown positive effects of cannabis use on cognition in patients with schizophrenia, which could reflect lower neurocognitive vulnerability. There are, however, no studies comparing whether such cognitive differences have neuronal correlates. Thus, the aim of the present study was to examine whether patients with previous cannabis use differ in brain activation from patients who have never used cannabis. The patient groups were compared on the ability to up-regulate an effort mode network during a cognitive task and to down-regulate activation in the same network during a task-absent condition. Task-present and task-absent brain activation was measured by functional magnetic resonance imaging (fMRI). Twenty-six patients with a DSM-IV and ICD-10 diagnosis of schizophrenia were grouped into a previous cannabis user group and a no-cannabis group. An auditory dichotic listening task, with instructions to focus attention on either the right- or left-ear stimulus, was used to tap verbal processing, attention and cognitive control, calculated as an aggregate score. When comparing the two groups, there were remaining activations in the task-present condition for the cannabis group not seen in the no-cannabis group, while there was remaining activation in the task-absent condition for the no-cannabis group not seen in the cannabis group. Thus, the patients with previous cannabis use showed increased activation in an effort mode network and decreased activation in the default mode network compared with the no-cannabis group. It is concluded that the present study shows some differences in brain activation to a cognitively challenging task between previous-cannabis and no-cannabis schizophrenia patients.

  6. A Codesign Case Study in Computer Graphics

    DEFF Research Database (Denmark)

    Brage, Jens P.; Madsen, Jan

    1994-01-01

    The paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  7. Computational studies of tokamak plasmas

    International Nuclear Information System (INIS)

    Takizuka, Tomonori; Tsunematsu, Toshihide; Tokuda, Shinji

    1981-02-01

    Computational studies of tokamak plasmas are extensively advanced. Many computational codes have been developed by using several kinds of models, i.e., the finite element formulation of MHD equations, the time dependent multidimensional fluid model, and the particle model with the Monte-Carlo method. These codes are applied to the analyses of the equilibrium of an axisymmetric toroidal plasma (SELENE), the time evolution of the high-beta tokamak plasma (APOLLO), the low-n MHD stability (ERATO-J) and high-n ballooning mode stability (BOREAS) in the INTOR tokamak, the nonlinear MHD stability, such as the positional instability (AEOLUS-P), resistive internal mode (AEOLUS-I) etc., and the divertor functions. (author)

  8. Computational study of performance characteristics for truncated conical aerospike nozzles

    Science.gov (United States)

    Nair, Prasanth P.; Suryan, Abhilash; Kim, Heuy Dong

    2017-12-01

    Aerospike nozzles are advanced rocket nozzles that can maintain their aerodynamic efficiency over a wide range of altitudes. They belong to the class of altitude-compensating nozzles. A vehicle with an aerospike nozzle uses less fuel at low altitudes, where most missions have the greatest need for thrust, due to its altitude adaptability. Aerospike nozzles are better suited to Single Stage to Orbit (SSTO) missions than conventional nozzles. In the current study, the flow through 20% and 40% truncated aerospike nozzles is analyzed in detail using computational fluid dynamics techniques. A steady-state analysis with implicit formulation is carried out. The Reynolds-averaged Navier-Stokes equations are solved with the Spalart-Allmaras turbulence model. The results are compared with experimental results from previous work. The transition from open wake to closed wake happens at a lower nozzle pressure ratio for the 20% than for the 40% aerospike nozzle.

  9. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur, and they can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veress needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system or genital organs, Cesarean section, or abdominal war injuries were the most common causes of the previous laparotomy. During those operations, and during entry into the abdominal cavity, we did not experience any complications, while in 7 patients we converted to laparotomy following diagnostic laparoscopy. In all patients, insertion of the Veress needle and trocar in the umbilical region was performed, namely the technique of closed laparoscopy. Adhesions in the region of the umbilicus were not found in a single patient, and no abdominal organs were injured.

  10. Data from studies of previous radioactive waste disposal in Massachusetts Bay

    International Nuclear Information System (INIS)

    Curtis, W.R.; Mardis, H.M.

    1984-12-01

    This report presents the results of studies conducted in Massachusetts Bay during 1981 and 1982. Included are data from: (1) a side-scan sonar survey of disposal areas in the Bay that was carried out by the National Oceanic and Atmospheric Administration (NOAA) for EPA; (2) collections of sediment and biota by NOAA for radiochemical analysis by EPA; (3) collections of marketplace seafood samples by the Food and Drug Administration (FDA) for radioanalysis by both FDA and EPA; and (4) a radiological monitoring survey of LLW disposal areas by EPA to determine whether there should be any concern for public health resulting from previous LLW disposals in the Bay

  11. Non-Determinism: An Abstract Concept in Computer Science Studies

    Science.gov (United States)

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  12. Exploring the use of tablet computer-based electronic data capture system to assess patient reported measures among patients with chronic kidney disease: a pilot study.

    Science.gov (United States)

    Wong, Dorothy; Cao, Shen; Ford, Heather; Richardson, Candice; Belenko, Dmitri; Tang, Evan; Ugenti, Luca; Warsmann, Eleanor; Sissons, Amanda; Kulandaivelu, Yalinie; Edwards, Nathaniel; Novak, Marta; Li, Madeline; Mucsi, Istvan

    2017-12-06

    Collecting patient reported outcome measures (PROMs) via a computer-based electronic data capture system may improve feasibility and facilitate implementation in clinical care. We report our initial experience of the acceptability of touch-screen tablet computer-based, self-administered questionnaires among patients with chronic kidney disease (CKD), including stage 5 CKD treated with renal replacement therapies (RRT) (either dialysis or transplant). We enrolled a convenience sample of patients with stage 4 and 5 CKD (including patients on dialysis or after kidney transplant) in a single-centre, cross-sectional pilot study. Participants completed validated questionnaires programmed on an electronic data capture system (DADOS, Techna Inc., Toronto) on tablet computers. The primary objective was to evaluate the acceptability and feasibility of using tablet-based electronic data capture in patients with CKD. Descriptive statistics, Fisher's exact test and multivariable logistic regression models were used for data analysis. One hundred and twenty-one patients (55% male, mean age (± SD) of 58 (±14) years, 49% Caucasian) participated in the study. Ninety-two percent of the respondents indicated that the computer tablet was acceptable, and 79% of the participants required no or minimal help in completing the questionnaires. Acceptance of tablets was lower among patients 70 years or older (75% vs. 95%; p = 0.011) and those with little previous computer experience (81% vs. 96%; p = 0.05). Furthermore, a greater level of assistance was more frequently required by patients who were older (45% vs. 15%; p = 0.009), had a lower level of education (33% vs. 14%; p = 0.027), low health literacy (79% vs. 12%; p = 0.027), or little previous experience with computers (52% vs. 10%; p = 0.027). Tablet computer-based electronic data capture to administer PROMs was acceptable and feasible for most respondents and could therefore be used to systematically assess PROMs
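The analysis above applies Fisher's exact test to 2x2 tables such as acceptance by age group. A minimal sketch of the two-sided p-value, summing hypergeometric probabilities of all tables with the same margins that are no more likely than the observed one (the example table is illustrative, not the study's data):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table
    [[a, b], [c, d]]: sum the probabilities of every table with the
    same row/column totals whose probability does not exceed that of
    the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):
        # Hypergeometric probability of x in the top-left cell.
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Hypothetical acceptance table: 8/10 accepted in one group, 1/6 in the other.
print(round(fisher_exact_two_sided(8, 2, 1, 5), 4))  # → 0.035
```

This is the same two-sided rule scipy.stats.fisher_exact uses by default; the exact test is preferred over chi-squared for small cell counts like those in this pilot.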

  13. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance, Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  14. Computer use and stress, sleep disturbances, and symptoms of depression among young adults – a prospective cohort study

    Directory of Open Access Journals (Sweden)

    Thomée Sara

    2012-10-01

    Full Text Available Abstract Background We have previously studied prospective associations between computer use and mental health symptoms in a selected young adult population. The purpose of this study was to investigate if high computer use is a prospective risk factor for developing mental health symptoms in a population-based sample of young adults. Methods The study group was a cohort of young adults (n = 4163), 20–24 years old, who responded to a questionnaire at baseline and 1-year follow-up. Exposure variables included time spent on computer use (CU) in general, email/chat use, computer gaming, CU without breaks, and CU at night causing lost sleep. Mental health outcomes included perceived stress, sleep disturbances, symptoms of depression, and reduced performance due to stress, depressed mood, or tiredness. Prevalence ratios (PRs) were calculated for prospective associations between exposure variables at baseline and mental health outcomes (new cases) at 1-year follow-up for the men and women separately. Results Both high and medium computer use compared to low computer use at baseline were associated with sleep disturbances in the men at follow-up. High email/chat use was negatively associated with perceived stress, but positively associated with reported sleep disturbances for the men. For the women, high email/chat use was positively associated with several mental health outcomes, while medium computer gaming was associated with symptoms of depression, and CU without breaks with most mental health outcomes. CU causing lost sleep was associated with mental health outcomes for both men and women. Conclusions Time spent on general computer use was prospectively associated with sleep disturbances and reduced performance for the men. For the women, using the computer without breaks was a risk factor for several mental health outcomes. Some associations were enhanced in interaction with mobile phone use. Using the computer at night and consequently losing
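A prevalence ratio of the kind reported above is simple to compute from a 2×2 table of new cases by exposure group. The sketch below uses hypothetical counts (the cohort's actual cell counts are not given in the abstract) and a standard Wald-type confidence interval on the log scale:

```python
from math import exp, log, sqrt

def prevalence_ratio(a, n1, c, n0, z=1.96):
    """Prevalence ratio comparing exposed (a cases of n1) with
    unexposed (c cases of n0), with a Wald-type 95% CI on the log scale."""
    pr = (a / n1) / (c / n0)
    se = sqrt(1/a - 1/n1 + 1/c - 1/n0)  # SE of log(PR)
    lo, hi = exp(log(pr) - z * se), exp(log(pr) + z * se)
    return pr, lo, hi

# Hypothetical counts: 40 new cases of sleep disturbance among 400 high
# computer users vs. 20 among 400 low users -> PR = 2.0.
pr, lo, hi = prevalence_ratio(40, 400, 20, 400)
```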

  15. Computed tomography study of otitis media; A tomografia computadorizada no estudo das otites medias

    Energy Technology Data Exchange (ETDEWEB)

    Bahia, Paulo Roberto Valle; Marchiori, Edson [Universidade Federal, Rio de Janeiro, RJ (Brazil). Dept. de Radiologia

    1997-03-01

    The findings of computed tomography (CT) in 89 patients clinically suspected of having otitis media were studied in this work. The results were compared with the clinical diagnosis, otoscopy, surgical findings and previous data. In our analysis, we studied seven patients with acute otitis media and 83 patients with chronic otitis media. The patients with acute otitis media underwent CT examinations to evaluate possible spread to the central nervous system. The diagnosis of cholesteatoma, its extension and its complications was the main indication for the chronic otitis media studies. The main findings of cholesteatomatous otitis were occupation of the epitympanum, bony wall destruction and ossicular chain erosion. CT demonstrated high sensitivity in diagnosing cholesteatoma. (author) 25 refs., 10 figs.

  16. Computer usage and national energy consumption: Results from a field-metering study

    Energy Technology Data Exchange (ETDEWEB)

    Desroches, Louis-Benoit [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Greenblatt, Jeffery [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Pratt, Stacy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Willem, Henry [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Claybaugh, Erin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Beraki, Bereket [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Nagaraju, Mythri [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Young, Scott [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division

    2014-12-01

    The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs' electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean of 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflecting laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses.
Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power
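Unit and national AEC estimates of this kind follow from straightforward bookkeeping over power modes. A minimal sketch, in which only the 7.3 h/d active-use figure comes from the study; the mode powers and the installed stock are hypothetical round numbers, not the report's inputs:

```python
def unit_aec_kwh(p_active_w, h_active, p_sleep_w, h_sleep, p_off_w, h_off):
    """Unit annual energy consumption (kWh/yr) from mode powers (W)
    and average daily hours per mode; the hours must sum to 24."""
    assert abs(h_active + h_sleep + h_off - 24) < 1e-9
    daily_wh = p_active_w * h_active + p_sleep_w * h_sleep + p_off_w * h_off
    return daily_wh * 365 / 1000

# Hypothetical desktop: 65 W active for 7.3 h/d, 3 W asleep for 4 h/d,
# 1 W off for the remaining 12.7 h/d.
aec = unit_aec_kwh(65, 7.3, 3, 4.0, 1, 12.7)   # ~182 kWh/yr

# Scaling to a hypothetical installed stock gives a national figure in TWh/yr.
stock = 120e6
national_twh = aec * stock / 1e9
```

Note that with these assumed inputs the unit AEC comes out near the study's 194 kWh/yr desktop estimate, which is the point of the exercise: the On-mode term dominates.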

  17. Writing Apprehension, Computer Anxiety and Telecomputing: A Pilot Study.

    Science.gov (United States)

    Harris, Judith; Grandgenett, Neal

    1992-01-01

    A study measured graduate students' writing apprehension and computer anxiety levels before and after using electronic mail, computer conferencing, and remote database searching facilities during an educational technology course. Results indicated that postcourse computer anxiety levels were significantly related to usage statistics. Precourse writing…

  18. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  19. Metrological study of CFRP drilled holes with x-ray computed tomography

    OpenAIRE

    Kourra, Nadia; Warnett, Jason M.; Attridge, Alex; Kiraci, Ercihan; Gupta, Aniruddha; Barnes, Stuart; Williams, M. A. (Mark A.)

    2015-01-01

    The popularity of composite materials is continuously growing, with new varieties being developed and tested with different machining processes to establish their suitability. Destructive as well as non-destructive methods, such as ultrasonics, X-ray radiography and eddy-current, have previously been used to ensure that the combination of particular machining methods and composites provides the quality required to achieve the intended lifespan of the final product. X-ray computed tomography...

  20. Enhanced diagnostic of skin conditions by polarized laser speckles: phantom studies and computer modeling

    Science.gov (United States)

    Tchvialeva, Lioudmila; Lee, Tim K.; Markhvida, Igor; Zeng, Haishan; Doronin, Alexander; Meglinski, Igor

    2014-03-01

    The incidence of skin melanoma, the most commonly fatal form of skin cancer, is increasing faster than that of any other potentially preventable cancer. Clinical practice is currently hampered by the lack of the ability to rapidly screen the functional and morphological properties of tissues. In our previous study we showed that quantification of the polarization of scattered laser light provides a useful metric for diagnostics of malignant melanoma. In this study we explore whether image speckle could improve skin cancer diagnosis in comparison with the previously used free-space speckle. The study includes skin phantom measurements and computer modeling. To characterize the depolarization of light we measure the spatial distribution of speckle patterns and analyse their depolarization ratio, taking radial symmetry into account. We examine the dependence of the depolarization ratio on roughness for phantoms whose optical properties are of the order of those of skin lesions. We demonstrate that variation in bulk optical properties produces assessable changes in the depolarization ratio. We show that image speckle differentiates phantoms significantly better than free-space speckle. The results of the experimental measurements are compared with the results of Monte Carlo simulation.

  1. Integrating user studies into computer graphics-related courses.

    Science.gov (United States)

    Santos, B S; Dias, P; Silva, S; Ferreira, C; Madeira, J

    2011-01-01

    Computer graphics and visualization are essentially about producing images for a target audience, be it the millions watching a new CG-animated movie or the small group of researchers trying to gain insight into the large amount of numerical data resulting from a scientific experiment. To ascertain the final images' effectiveness for their intended audience, or the designed visualizations' accuracy and expressiveness, formal user studies are often essential. In human-computer interaction (HCI), such user studies play a similarly fundamental role in evaluating the usability and applicability of interaction methods and metaphors for the various devices and software systems we use.

  2. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests, and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and done closest to, but before, the time of admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) repeat values were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) repeat values at admission were outside a range considered acceptable for surgery (P less than 0.001, comparing the frequency of clinically important abnormalities between patients with normal and those with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.
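An interval consistent with the reported one for 13 abnormal repeats out of 3096 (0.4%; 95% CI, 0.2% to 0.7%) can be obtained with a Wilson score interval for a binomial proportion; whether the authors used this exact method is an assumption, but the sketch shows the arithmetic:

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(13, 3096)  # roughly 0.25% to 0.72%
```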

  3. Impact of computer use on children's vision.

    Science.gov (United States)

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  4. Synthesis, characterization and computational studies of (E)-2-{[(2-aminopyridin-3-yl)imino]methyl}-4,6-di-tert-butylphenol

    Directory of Open Access Journals (Sweden)

    Alexander Carreño

    2014-01-01

    Full Text Available (E)-2-{[(2-Aminopyridin-3-yl)imino]methyl}-4,6-di-tert-butylphenol (3), a ligand containing an intramolecular hydrogen bond, was prepared according to a previous literature report, with modifications, and was characterized by UV-vis, FTIR, 1H-NMR, 13C-NMR, HHCOSY, TOCSY and cyclic voltammetry. Computational analyses at the DFT and TD-DFT levels were performed to study its electronic and molecular structures. The results of these analyses elucidated the behavior of the UV-vis and electrochemical data. Analysis of the transitions in the computed spectrum showed that the most important band is primarily composed of a HOMO→LUMO transition, designated as an intraligand (IL) charge transfer.

  5. Aortic pseudoaneurysm detected on external jugular venous distention following a Bentall procedure 10 years previously.

    Science.gov (United States)

    Fukunaga, Naoto; Shomura, Yu; Nasu, Michihiro; Okada, Yukikatsu

    2010-11-01

    An asymptomatic 49-year-old woman was admitted for the purpose of surgery for aortic pseudoaneurysm. She had Marfan syndrome and had undergone an emergent Bentall procedure 10 years previously. About six months previously, she could palpate distended bilateral external jugular veins, which became distended only in a supine position and without any other symptoms. Enhanced computed tomography revealed an aortic pseudoaneurysm originating from a previous distal anastomosis site. During induction of general anesthesia in a supine position, bilateral external jugular venous distention was remarkable. Immediately after a successful operation, distention completely resolved. The present case emphasizes the importance of physical examination leading to a diagnosis of asymptomatic life-threatening diseases in patients with a history of previous aortic surgery.

  6. A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)

    Science.gov (United States)

    Rhew, Ray D.; Parker, Peter A.

    2007-01-01

    Design of Experiments (DOE) was applied to the LAS geometric parameter study to efficiently identify and rank the primary contributors to integrated drag over the vehicle's ascent trajectory using an order of magnitude fewer CFD configurations, thereby reducing computational resources and solution time. SMEs were able to gain a better understanding of the underlying flow physics of different geometric parameter configurations through the identification of interaction effects. An interaction effect, which describes how the effect of one factor changes with respect to the levels of other factors, is often the key to product optimization. A DOE approach emphasizes sequential learning through successive experimentation to continuously build on previous knowledge. These studies represent a starting point for expanded experimental activities that will eventually cover the entire design space of the vehicle and flight trajectory.
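The main-effect and interaction-effect estimation described above can be sketched for a two-level full factorial design; the three factors, the response model, and the effect sizes below are hypothetical stand-ins, not the LAS study's parameters:

```python
from itertools import product

def factorial_effects(response):
    """Main and two-factor interaction effects for a 2^3 full factorial.

    `response` maps each (-1/+1, -1/+1, -1/+1) setting of factors A, B, C
    to a measured response; each effect is the contrast of the responses
    divided by half the number of runs.
    """
    runs = list(product((-1, 1), repeat=3))
    index = {"A": 0, "B": 1, "C": 2}
    effects = {}
    for name in ("A", "B", "C", "AB", "AC", "BC"):
        cols = [index[ch] for ch in name]
        contrast = 0.0
        for run in runs:
            sign = 1
            for i in cols:
                sign *= run[i]          # product of factor levels = contrast sign
            contrast += sign * response[run]
        effects[name] = contrast / (len(runs) / 2)
    return effects

# Hypothetical drag model with a strong A main effect and an A*B interaction.
resp = {r: 10 + 3 * r[0] + 1.5 * r[0] * r[1]
        for r in product((-1, 1), repeat=3)}
eff = factorial_effects(resp)  # eff["A"] = 6.0, eff["AB"] = 3.0, eff["C"] = 0.0
```

The recovered AB effect is exactly the kind of interaction (the effect of A depending on the level of B) that a one-factor-at-a-time sweep would miss.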

  7. Spatial Computing and Spatial Practices

    DEFF Research Database (Denmark)

    Brodersen, Anders; Büsher, Monika; Christensen, Michael

    2007-01-01

    The gathering momentum behind the research agendas of pervasive, ubiquitous and ambient computing, set in motion by Mark Weiser (1991), offer dramatic opportunities for information systems design. They raise the possibility of "putting computation where it belongs" by exploding computing power out...... the "disappearing computer" we have, therefore, carried over from previous research an interdisciplinary perspective, and a focus on the sociality of action (Suchman 1987)....

  8. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    Full Text Available A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  9. Case Studies in Library Computer Systems.

    Science.gov (United States)

    Palmer, Richard Phillips

    Twenty descriptive case studies of computer applications in a variety of libraries are presented in this book. Computerized circulation, serial and acquisition systems in public, high school, college, university and business libraries are included. Each of the studies discusses: 1) the environment in which the system operates, 2) the objectives of…

  10. The Effects of the Previous Outcome on Probabilistic Choice in Rats

    Science.gov (United States)

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2014-01-01

    This study examined the effects of previous outcomes on subsequent choices in a probabilistic-choice task. Twenty-four rats were trained to choose between a certain outcome (1 or 3 pellets) versus an uncertain outcome (3 or 9 pellets), delivered with a probability of .1, .33, .67, and .9 in different phases. Uncertain outcome choices increased with the probability of uncertain food. Additionally, uncertain choices increased with the probability of uncertain food following both certain-choice outcomes and unrewarded uncertain choices. However, following uncertain-choice food outcomes, there was a tendency to choose the uncertain outcome in all cases, indicating that the rats continued to “gamble” after successful uncertain choices, regardless of the overall probability or magnitude of food. A subsequent manipulation, in which the probability of uncertain food varied within each session as a function of the previous uncertain outcome, examined how the previous outcome and probability of uncertain food affected choice in a dynamic environment. Uncertain-choice behavior increased with the probability of uncertain food. The rats exhibited increased sensitivity to probability changes and a greater degree of win–stay/lose–shift behavior than in the static phase. Simulations of two sequential choice models were performed to explore the possible mechanisms of reward value computations. The simulation results supported an exponentially decaying value function that updated as a function of trial (rather than time). These results emphasize the importance of analyzing global and local factors in choice behavior and suggest avenues for the future development of sequential-choice models. PMID:23205915
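The winning model class above, an exponentially decaying value function updated per trial, corresponds to a delta-rule (exponentially weighted) value update paired with a stochastic choice rule. A hedged sketch follows; the softmax choice rule and all parameter values are illustrative assumptions, not the authors' fitted model:

```python
import random
from math import exp

def simulate(p_uncertain, certain=3, uncertain=9, alpha=0.3, beta=0.5,
             trials=2000, seed=1):
    """Trial-based exponentially decaying value model of probabilistic choice.

    Each option's value is an exponentially weighted average of its own past
    outcomes, V <- V + alpha * (reward - V), updated once per trial (not per
    unit time); choices are softmax in the value difference. Returns the
    proportion of uncertain choices.
    """
    rng = random.Random(seed)
    v = {"certain": float(certain), "uncertain": float(uncertain) * p_uncertain}
    n_uncertain = 0
    for _ in range(trials):
        p_choose_unc = 1 / (1 + exp(-beta * (v["uncertain"] - v["certain"])))
        if rng.random() < p_choose_unc:
            reward = uncertain if rng.random() < p_uncertain else 0
            v["uncertain"] += alpha * (reward - v["uncertain"])
            n_uncertain += 1
        else:
            v["certain"] += alpha * (certain - v["certain"])
    return n_uncertain / trials

# As in the behavioral data, uncertain choices should increase with the
# probability of uncertain food.
low, high = simulate(0.1), simulate(0.9)
```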

  11. Computer processing of dynamic scintigraphic studies

    International Nuclear Information System (INIS)

    Ullmann, V.

    1985-01-01

    The methods of computer processing of dynamic scintigraphic studies which were developed, studied or implemented by the authors within research task no. 30-02-03 in nuclear medicine, within the five-year plan 1981 to 85, are discussed. These were mainly methods for the computer processing of radionuclide angiography, phase radioventriculography, regional lung ventilation, dynamic sequential scintigraphy of the kidneys, and radionuclide uroflowmetry. The problems of the automatic definition of fields of interest and the methodology of determining absolute heart-chamber volumes in radionuclide cardiology are discussed, and the design and uses of the multipurpose dynamic phantom of heart activity for radionuclide angiocardiography and ventriculography, developed within the said research task, are described. All methods are documented with many figures showing typical clinical (normal and pathological) and phantom measurements. (V.U.)

  12. Computer processed /sup 99m/Tc-DTPA studies of renal allotransplants

    International Nuclear Information System (INIS)

    Pavel, D.G.; Westerman, B.R.; Bergan, J.J.; Kahan, B.D.

    1976-01-01

    In order to refine the diagnostic possibilities of the radionuclide renal study in transplanted patients and to compensate for the nonspecificity of the 131I-hippuran study in some situations, /sup 99m/Tc-DTPA was used simultaneously for imaging and time-activity curves. For these curves to be significant, appropriate background subtraction had to be made with a simple computer-processing method. The results obtained have shown that it is possible to distinguish marked acute tubular necrosis from milder degrees, thus affording a prognostic index in the immediate postoperative period, when the hippuran data are often nonspecific. Further, the diagnosis and follow-up of acute rejection episodes can be improved by the processed DTPA curves. Although these curves, when examined individually, do not show a specific pattern for rejection, they may reveal striking evolutionary changes when compared to previous studies, even when the hippuran curves are unchanged. The physiologic basis for the differences between the two time-activity curves may be related to the differential handling of the two radiopharmaceuticals by the kidney.

  13. [A computer-aided image diagnosis and study system].

    Science.gov (United States)

    Li, Zhangyong; Xie, Zhengxiang

    2004-08-01

    The revolution in information processing, particularly the digitizing of medicine, has changed medical study, work and management. This paper reports a method for designing a system for computer-aided image diagnosis and study. Combining ideas from graphic-text systems and picture archiving and communication systems (PACS), the system was implemented and used for "prescription through computer", "managing images" and "reading images under computer and helping the diagnosis". Typical examples were also constructed in a database and used to teach beginners. The system was developed with visual development tools based on object-oriented programming (OOP) and runs on the Windows 9X platform. The system possesses a friendly man-machine interface.

  14. Studi Perbandingan Layanan Cloud Computing

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    Full Text Available In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platforms and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing costs, and creating opportunities for IT industries to gain more advantage. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. This article compares and contrasts the different levels of delivery services and the development models, and identifies issues and future directions in cloud computing. End-users' comprehension of the cloud computing delivery-service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  15. Nitrosation of melatonin by nitric oxide: a computational study.

    Science.gov (United States)

    Turjanski, A G; Sáenz, D A; Doctorovich, F; Estrin, D A; Rosenstein, R E

    2001-09-01

    Melatonin is being increasingly promoted as a therapeutic agent for the treatment of jet lag and insomnia, and is an efficient free radical scavenger. We have recently characterized a product of the reaction of melatonin with nitric oxide (NO), N-nitrosomelatonin. In the present work, reaction pathways with N1, C2, C4, C6 and C7 as possible targets for its reaction with NO that yield the respective nitroso derivatives have been investigated using semiempirical AM1 computational tools, both in vacuo and in aqueous solution. Specifically, two different pathways were studied: a radical mechanism involving hydrogen atom abstraction to yield a neutral radical followed by NO addition, and an ionic mechanism involving addition of the nitrosonium ion to the indolic moiety. Our results show that the indolic nitrogen is the most probable site for nitrosation by the radical mechanism, whereas different targets are probable considering the ionic pathway. These results are in good agreement with previous experimental findings and provide a coherent picture of the interaction of melatonin with NO.

  16. Case studies in intelligent computing achievements and trends

    CERN Document Server

    Issac, Biju

    2014-01-01

    Although the field of intelligent systems has grown rapidly in recent years, there has been a need for a book that supplies a timely and accessible understanding of this important technology. Filling this need, Case Studies in Intelligent Computing: Achievements and Trends provides an up-to-date introduction to intelligent systems.This edited book captures the state of the art in intelligent computing research through case studies that examine recent developments, developmental tools, programming, and approaches related to artificial intelligence (AI). The case studies illustrate successful ma

  17. Computer Assisted Instruction in Special Education Three Case Studies

    Directory of Open Access Journals (Sweden)

    İbrahim DOĞAN

    2015-09-01

    Full Text Available The purpose of this study is to investigate the computer use of three students attending a special education center. The students have mental retardation, a hearing impairment and a physical handicap, respectively. Maximum variation sampling was used to select the type of handicap, while convenience sampling was used to select the participants. Three widely encountered handicap types in special education were chosen to select the study participants. A multiple holistic case study design was used in the study. Results of the study indicate that teachers in special education prefer to use educational games and drill-and-practice computer programs. It was also found that overuse of animation, text and symbols causes cognitive overload in the student with mental retardation. Additionally, it was discovered that the student with a hearing impairment learned words better when computers were used in education compared to the traditional method. Furthermore, the student with a physical handicap improved his fine muscle control, in addition to meeting planned course objectives, when computers were used in special education.

  18. US QCD computational performance studies with PERI

    International Nuclear Information System (INIS)

    Zhang, Y; Fowler, R; Huck, K; Malony, A; Porterfield, A; Reed, D; Shende, S; Taylor, V; Wu, X

    2007-01-01

    We report on some of the interactions between two SciDAC projects: The National Computational Infrastructure for Lattice Gauge Theory (USQCD), and the Performance Engineering Research Institute (PERI). Many modern scientific programs consistently report the need for faster computational resources to maintain global competitiveness. However, as the size and complexity of emerging high end computing (HEC) systems continue to rise, achieving good performance on such systems is becoming ever more challenging. In order to take full advantage of the resources, it is crucial to understand the characteristics of relevant scientific applications and the systems these applications are running on. Using tools developed under PERI and by other performance measurement researchers, we studied the performance of two applications, MILC and Chroma, on several high performance computing systems at DOE laboratories. In the case of Chroma, we discuss how the use of C++ and modern software engineering and programming methods are driving the evolution of performance tools

  19. Computing nucleon EDM on a lattice

    Science.gov (United States)

    Abramczyk, Michael; Aoki, Sinya; Blum, Tom; Izubuchi, Taku; Ohki, Hiroshi; Syritsyn, Sergey

    2018-03-01

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  20. Computing nucleon EDM on a lattice

    Energy Technology Data Exchange (ETDEWEB)

    Abramczyk, Michael; Izubuchi, Taku

    2017-06-18

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  1. Aberration studies and computer algebra

    International Nuclear Information System (INIS)

    Hawkes, P.W.

    1981-01-01

    The labour of calculating expressions for aberration coefficients is considerably lightened if a computer algebra language is used to perform the various substitutions and expansions involved. After a brief discussion of matrix representations of aberration coefficients, a particular language, which has shown itself to be well adapted to particle optics, is described and applied to the study of high frequency cavity lenses. (orig.)

  2. Computers as components principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn

    2012-01-01

    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and tec

  3. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective

    Directory of Open Access Journals (Sweden)

    Shuo Gu

    2017-01-01

    Full Text Available With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.
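
    The network-pharmacology workflow mentioned above typically scores herbal compounds by their overlap with a disease-related target set. The following is a minimal sketch of that one step, with entirely hypothetical compound names and a placeholder target set (the gene symbols are illustrative, not data from the paper):

```python
# Toy sketch of one network-pharmacology step: rank herbal compounds by how
# many protein targets they share with a disease-related target set.
# All compound names and target assignments are hypothetical placeholders.

compound_targets = {
    "compound_A": {"PTGS2", "ALOX5", "PPARG"},
    "compound_B": {"PTGS1", "PTGS2"},
    "compound_C": {"TP53"},
}

# Hypothetical inflammation-related target set (arachidonic acid pathway).
disease_targets = {"PTGS1", "PTGS2", "ALOX5"}

def rank_compounds(compound_targets, disease_targets):
    """Score each compound by its target overlap with the disease set."""
    scores = {c: len(t & disease_targets) for c, t in compound_targets.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_compounds(compound_targets, disease_targets)
```

    Real workflows replace the dictionaries with curated TCM compound-target databases and add pathway enrichment, but the core operation is this kind of set intersection over a bipartite compound-target network.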

  4. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective.

    Science.gov (United States)

    Gu, Shuo; Pei, Jianfeng

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.

  5. Children as Educational Computer Game Designers: An Exploratory Study

    Science.gov (United States)

    Baytak, Ahmet; Land, Susan M.; Smith, Brian K.

    2011-01-01

    This study investigated how children designed computer games as artifacts that reflected their understanding of nutrition. Ten 5th grade students were asked to design computer games with the software "Game Maker" for the purpose of teaching 1st graders about nutrition. The results from the case study show that students were able to…

  6. Synthesis, characterization and computational studies of (E)-2-{[(2-aminopyridine-3-yl)imino]-methyl}-4,6-di-tert-butylphenol

    Energy Technology Data Exchange (ETDEWEB)

    Carreno, Alexander; Vega, Andres, E-mail: ichavez@uc.cl [Departamento de Ciencias Quimicas, Facultad de Ciencias Exactas, Universidad Andres Bello, Santiago (Chile); Zarate, Ximena; Schott, Eduardo [Lab. Bionanotecnologia, Departamento de Ciencias Quimico-Biologicas, Universidad Bernardo O' Higgins, Santiago (Chile); Gacitua, Manuel; Valenzuela, Ninnette; Manriquez, Juan M.; Chavez, Ivonne [Departamento de Quimica Inorganica, Facultad de Quimica, Pontificia Universidad Catolica de Chile, Santiago (Chile); Preite, Marcelo [Departamento de Quimica Inorganica, Facultad de Quimica, Pontificia Universidad Catolica de Chile, Santiago (Chile)

    2014-07-01

    (E)-2-{[(2-Aminopyridine-3-yl)imino]-methyl}-4,6-di-tert-butyl-phenol (3), a ligand containing an intramolecular hydrogen bond, was prepared according to a previous literature report, with modifications, and was characterized by UV-vis, FTIR, {sup 1}H-NMR, {sup 13}C-NMR, HHCOSY, TOCSY and cyclic voltammetry. Computational analyses at the level of DFT and TD-DFT were performed to study its electronic and molecular structures. The results of these analyses elucidated the behaviors of the UV-vis and electrochemical data. Analysis of the transitions in the computed spectrum showed that the most important band is primarily composed of a HOMO→LUMO transition, designated as an intraligand (IL) charge transfer. (author)

  7. FORMING SCHOOLCHILD’S PERSONALITY IN COMPUTER STUDY LESSONS AT PRIMARY SCHOOL

    Directory of Open Access Journals (Sweden)

    Natalia Salan

    2017-04-01

    Full Text Available The influence of the computer on the formation of primary schoolchildren’s personality and its introduction into learning activity are considered in the article. Based on the materials of state standards and the Law of Ukraine on Higher Education, the concepts “computer” and “information culture” are defined, and the modern understanding of the concept “basics of computer literacy” is identified. The main task of the school propaedeutic course in Computer Studies is defined. Interactive methods of activity are singled out: didactic games, designing, research, collaboration in pairs, group interaction, etc. The essential characteristics of didactic game technologies are distinguished, and the peculiarities of their use in Computer Study lessons at primary school are analyzed. Positive and negative aspects of using these technologies in Computer Study lessons are defined. The expediency of using game technologies while organizing students’ educational and cognitive activity in Computer Studies is substantiated. The idea of creating a school course “Computer Studies at primary school” is prompted by the wide introduction of computer technology into the educational system. Today’s schoolchild has to be able to use a computer as freely and easily as a pen, a pencil or a ruler. That is why it is advisable to start studying the basics of Computer Studies at primary school age. This course is intended for pupils of the 2nd-4th forms. Firstly, it provides mastering practical skills of computer work and, secondly, it supports the development of children’s logical and algorithmic thinking. In these lessons students acquire practical skills for working with information on the computer. Having mastered computer skills at primary school, children will be able to use them successfully in their work. In senior classes they will be able to apply the acquired knowledge of the methods of work with information and ways of problem solving

  8. The relationship between emotional intelligence, previous caring experience and mindfulness in student nurses and midwives: a cross sectional analysis.

    Science.gov (United States)

    Snowden, Austyn; Stenhouse, Rosie; Young, Jenny; Carver, Hannah; Carver, Fiona; Brown, Norrie

    2015-01-01

    Emotional Intelligence (EI), previous caring experience and mindfulness training may have a positive impact on nurse education. More evidence is needed to support the use of these variables in nurse recruitment and retention. To explore the relationship between EI, gender, age, programme of study, previous caring experience and mindfulness training. Cross sectional element of longitudinal study. 938 year-one nursing, midwifery and computing students at two Scottish Higher Education Institutes (HEIs) who entered their programme in September 2013. Participants completed a measure of 'trait' EI: Trait Emotional Intelligence Questionnaire Short Form (TEIQue-SF); and 'ability' EI: Schutte's et al. (1998) Emotional Intelligence Scale (SEIS). Demographics, previous caring experience and previous training in mindfulness were recorded. Relationships between variables were tested using non-parametric tests. Emotional intelligence increased with age on both measures of EI [TEIQ-SF H(5)=15.157 p=0.001; SEIS H(5)=11.388, p=0.044]. Females (n=786) scored higher than males (n=149) on both measures [TEIQ-SF, U=44,931, z=-4.509, p<0.001]. ... emotional intelligence. Mindfulness training was associated with higher 'ability' emotional intelligence. Implications for recruitment, retention and further research are explored. Copyright © 2014. Published by Elsevier Ltd.
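
    The group comparisons above rely on non-parametric rank tests (Kruskal-Wallis H, Mann-Whitney U). As a concrete illustration of the U statistic reported for the female/male comparison, here is a minimal pure-Python computation on small synthetic scores (not the study's data), using midranks for ties and omitting the normal approximation and p-value:

```python
# Sketch of the Mann-Whitney U statistic used for two-group comparisons
# (e.g. female vs. male EI scores). Synthetic data, not the study's data.

def midranks(values):
    """Return 1-based ranks aligned with `values`, using midranks for ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of equal values starting at position i.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # midrank of sorted positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(x, y):
    """U statistic of sample x relative to sample y: U1 = R1 - n1(n1+1)/2."""
    ranks = midranks(list(x) + list(y))
    r1 = sum(ranks[: len(x)])  # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2

u = mann_whitney_u([3, 4, 2, 5], [1, 2, 6])  # = 7.5
```

    Equivalently, U counts the pairs (x_i, y_j) with x_i > y_j, plus half a count for each tie, which is easy to verify by hand on data this small.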

  9. Dynamic time-dependent analysis and static three-dimensional imaging procedures for computer-assisted CNS studies

    International Nuclear Information System (INIS)

    Budinger, T.F.; DeLand, F.H.; Duggan, H.E.; Bouz, J.J.; Hoop, B. Jr.; McLaughlin, W.T.; Weber, P.M.

    1975-01-01

    Two-dimensional computer image-processing techniques have not proved to be of importance in diagnostic nuclear medicine primarily because the radionuclide distribution represents a three-dimensional problem. More recent developments in three-dimensional reconstruction from multiple views or multiple detectors promise to overcome the major limitations in previous work with digital computers. These techniques are now in clinical use for static imaging; however, speed limitations have prevented application to dynamic imaging. The future development of these methods will require innovations in patient positioning and multiple-view devices for either single-gamma or positron annihilation detection

  10. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Science.gov (United States)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  11. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  12. Computer and machine vision theory, algorithms, practicalities

    CERN Document Server

    Davies, E R

    2012-01-01

    Computer and Machine Vision: Theory, Algorithms, Practicalities (previously entitled Machine Vision) clearly and systematically presents the basic methodology of computer and machine vision, covering the essential elements of the theory while emphasizing algorithmic and practical design constraints. This fully revised fourth edition has brought in more of the concepts and applications of computer vision, making it a very comprehensive and up-to-date tutorial text suitable for graduate students, researchers and R&D engineers working in this vibrant subject. Key features include: Practical examples and case studies give the 'ins and outs' of developing real-world vision systems, giving engineers the realities of implementing the principles in practice New chapters containing case studies on surveillance and driver assistance systems give practical methods on these cutting-edge applications in computer vision Necessary mathematics and essential theory are made approachable by careful explanations and well-il...

  13. Editorial for special section of grid computing journal on “Cloud Computing and Services Science”

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Ivanov, Ivan I.

    This editorial briefly discusses characteristics, technology developments and challenges of cloud computing. It then introduces the papers included in the special issue on "Cloud Computing and Services Science" and positions the work reported in these papers with respect to the previously mentioned

  14. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    Science.gov (United States)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    A quantitative cost for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The ground rules and assumptions are analysed, and the options, along with their cost considerations, are discussed. It is concluded that the Spacelab program cost for software development and maintenance is independent of experiment hardware and software options, that a distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  15. Estimating the effect of current, previous and never use of drugs in studies based on prescription registries

    DEFF Research Database (Denmark)

    Nielsen, Lars Hougaard; Løkkegaard, Ellen; Andreasen, Anne Helms

    2009-01-01

    PURPOSE: Many studies which investigate the effect of drugs categorize the exposure variable into never, current, and previous use of the study drug. When prescription registries are used to make this categorization, the exposure variable possibly gets misclassified, since the registries do not carry any information on the time of discontinuation of treatment. In this study, we investigated the amount of misclassification of exposure (never, current, previous use) to hormone therapy (HT) when the exposure variable was based on prescription data. Furthermore, we evaluated the significance of this misclassification for analysing the risk of breast cancer. MATERIALS AND METHODS: Prescription data were obtained from the Danish Registry of Medicinal Products Statistics and we applied various methods to approximate treatment episodes. We analysed the duration of HT episodes to study the ability to identify...
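
    The misclassification problem described above arises because a registry records redemption dates but not discontinuation, so "current" use must be inferred by assuming each prescription covers a fixed window. A minimal sketch of that classification step (the 90-day coverage window is an assumed modelling choice for illustration, not a value from the paper):

```python
# Sketch of never/current/previous exposure classification from prescription
# redemption dates. The registry does not record discontinuation, so each
# redemption is ASSUMED to cover a fixed number of days -- a modelling choice
# (illustrative here), and the source of the misclassification under study.
from datetime import date, timedelta

ASSUMED_COVERAGE = timedelta(days=90)  # assumed days covered per redemption

def classify_exposure(redemption_dates, index_date):
    """Classify exposure at index_date as 'never', 'current' or 'previous'."""
    past = [d for d in redemption_dates if d <= index_date]
    if not past:
        return "never"
    if max(past) + ASSUMED_COVERAGE >= index_date:
        return "current"
    return "previous"

status = classify_exposure([date(2005, 1, 10), date(2005, 4, 5)],
                           date(2005, 6, 1))  # -> "current"
```

    A person who actually stopped treatment the day after the last redemption would still be classified "current" for the whole assumed window, which is exactly the kind of error whose impact on risk estimates the paper evaluates.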

  16. Studi Perbandingan Layanan Cloud Computing

    OpenAIRE

    Afdhal, Afdhal

    2013-01-01

    In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and the configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing cost, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud d...

  17. Role of heterozygous APC mutation in niche succession and initiation of colorectal cancer--a computational study.

    Directory of Open Access Journals (Sweden)

    Roschen Sasikumar

    Full Text Available Mutations in the adenomatous polyposis coli (APC) gene are found in most colorectal cancers. They cause constitutive activation of proliferative pathways when both alleles of the gene are mutated. However, studies on individuals with familial adenomatous polyposis (FAP) have shown that a single mutated APC allele can also create changes in the precancerous colon crypt, like an increased number of stem cells, increased crypt fission, greater variability of DNA methylation patterns, and higher somatic mutation rates. In this paper, using a computational model of colon crypt dynamics, we evolve and investigate a hypothesis on the effect of heterozygous APC mutation that explains these different observations. Based on previous reports and the results from the computational model, we propose the hypothesis that heterozygous APC mutation has the effect of increasing the chances for a stem cell to divide symmetrically, producing two stem cell daughters. We incorporate this hypothesis into the model and perform simulation experiments to investigate its consequences. Simulations show that this hypothesis links together the changes in FAP crypts observed in previous studies. The simulations also show that an APC(+/-) stem cell gets selective advantages for dominating the crypt and progressing to cancer. This explains why most colon cancers are initiated by APC mutation. The results could have implications for preventing or retarding the onset of colon cancer in people with inherited or acquired mutation of one APC allele. Experimental validation of the hypothesis as well as investigation into the molecular mechanisms of this effect may therefore be worth undertaking.
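
    The claim that a mutant stem cell with a division advantage tends to dominate the niche can be illustrated with the textbook Moran-process fixation probability; this is a standard population-genetics formula used here for intuition, not the paper's crypt model:

```python
# Illustration of niche succession: probability that a single mutant stem
# cell with relative fitness r takes over a niche of n stem cells, under the
# standard Moran birth-death process. Textbook formula, not the paper's model.

def fixation_probability(r, n):
    """P(single mutant with fitness r fixes in a population of size n)."""
    if r == 1.0:                       # neutral mutant: classic 1/n result
        return 1.0 / n
    return (1 - 1 / r) / (1 - r ** (-n))

neutral = fixation_probability(1.0, 16)     # = 1/16
advantaged = fixation_probability(1.2, 16)  # noticeably larger than 1/16
```

    Even a modest fitness advantage (r = 1.2) raises the takeover probability severalfold over the neutral 1/n baseline, which is the qualitative point behind an APC(+/-) stem cell dominating the crypt.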

  18. Does computer use affect the incidence of distal arm pain? A one-year prospective study using objective measures of computer use

    DEFF Research Database (Denmark)

    Mikkelsen, S.; Lassen, C. F.; Vilstrup, Imogen

    2012-01-01

    PURPOSE: To study how objectively recorded mouse and keyboard activity affects distal arm pain among computer workers. METHODS: Computer activities were recorded among 2,146 computer workers. For 52 weeks mouse and keyboard time, sustained activity, speed and micropauses were recorded with a software program installed on the participants' computers. Participants reported weekly pain scores via the software program for elbow, forearm and wrist/hand as well as in a questionnaire at baseline and 1-year follow up. Associations between pain development and computer work were examined for three pain ... were not risk factors for acute pain, nor did they modify the effects of mouse or keyboard time. Computer usage parameters were not associated with prolonged or chronic pain. A major limitation of the study was low keyboard times. CONCLUSION: Computer work was not related to the development...

  19. Comparative study on the performance of Pod type waterjet by experiment and computation

    Directory of Open Access Journals (Sweden)

    Moon-Chan Kim

    2010-03-01

    Full Text Available A comparative study between computation and experiment has been conducted to predict the performance of a Pod type waterjet for an amphibious wheeled vehicle. The Pod type waterjet has been chosen on the basis of the required specific speed of more than 2500. As the Pod type waterjet is an extreme type of axial-flow waterjet, theoretical as well as experimental works on Pod type waterjets are very rare. The main purpose of the present study is to validate the developed in-house CFD code, based on the RANS equations, against the experimental results for the Pod type waterjet. The developed code has been validated by comparison with the experimental results of the well-known turbine problem. The validation was also extended to the flush type waterjet, where the pressures along the duct surface and the velocities at the nozzle area were compared with experimental results. The Pod type waterjet has been designed, and the performance of the designed waterjet system including duct, impeller and stator was analyzed by the previously mentioned in-house CFD code. The pressure distributions and limiting streamlines on the blade surfaces were computed to confirm the performance of the designed waterjets. In addition, the torque and momentum were computed to find the overall efficiency, and these were compared with the model test results. Measurements were taken of the flow rate at the nozzle exit, the static pressure at various sections along the duct and the nozzle, the revolution rate of the impeller, and the torque, thrust and towing forces at various advance speeds, for the prediction of performance as well as for comparison with the computations. Based on these measurements, the performance was analyzed according to the ITTC96 standard analysis method. The full-scale effective and delivered power of the wheeled vehicle were estimated for the prediction of the service speed. This paper emphasizes the confirmation of the ITTC96 analysis method and
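
    The quantities measured above (flow rate, jet speed, thrust) are linked by elementary momentum theory, which is often used as a sanity check alongside CFD and model tests. A back-of-envelope sketch with illustrative numbers (not values from the paper):

```python
# Back-of-envelope waterjet performance from one-dimensional momentum theory.
# Numbers are illustrative only, not the paper's measurements.

RHO = 1025.0  # seawater density, kg/m^3

def waterjet_thrust(flow_rate, jet_speed, inflow_speed, rho=RHO):
    """Thrust from the momentum-flux change: T = rho * Q * (Vj - V) [N]."""
    return rho * flow_rate * (jet_speed - inflow_speed)

def ideal_jet_efficiency(jet_speed, inflow_speed):
    """Ideal (momentum) jet efficiency: eta = 2*V / (Vj + V)."""
    return 2 * inflow_speed / (jet_speed + inflow_speed)

thrust = waterjet_thrust(flow_rate=0.1, jet_speed=15.0, inflow_speed=5.0)
eta = ideal_jet_efficiency(15.0, 5.0)  # = 0.5
```

    The efficiency expression follows from dividing the useful power T*V by the rate of kinetic-energy addition 0.5*rho*Q*(Vj^2 - V^2); real waterjets fall below this ideal value because of duct, pump and nozzle losses.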

  20. Computational study of a High Pressure Turbine Nozzle/Blade Interaction

    Science.gov (United States)

    Kopriva, James; Laskowski, Gregory; Sheikhi, Reza

    2015-11-01

    A downstream high pressure turbine blade has been designed for this study to be coupled with the upstream uncooled nozzle of Arts and Rouvroit [1992]. The computational domain is first held to a pitch-line section that includes no centrifugal forces (linear sliding-mesh). The stage geometry is intended to study the fundamental nozzle/blade interaction in a computationally cost-efficient manner. A blade/nozzle count of 2:1 is designed to maintain computationally periodic boundary conditions for the coupled problem. Next the geometry is extended to a fully 3D domain with endwalls to understand the impact of secondary flow structures. A set of systematic computational studies is presented to understand the impact of turbulence on the nozzle and downstream blade boundary layer development, the resulting heat transfer, and the downstream wake mixing in the absence of cooling. Doing so will provide a much better understanding of stage mixing losses and wall heat transfer which, in turn, can allow for improved engine performance. Computational studies are performed using the WALE (Wall-Adapting Local Eddy-viscosity), IDDES (Improved Delayed Detached Eddy Simulation) and SST (Shear Stress Transport) models in Fluent.

  1. A new computer-based counselling system for the promotion of physical activity in patients with chronic diseases--results from a pilot study.

    Science.gov (United States)

    Becker, Annette; Herzberg, Dominikus; Marsden, Nicola; Thomanek, Sabine; Jung, Hartmut; Leonhardt, Corinna

    2011-05-01

    To develop a computer-based counselling system (CBCS) for the improvement of attitudes towards physical activity in chronically ill patients and to pilot its efficacy and acceptance in primary care. The system is tailored to patients' disease and motivational stage. During a pilot study in five German general practices, patients answered questions before, directly and 6 weeks after using the CBCS. Outcome criteria were attitudes and self-efficacy. Qualitative interviews were performed to identify acceptance indicators. Seventy-nine patients participated (mean age: 64.5 years, 53% males; 38% without previous computer experience). Patients' affective and cognitive attitudes changed significantly, self-efficacy showed only minor changes. Patients mentioned no difficulties in interacting with the CBCS. However, perception of the system's usefulness was inconsistent. Computer-based counselling for physical activity related attitudes in patients with chronic diseases is feasible, but the circumstances of use with respect to the target group and its integration into the management process have to be clarified in future studies. This study adds to the understanding of computer-based counselling in primary health care. Acceptance indicators identified in this study will be validated as part of a questionnaire on technology acceptability in a subsequent study. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Computational fluid mechanics

    Science.gov (United States)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields are calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory has been extended to high-speed flow by incorporating the effects of second mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.

  3. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.; Yan, Lie

    2014-01-01

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.

  4. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.

    2014-08-29

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.
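
    To make the object of these two records concrete: in the motorcycle-graph problem, motorcycle i starts at point p_i with velocity v_i, leaves a track behind it, and crashes when it reaches a point of another motorcycle's track that was laid down no later than i's arrival. The brute-force event loop below is a sketch for intuition only; the paper's algorithm is far more efficient:

```python
# Brute-force simulation of the motorcycle-graph problem: repeatedly find the
# earliest valid crash (motorcycle i reaching a point of j's track that was
# laid down no later than i's arrival and before j's own crash) and kill i.
# Quadratic work per event; for intuition only, not the paper's algorithm.
import math

def crossing_times(pi, vi, pj, vj):
    """Solve pi + vi*ti == pj + vj*tj; return (ti, tj) or None if parallel."""
    det = -vi[0] * vj[1] + vi[1] * vj[0]
    if abs(det) < 1e-12:
        return None
    rx, ry = pj[0] - pi[0], pj[1] - pi[1]
    ti = (-rx * vj[1] + vj[0] * ry) / det
    tj = (vi[0] * ry - vi[1] * rx) / det
    return ti, tj

def motorcycle_graph(starts, velocities):
    """Return each motorcycle's crash time (math.inf if it escapes)."""
    n = len(starts)
    death = [math.inf] * n
    while True:
        best = None  # (crash time, crashing motorcycle)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                ct = crossing_times(starts[i], velocities[i],
                                    starts[j], velocities[j])
                if ct is None:
                    continue
                ti, tj = ct
                # Valid crash: i still alive at ti, and j's track point was
                # laid (tj <= ti) before j's own death.
                if 1e-12 < ti < death[i] and 1e-12 < tj <= min(ti, death[j]):
                    if best is None or ti < best[0]:
                        best = (ti, i)
        if best is None:
            return death
        death[best[1]] = best[0]

# Motorcycle 0 heads east from the origin; motorcycle 1 heads north from
# (2, -1), crossing (2, 0) at t=1 before 0's track exists there. Motorcycle 0
# then hits 1's track at (2, 0) at t=2.
deaths = motorcycle_graph([(0.0, 0.0), (2.0, -1.0)],
                          [(1.0, 0.0), (0.0, 1.0)])
```

    Picking the globally earliest valid event each round is safe because no later crash can invalidate it, which is why this naive loop terminates with the correct graph.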

  5. Does computer use affect the incidence of distal arm pain? A one-year prospective study using objective measures of computer use

    DEFF Research Database (Denmark)

    Mikkelsen, Sigurd; Lassen, Christina Funch; Vilstrup, Imogen

    2012-01-01

    PURPOSE: To study how objectively recorded mouse and keyboard activity affects distal arm pain among computer workers. METHODS: Computer activities were recorded among 2,146 computer workers. For 52 weeks mouse and keyboard time, sustained activity, speed and micropauses were recorded with a soft......PURPOSE: To study how objectively recorded mouse and keyboard activity affects distal arm pain among computer workers. METHODS: Computer activities were recorded among 2,146 computer workers. For 52 weeks mouse and keyboard time, sustained activity, speed and micropauses were recorded...... with a software program installed on the participants' computers. Participants reported weekly pain scores via the software program for elbow, forearm and wrist/hand as well as in a questionnaire at baseline and 1-year follow up. Associations between pain development and computer work were examined for three pain...... were not risk factors for acute pain, nor did they modify the effects of mouse or keyboard time. Computer usage parameters were not associated with prolonged or chronic pain. A major limitation of the study was low keyboard times. CONCLUSION: Computer work was not related to the development...

  6. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study

    Science.gov (United States)

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien

    2017-01-01

    Background Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. Objective The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. Methods We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Results Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician’s ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. Conclusions AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. PMID:28951384
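
    The reported algorithm performance reduces to two ratios over a confusion matrix. The counts below are an inference from the abstract's percentages (21/22 true positives and 879/900 true negatives reproduce the reported 95.5% sensitivity and 97.7% specificity; the exact confusion matrix is not given):

```python
# Recomputing the screening metrics from counts consistent with the abstract.
# NOTE: tp=21, fn=1, tn=879, fp=21 are inferred from the reported
# percentages, not stated explicitly in the abstract.

def sensitivity(tp, fn):
    """TP / (TP + FN): fraction of AF cases flagged by the algorithm."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """TN / (TN + FP): fraction of non-AF recordings correctly passed."""
    return tn / (tn + fp)

sens = sensitivity(tp=21, fn=1)    # ~0.955
spec = specificity(tn=879, fp=21)  # ~0.977
```

    With a 2.4% AF prevalence, even 97.7% specificity implies roughly as many false positives as true positives, which is why every algorithm-flagged ECG was also reviewed by a cardiologist.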

  7. Studies on the zeros of Bessel functions and methods for their computation: 3. Some new works on monotonicity, convexity, and other properties

    Science.gov (United States)

    Kerimov, M. K.

    2016-12-01

    This paper continues the study of real zeros of Bessel functions begun in the previous parts of this work (see M. K. Kerimov, Comput. Math. Math. Phys. 54 (9), 1337-1388 (2014); 56 (7), 1175-1208 (2016)). Some new results regarding the monotonicity, convexity, concavity, and other properties of zeros are described. Additionally, the zeros of q-Bessel functions are investigated.
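
As a concrete illustration of the zero sequences such surveys study, the first few positive zeros of J0 can be computed with nothing beyond the standard library, using the integral representation of J0 plus bisection (a sketch, not the paper's method). The gaps between consecutive zeros grow monotonically toward pi, one of the classical monotonicity properties:

```python
import math

def j0(x, n=600):
    """Bessel function J0 via its integral representation,
    J0(x) = (1/pi) * integral_0^pi cos(x*sin(t)) dt  (trapezoidal rule;
    the integrand is smooth and pi-periodic, so this converges rapidly)."""
    h = math.pi / n
    s = 0.5 * (math.cos(x * math.sin(0.0)) + math.cos(x * math.sin(math.pi)))
    s += sum(math.cos(x * math.sin(k * h)) for k in range(1, n))
    return s * h / math.pi

def bisect_zero(f, a, b, tol=1e-9):
    """Refine a bracketed sign change of f down to an interval of width tol."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return 0.5 * (a + b)

# Scan for sign changes of J0 and refine the first five positive zeros.
zeros, x, step = [], 0.5, 0.1
while len(zeros) < 5:
    if j0(x) * j0(x + step) < 0:
        zeros.append(bisect_zero(j0, x, x + step))
    x += step

gaps = [b - a for a, b in zip(zeros, zeros[1:])]
print([round(z, 4) for z in zeros])  # first zero ≈ 2.4048
print([round(g, 4) for g in gaps])   # gaps increase monotonically toward pi
```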

  8. Using Computer Simulations in Chemistry Problem Solving

    Science.gov (United States)

    Avramiotis, Spyridon; Tsaparlis, Georgios

    2013-01-01

    This study is concerned with the effects of computer simulations of two novel chemistry problems on the problem solving ability of students. A control-experimental group, equalized by pair groups (n[subscript Exp] = n[subscript Ctrl] = 78), research design was used. The students had no previous experience of chemical practical work. Student…

  9. Symbolic-computation study of the perturbed nonlinear Schrodinger model in inhomogeneous optical fibers

    International Nuclear Information System (INIS)

    Tian Bo; Gao Yitian

    2005-01-01

    A realistic, inhomogeneous fiber in optical communication systems can be described by the perturbed nonlinear Schrodinger model (also known as the normalized nonlinear Schrodinger model with periodically varying coefficients, the dispersion-managed nonlinear Schrodinger model, or the nonlinear Schrodinger model with variable coefficients). Hereby, we extend to this model a direct method, perform symbolic computation and obtain two families of exact, analytic bright-solitonic solutions, with or without the chirp, respectively. The parameters addressed include the shape of the bright soliton, soliton amplitude, inverse width of the soliton, chirp, frequency, center of the soliton and center of the phase of the soliton. Of optical and physical interest, we discuss some previously published special cases of our solutions. These solutions could help future studies on optical communication systems.

  10. NASA Computational Case Study: The Flight of Friendship 7

    Science.gov (United States)

    Simpson, David G.

    2012-01-01

    In this case study, we learn how to compute the position of an Earth-orbiting spacecraft as a function of time. As an exercise, we compute the position of John Glenn's Mercury spacecraft Friendship 7 as it orbited the Earth during the third flight of NASA's Mercury program.
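
The core computation in this case study (spacecraft position as a function of time) reduces, for an idealized circular orbit, to Kepler's third law plus a rotating position vector. A minimal sketch; the 160 km altitude is an illustrative assumption, not the case study's actual orbital elements:

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter [m^3 s^-2]
R_EARTH = 6.371e6           # mean Earth radius [m]

def circular_orbit_position(altitude_m, t_s):
    """Return (x, y, period): in-plane position [m] at time t_s [s] and the
    orbital period [s] for an idealized circular orbit."""
    r = R_EARTH + altitude_m
    period = 2.0 * math.pi * math.sqrt(r**3 / MU_EARTH)  # Kepler's third law
    theta = 2.0 * math.pi * t_s / period                 # angle swept since t = 0
    return r * math.cos(theta), r * math.sin(theta), period

# Illustrative: ~160 km altitude, roughly a Mercury-era low Earth orbit.
x, y, period = circular_orbit_position(160e3, 600.0)
print(f"period ≈ {period / 60:.1f} min")
```

At this altitude the computed period comes out near the ~88 minutes historically quoted for Mercury-program orbits, which is a useful sanity check on the formula.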

  11. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Now we can investigate quantum phenomena for the real materials and molecules, and can design functional materials by computation, due to the previous developments of quantum theory and calculation methods. As there still exist the limit and problem in theory, the cooperation between theory and computation is getting more important to clarify the unknown quantum mechanism, and discover more efficient functional materials. It would be next-generation standard. Finally, our theoretical methodology for boundary solid is introduced.

  12. Recent computational chemistry

    International Nuclear Information System (INIS)

    Onishi, Taku

    2015-01-01

    Now we can investigate quantum phenomena for the real materials and molecules, and can design functional materials by computation, due to the previous developments of quantum theory and calculation methods. As there still exist the limit and problem in theory, the cooperation between theory and computation is getting more important to clarify the unknown quantum mechanism, and discover more efficient functional materials. It would be next-generation standard. Finally, our theoretical methodology for boundary solid is introduced

  13. Using NCLab-karel to improve computational thinking skill of junior high school students

    Science.gov (United States)

    Kusnendar, J.; Prabawa, H. W.

    2018-05-01

    The increasing human interaction with technology and the increasingly complex development of the digital technology world make computer science education an interesting theme to study. Previous studies on Computer Literacy and Competency reveal that Indonesian teachers in general have fairly high computational skill, but their use of that skill is limited to a few applications. This engenders limited and minimal computer-related learning for the students. On the other hand, computer science education is often considered unrelated to real-world solutions. This paper addresses the utilization of NCLab-Karel in shaping computational thinking in students, which is believed to help students learn about technology. Implementation of Karel provides evidence that it is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computational mindset in students.

  14. Visibility of Different Intraorbital Foreign Bodies Using Plain Radiography, Computed Tomography, Magnetic Resonance Imaging, and Cone-Beam Computed Tomography: An In Vitro Study.

    Science.gov (United States)

    Javadrashid, Reza; Golamian, Masoud; Shahrzad, Maryam; Hajalioghli, Parisa; Shahmorady, Zahra; Fouladi, Daniel F; Sadrarhami, Shohreh; Akhoundzadeh, Leila

    2017-05-01

    The study sought to compare the usefulness of 4 imaging modalities in visualizing various intraorbital foreign bodies (IOFBs) of different sizes. Six different materials including metal, wood, plastic, stone, glass, and graphite were cut in cylindrical shapes in 4 sizes (dimensions: 0.5, 1, 2, and 3 mm) and placed intraorbitally in the extraocular space of a fresh sheep's head. Four skilled radiologists rated the visibility of the objects individually using plain radiography, spiral computed tomography (CT), magnetic resonance imaging (MRI), and cone-beam computed tomography (CBCT) in accordance with a previously described grading system. Excluding wood, all embedded foreign bodies were best visualized in CT and CBCT images with almost equal accuracies. Wood could only be detected using MRI, and then only when fragments were more than 2 mm in size. There were 3 false-positive MRI reports, suggesting air bubbles as wood IOFBs. Because of the lower cost and lower radiation dose in comparison with conventional CT, CBCT can be used as the initial imaging technique in cases with suspected IOFBs. The optimal imaging technique for wood IOFBs is yet to be defined. Copyright © 2016 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.

  15. Implant breast reconstruction after salvage mastectomy in previously irradiated patients.

    Science.gov (United States)

    Persichetti, Paolo; Cagli, Barbara; Simone, Pierfranco; Cogliandro, Annalisa; Fortunato, Lucio; Altomare, Vittorio; Trodella, Lucio

    2009-04-01

    The most common surgical approach in case of local tumor recurrence after quadrantectomy and radiotherapy is salvage mastectomy. Breast reconstruction is the subsequent phase of the treatment and the plastic surgeon has to operate on previously irradiated and manipulated tissues. The medical literature highlights that breast reconstruction with tissue expanders is not a pursuable option, considering previous radiotherapy a contraindication. The purpose of this retrospective study is to evaluate the influence of previous radiotherapy on 2-stage breast reconstruction (tissue expander/implant). Only patients with analogous timing of radiation therapy and the same demolitive and reconstructive procedures were recruited. The results of this study prove that, after salvage mastectomy in previously irradiated patients, implant reconstruction is still possible. Further comparative studies are, of course, advisable to draw any conclusion on the possibility to perform implant reconstruction in previously irradiated patients.

  16. Computer mapping as an aid in air-pollution studies: Montreal region study

    Energy Technology Data Exchange (ETDEWEB)

    Granger, J M

    1972-01-01

    Through the use of computer-mapping programs, an operational technique has been designed which allows an almost-instant appraisal of the intensity of atmospheric pollution in an urban region on the basis of epiphytic sensitivity. The epiphytes considered are essentially lichens and mosses growing on trees. This study was applied to the Montreal region, with 349 sampling stations distributed nearly uniformly. Computer graphics of the findings are included in the appendix.

  17. Acquiring skills in malignant hyperthermia crisis management: comparison of high-fidelity simulation versus computer-based case study

    Directory of Open Access Journals (Sweden)

    Vilma Mejía

    Full Text Available Abstract Introduction: The primary purpose of this study was to compare the effect of high-fidelity simulation versus computer-based case-solving self-study on skills acquisition about malignant hyperthermia in first-year anesthesiology residents. Methods: After institutional ethical committee approval, 31 first-year anesthesiology residents were enrolled in this prospective randomized single-blinded study. Participants were randomized to either a High Fidelity Simulation Scenario or a computer-based Case Study about malignant hyperthermia. After the intervention, all subjects' performance was assessed through a high-fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was done to assess self-perception of reasoning process and decision-making. Results: 28 first-year residents successfully finished the study. Residents' management skill scores were globally higher in High Fidelity Simulation versus Case Study; however, they were significant in 4 of the 8 performance rubric elements: recognize signs and symptoms (p = 0.025), prioritization of initial actions of management (p = 0.003), recognize complications (p = 0.025) and communication (p = 0.025). Average scores from pre- and post-test knowledge questionnaires improved from 74% to 85% in the High Fidelity Simulation group, and decreased from 78% to 75% in the Case Study group (p = 0.032). Regarding the qualitative analysis, there was no difference in factors influencing the students' process of reasoning and decision-making with both teaching strategies. Conclusion: Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level in anesthesia residents.

  18. Application of Cloud Computing Technology in a University. Case Study: Faculty of Information Technology, UKDW

    OpenAIRE

    Kurniawan, Erick

    2015-01-01

    Cloud computing technology is a new paradigm in the delivery of computing services. Cloud computing has many advantages over conventional systems. This article discusses cloud computing architecture in general, along with several examples of cloud computing services and their benefits in a university environment. The case study examined is the deployment of cloud computing services at the Faculty of Information Technology, UKDW.

  19. Longitudinal patterns of problematic computer game use among adolescents and adults--a 2-year panel study.

    Science.gov (United States)

    Scharkow, Michael; Festl, Ruth; Quandt, Thorsten

    2014-11-01

    To investigate the longitudinal patterns (stability and change) of problematic computer game use and its interdependencies with psychosocial wellbeing in different age groups. Three-wave, annual panel study using computer-assisted telephone surveys. Germany. A total of 112 adolescents aged between 14 and 18 years, 363 younger adults between 19-39 years and 427 adults aged 40 years and older (overall n = 902). Problematic game use was measured with the Gaming Addiction Short Scale (GAS), which covers seven criteria including salience, withdrawal and conflict. Additionally, gaming behaviour and psychosocial wellbeing (social capital and support, life satisfaction and success) were measured in all three panel waves. The generally low GAS scores were very stable in yearly intervals [average autocorrelation across waves and age groups: r = 0.74, confidence interval (CI) = 0.71, 0.77]. Only nine respondents (1%, CI = 0.5, 1.9) consistently exhibited symptoms of problematic game use across all waves, while no respondent could be classified consistently as being addicted according to the GAS criteria. Changes in problematic gaming were not related consistently to changes in psychosocial wellbeing, although some cross-lagged effects were statistically significant in younger and older adult groups. Within a 2-year time-frame, problematic use of computer games appears to be a less stable behaviour than reported previously and not related systematically to negative changes in the gamers' lives. © 2014 Society for the Study of Addiction.

  20. Revisiting dibenzothiophene thermochemical data: Experimental and computational studies

    International Nuclear Information System (INIS)

    Freitas, Vera L.S.; Gomes, Jose R.B.; Ribeiro da Silva, Maria D.M.C.

    2009-01-01

    Thermochemical data for dibenzothiophene were studied in the present work by experimental techniques and computational calculations. The standard (p° = 0.1 MPa) molar enthalpy of formation, at T = 298.15 K, in the gaseous phase, was determined from the enthalpies of combustion and sublimation, obtained by rotating-bomb calorimetry in oxygen and by Calvet microcalorimetry, respectively. This value was compared with estimates from G3(MP2)//B3LYP computations and also with the other results available in the literature.
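
The thermodynamic cycle behind this determination is simple Hess's-law arithmetic: the gas-phase enthalpy of formation is the crystalline-phase value (from the combustion experiment) plus the enthalpy of sublimation (from Calvet microcalorimetry). A sketch with placeholder numbers; the paper's measured values for dibenzothiophene are not reproduced here:

```python
def gas_phase_formation_enthalpy(dfH_crystal_kJ, dsubH_kJ):
    """Hess's law: Delta_f H(g, 298.15 K) = Delta_f H(cr) + Delta_sub H,
    all quantities in kJ/mol."""
    return dfH_crystal_kJ + dsubH_kJ

# Placeholder inputs in kJ/mol, for illustration only.
print(gas_phase_formation_enthalpy(120.0, 85.0))  # → 205.0
```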

  1. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  2. An Experimental Study into the use of computers for teaching of ...

    African Journals Online (AJOL)

    This study was an experimental study which sought to establish how English language teachers used computers for teaching composition writing at Prince Edward High School in Harare. The findings of the study show that computers were rarely used in the teaching of composition despite the observation that the school ...

  3. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study.

    Science.gov (United States)

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien; Hwang, Juey-Jen; Ho, Yi-Lwun

    2017-09-26

    Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician's ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. ©Ying-Hsien Chen, Chi-Sheng Hung, Ching-Chang Huang, Yu-Chien Hung, Juey-Jen Hwang, Yi-Lwun Ho. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 26.09.2017.

  4. Case-control study for colorectal cancer genetic susceptibility in EPICOLON: previously identified variants and mucins

    Directory of Open Access Journals (Sweden)

    Moreno Victor

    2011-08-01

    Full Text Available Abstract Background Colorectal cancer (CRC) is the second leading cause of cancer death in developed countries. Familial aggregation in CRC is also important outside syndromic forms and, in this case, a polygenic model with several common low-penetrance alleles contributing to CRC genetic predisposition could be hypothesized. Mucins and GALNTs (N-acetylgalactosaminyltransferase) are interesting candidates for CRC genetic susceptibility and have not been previously evaluated. We present results for ten genetic variants linked to CRC risk in previous studies (previously identified category) and 18 selected variants from the mucin gene family in a case-control association study from the Spanish EPICOLON consortium. Methods CRC cases and matched controls were from EPICOLON, a prospective, multicenter, nationwide Spanish initiative, comprised of two independent stages. Stage 1 corresponded to 515 CRC cases and 515 controls, whereas stage 2 consisted of 901 CRC cases and 909 controls. Also, an independent cohort of 549 CRC cases and 599 controls outside EPICOLON was available for additional replication. Genotyping was performed for ten previously identified SNPs in ADH1C, APC, CCDN1, IL6, IL8, IRS1, MTHFR, PPARG, VDR and ARL11, and 18 selected variants in the mucin gene family. Results None of the 28 SNPs analyzed in our study was found to be associated with CRC risk. Although four SNPs were significant with a P-value < 0.05 in EPICOLON stage 1 [rs698 in ADH1C (OR = 1.63, 95% CI = 1.06-2.50, P-value = 0.02, recessive), rs1800795 in IL6 (OR = 1.62, 95% CI = 1.10-2.37, P-value = 0.01, recessive), rs3803185 in ARL11 (OR = 1.58, 95% CI = 1.17-2.15, P-value = 0.007, codominant), and rs2102302 in GALNTL2 (OR = 1.20, 95% CI = 1.00-1.44, P-value = 0.04, log-additive 0, 1, 2 alleles)], only rs3803185 achieved statistical significance in EPICOLON stage 2 (OR = 1.34, 95% CI = 1.06-1.69, P-value = 0.01, recessive). In the joint analysis for both stages, results were only significant for rs3803185 (OR = 1

  5. Case-control study for colorectal cancer genetic susceptibility in EPICOLON: previously identified variants and mucins

    International Nuclear Information System (INIS)

    Abulí, Anna; Morillas, Juan D; Rigau, Joaquim; Latorre, Mercedes; Fernández-Bañares, Fernando; Peña, Elena; Riestra, Sabino; Payá, Artemio; Jover, Rodrigo; Xicola, Rosa M; Llor, Xavier; Fernández-Rozadilla, Ceres; Carvajal-Carmona, Luis; Villanueva, Cristina M; Moreno, Victor; Piqué, Josep M; Carracedo, Angel; Castells, Antoni; Andreu, Montserrat; Ruiz-Ponte, Clara; Castellví-Bel, Sergi; Alonso-Espinaco, Virginia; Muñoz, Jenifer; Gonzalo, Victoria; Bessa, Xavier; González, Dolors; Clofent, Joan; Cubiella, Joaquin

    2011-01-01

    Colorectal cancer (CRC) is the second leading cause of cancer death in developed countries. Familial aggregation in CRC is also important outside syndromic forms and, in this case, a polygenic model with several common low-penetrance alleles contributing to CRC genetic predisposition could be hypothesized. Mucins and GALNTs (N-acetylgalactosaminyltransferase) are interesting candidates for CRC genetic susceptibility and have not been previously evaluated. We present results for ten genetic variants linked to CRC risk in previous studies (previously identified category) and 18 selected variants from the mucin gene family in a case-control association study from the Spanish EPICOLON consortium. CRC cases and matched controls were from EPICOLON, a prospective, multicenter, nationwide Spanish initiative, comprised of two independent stages. Stage 1 corresponded to 515 CRC cases and 515 controls, whereas stage 2 consisted of 901 CRC cases and 909 controls. Also, an independent cohort of 549 CRC cases and 599 controls outside EPICOLON was available for additional replication. Genotyping was performed for ten previously identified SNPs in ADH1C, APC, CCDN1, IL6, IL8, IRS1, MTHFR, PPARG, VDR and ARL11, and 18 selected variants in the mucin gene family. None of the 28 SNPs analyzed in our study was found to be associated with CRC risk. Although four SNPs were significant with a P-value < 0.05 in EPICOLON stage 1 [rs698 in ADH1C (OR = 1.63, 95% CI = 1.06-2.50, P-value = 0.02, recessive), rs1800795 in IL6 (OR = 1.62, 95% CI = 1.10-2.37, P-value = 0.01, recessive), rs3803185 in ARL11 (OR = 1.58, 95% CI = 1.17-2.15, P-value = 0.007, codominant), and rs2102302 in GALNTL2 (OR = 1.20, 95% CI = 1.00-1.44, P-value = 0.04, log-additive 0, 1, 2 alleles)], only rs3803185 achieved statistical significance in EPICOLON stage 2 (OR = 1.34, 95% CI = 1.06-1.69, P-value = 0.01, recessive). 
In the joint analysis for both stages, results were only significant for rs3803185 (OR = 1.12, 95% CI = 1

  6. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  7. Using the genome aggregation database, computational pathogenicity prediction tools, and patch clamp heterologous expression studies to demote previously published long QT syndrome type 1 mutations from pathogenic to benign.

    Science.gov (United States)

    Clemens, Daniel J; Lentino, Anne R; Kapplinger, Jamie D; Ye, Dan; Zhou, Wei; Tester, David J; Ackerman, Michael J

    2018-04-01

    Mutations in the KCNQ1-encoded Kv7.1 potassium channel cause long QT syndrome (LQTS) type 1 (LQT1). It has been suggested that ∼10%-20% of rare LQTS case-derived variants in the literature may have been published erroneously as LQT1-causative mutations and may be "false positives." The purpose of this study was to determine which previously published KCNQ1 case variants are likely false positives. A list of all published, case-derived KCNQ1 missense variants (MVs) was compiled. The occurrence of each MV within the Genome Aggregation Database (gnomAD) was assessed. Eight in silico tools were used to predict each variant's pathogenicity. Case-derived variants that were either (1) too frequently found in gnomAD or (2) absent in gnomAD but predicted to be pathogenic by ≤2 tools were considered potential false positives. Three of these variants were characterized functionally using whole-cell patch clamp technique. Overall, there were 244 KCNQ1 case-derived MVs. Of these, 29 (12%) were seen in ≥10 individuals in gnomAD and are demotable. However, 157 of 244 MVs (64%) were absent in gnomAD. Of these, 7 (4%) were predicted to be pathogenic by ≤2 tools, 3 of which we characterized functionally. There was no significant difference in current density between heterozygous KCNQ1-F127L, -P477L, or -L619M variant-containing channels compared to KCNQ1-WT. This study offers preliminary evidence for the demotion of 32 (13%) previously published LQT1 MVs. Of these, 29 were demoted because of their frequent sighting in gnomAD. Additionally, in silico analysis and in vitro functional studies have facilitated the demotion of 3 ultra-rare MVs (F127L, P477L, L619M). Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
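
The two-step triage described in this abstract is effectively a small decision rule over allele counts and in silico votes. A sketch with thresholds taken from the abstract; function and field names are hypothetical, and the example variants are invented, not the study's data:

```python
def triage_variant(gnomad_individuals, pathogenic_tool_votes):
    """Triage a published case-derived missense variant.

    'demote'    -- seen in >= 10 gnomAD individuals: too common to be a
                   rare, highly penetrant disease allele.
    'candidate' -- absent from gnomAD but called pathogenic by <= 2 of
                   the 8 in silico tools: potential false positive,
                   flagged for functional (patch clamp) follow-up.
    'retain'    -- everything else.
    """
    if gnomad_individuals >= 10:
        return "demote"
    if gnomad_individuals == 0 and pathogenic_tool_votes <= 2:
        return "candidate"
    return "retain"

# Hypothetical variants:
print(triage_variant(37, 6))  # → demote
print(triage_variant(0, 1))   # → candidate
print(triage_variant(2, 7))   # → retain
```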

  8. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students have met two major barriers that prevent them from being effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantages of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling the online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. 
Users can interactively discover the needed data and perform on-demand data analysis and

  9. A Computational and Theoretical Study of Conductance in Hydrogen-bonded Molecular Junctions

    Science.gov (United States)

    Wimmer, Michael

    This thesis is devoted to the theoretical and computational study of electron transport in molecular junctions where one or more hydrogen bonds are involved in the process. While electron transport through covalent bonds has been extensively studied, in recent work the focus has been shifted towards hydrogen-bonded systems due to their ubiquitous presence in biological systems and their potential in forming nano-junctions between molecular electronic devices and biological systems. This analysis allows us to significantly expand our comprehension of the experimentally observed result that the inclusion of hydrogen bonding in a molecular junction significantly impacts its transport properties, a fact that has important implications for our understanding of transport through DNA, and nano-biological interfaces in general. In part of this work I have explored the implications of quasiresonant transport in short chains of weakly-bonded molecular junctions involving hydrogen bonds. I used theoretical and computational analysis to interpret recent experiments and explain the role of Fano resonances in the transmission properties of the junction. In a different direction, I have undertaken the study of the transversal conduction through nucleotide chains that involve a variable number of different hydrogen bonds, e.g. NH˙˙˙O, OH˙˙˙O, and NH˙˙˙N, which are the three most prevalent hydrogen bonds in biological systems and organic electronics. My effort here has focused on the analysis of electronic descriptors that allow a simplified conceptual and computational understanding of transport properties. Specifically, I have expanded our previous work where the molecular polarizability was used as a conductance descriptor to include the possibility of atomic and bond partitions of the molecular polarizability. This is important because it affords an alternative molecular description of conductance that is not based on the conventional view of molecular orbitals as

  10. A Comparative Computational Fluid Dynamics Study on an Innovative Exhaust Air Energy Recovery Wind Turbine Generator

    Directory of Open Access Journals (Sweden)

    Seyedsaeed Tabatabaeikia

    2016-05-01

    Full Text Available Recovering energy from the exhaust air systems of building cooling towers is an innovative idea. A specific wind turbine generator was designed in order to achieve this goal. This device consists of two Giromill vertical axis wind turbines (VAWT) combined with four guide vanes and two diffuser plates. It was clear from the previous literature that no comprehensive flow behavior study had been carried out on this innovative device. Therefore, the working principle of this design was simulated using the Analysis System (ANSYS) Fluent computational fluid dynamics (CFD) package and the results were compared to experimental ones. It was perceived from the results that by introducing the diffusers and then the guide vanes, the overall power output of the wind turbine was improved by approximately 5% and 34%, respectively, compared to using the VAWT alone. In the case of the diffusers, the optimum angle was found to be 7°, while for guide vanes A and B, it was 70° and 60°, respectively. These results were in good agreement with the results obtained in the previous experimental study. Overall, it can be concluded that exhaust air recovery turbines are a promising form of green technology.

  11. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  12. Study Of Visual Disorders In Egyptian Computer Operators

    International Nuclear Information System (INIS)

    Al-Awadi, M.Y.; Awad Allah, H.; Hegazy, M. T.; Naguib, N.; Akmal, M.

    2012-01-01

    The aim of the study was to evaluate the probable effects of exposure to electromagnetic waves radiated from visual display terminals on some visual functions. 300 computer operators working in different institutes were selected randomly. They were asked to fill in a pre-tested questionnaire (written in Arabic) after giving their verbal consent. Among them, one hundred fifty exposed to visual display terminals were selected for the clinical study (group I). The control group included one hundred fifty participants (their age matched with group I) working in a field without exposure to visual display terminals (group II). All chosen individuals were free of any apparent health problems or diseases that could affect their visual condition. All exposed candidates were using LCD-type VDTs of size 15 inches, 17 inches, or larger. Data entry and analysis were done using SPSS version 17.0, applying appropriate statistical methods. The results showed that, among the 150 exposed subjects, a highly significant occurrence of dryness and a highly significant association between the occurrence of asthenopia and background variables (working hours using computers) were observed. Of the exposed subjects, 92% complained of tired eyes and eye strain, 37.33% of dry or sore eyes, 68% of headache, 68% of blurred distant vision, 45.33% of asthenopia, and 89.33% of neck, shoulder and back aches. Meanwhile, in the control group, 18% complained of tired eyes, 21.33% of dry eyes, and 12.67% of neck, shoulder and back aches. It could be concluded that the prevalence of computer vision syndrome was quite high among computer operators.

  13. Computational geometry algorithms and applications

    CERN Document Server

    de Berg, Mark; Overmars, Mark; Schwarzkopf, Otfried

    1997-01-01

    Computational geometry emerged from the field of algorithms design and analysis in the late 1970s. It has grown into a recognized discipline with its own journals, conferences, and a large community of active researchers. The success of the field as a research discipline can on the one hand be explained from the beauty of the problems studied and the solutions obtained, and, on the other hand, by the many application domains (computer graphics, geographic information systems (GIS), robotics, and others) in which geometric algorithms play a fundamental role. For many geometric problems the early algorithmic solutions were either slow or difficult to understand and implement. In recent years a number of new algorithmic techniques have been developed that improved and simplified many of the previous approaches. In this textbook we have tried to make these modern algorithmic solutions accessible to a large audience. The book has been written as a textbook for a course in computational geometry, but it can ...
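
    A flavor of the algorithms such a textbook covers can be given by convex hull computation, one of the field's canonical problems. The sketch below uses Andrew's monotone chain algorithm, O(n log n) after sorting; it is an illustrative implementation, not the book's own pseudocode.

```python
def cross(o, a, b):
    """Cross product of OA x OB; > 0 means a counter-clockwise turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Return hull vertices in counter-clockwise order (monotone chain)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # concatenate, dropping each chain's duplicated endpoint
    return lower[:-1] + upper[:-1]
```

    For example, an interior point is discarded: the hull of a unit square plus its center is just the four corners.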

  14. Developing and validating an instrument for measuring mobile computing self-efficacy.

    Science.gov (United States)

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.

  15. [Acquiring skills in malignant hyperthermia crisis management: comparison of high-fidelity simulation versus computer-based case study].

    Science.gov (United States)

    Mejía, Vilma; Gonzalez, Carlos; Delfino, Alejandro E; Altermatt, Fernando R; Corvetto, Marcia A

    The primary purpose of this study was to compare the effect of high-fidelity simulation versus computer-based case solving self-study on skills acquisition about malignant hyperthermia among first-year anesthesiology residents. After institutional ethics committee approval, 31 first-year anesthesiology residents were enrolled in this prospective randomized single-blinded study. Participants were randomized to either a high-fidelity simulation scenario or a computer-based case study about malignant hyperthermia. After the intervention, all subjects' performance was assessed through a high-fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was done to assess self-perception of the reasoning process and decision-making. Twenty-eight first-year residents successfully finished the study. Residents' management skill scores were globally higher with high-fidelity simulation than with the case study, though the differences were significant in only 4 of the 8 performance rubric elements: recognizing signs and symptoms (p = 0.025), prioritization of initial management actions (p = 0.003), recognizing complications (p = 0.025) and communication (p = 0.025). Average scores on pre- and post-test knowledge questionnaires improved from 74% to 85% in the high-fidelity simulation group, and decreased from 78% to 75% in the case study group (p = 0.032). Regarding the qualitative analysis, there was no difference in factors influencing the students' process of reasoning and decision-making with either teaching strategy. Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level among anesthesia residents. Copyright © 2018 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  16. Older Adults Perceptions of Technology and Barriers to Interacting with Tablet Computers: A Focus Group Study.

    Science.gov (United States)

    Vaportzis, Eleftheria; Clausen, Maria Giatsi; Gow, Alan J

    2017-10-04

    New technologies provide opportunities for the delivery of broad, flexible interventions with older adults. Focus groups were conducted to: (1) understand older adults' familiarity with, and barriers to, interacting with new technologies and tablets; and (2) utilize user engagement in refining an intervention protocol. Eighteen older adults (65-76 years old; 83.3% female) who were novice tablet users participated in discussions about their perceptions of and barriers to interacting with tablets. We conducted three separate focus groups and used a generic qualitative design, applying thematic analysis to analyse the data. The focus groups explored attitudes toward tablets and technology in general. We also explored the perceived advantages and disadvantages of using tablets, familiarity with, and barriers to, interacting with tablets. In two of the focus groups, participants had previous computing experience (e.g., desktop), while in the third, participants had no previous computing experience. None of the participants had any previous experience with tablet computers. The themes that emerged were related to barriers (i.e., lack of instructions and guidance, lack of knowledge and confidence, health-related barriers, cost); disadvantages and concerns (i.e., too much and too complex technology, feelings of inadequacy, comparison with younger generations, lack of social interaction and communication, negative features of tablets); advantages (i.e., positive features of tablets, accessing information, willingness to adopt technology); and skepticism about using tablets and technology in general. After brief exposure to tablets, participants emphasized the likelihood of using a tablet in the future. Our findings suggest that most of our participants were eager to adopt new technology and willing to learn to use a tablet. However, they voiced apprehension about lack of, or lack of clarity in, instructions and support. Understanding older adults' perceptions of technology…

  17. Student Study Choices in the Principles of Economics: A Case Study of Computer Usage

    OpenAIRE

    Grimes, Paul W.; Sanderson, Patricia L.; Ching, Geok H.

    1996-01-01

    Principles of Economics students at Mississippi State University were provided the opportunity to use computer assisted instruction (CAI) as a supplemental study activity. Students were free to choose the extent of their computer work. Throughout the course, weekly surveys were conducted to monitor the time each student spent with their textbook, computerized tutorials, workbook, class notes, and study groups. The surveys indicated that only a minority of the students actively pursued CAI....

  18. Computed Tomography Study Of Complicated Bacterial Meningitis ...

    African Journals Online (AJOL)

    To monitor the structural intracranial complications of bacterial meningitis using computed tomography (CT) scan. Retrospective study of medical and radiological records of patients who underwent CT scan over a 4 year period. A University Teaching Hospital in a developing country. Thirty-three patients with clinically and ...

  19. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    ... previously deemed intractable. Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either...

  20. Analysing the doctor-patient-computer relationship: the use of video data

    Directory of Open Access Journals (Sweden)

    Christopher Pearce

    2006-12-01

    Full Text Available This paper examines the utility of using digital video data in observational studies involving doctors' and patients' use of computers in the consultation. Previous observational studies have used either direct observations or analogue videotapes. We describe a method currently in use in a study examining how doctors, patients and computers interact in the consultation. The study is set in general practice as this is the most clinically computerised section of the Australian healthcare system. Computers are now used for clinical functions in 90% of doctors' surgeries. With this rapid rise of computerisation, concerns have been expressed as to how the computer will affect the doctor-patient relationship. To assess how doctors, patients and computers interact, we have chosen an observational technique, namely to make digital videotapes of actual consultations. This analysis is based on a theoretical framework derived from dramaturgical analysis. Data are gathered from general practitioners who are high-level users of computers, as defined by their use of progress notes, as well as prescribing and test ordering. The subsequent digital data is then transferred onto computer and analysed according to our conceptual framework, making use of video-tagging software.

  1. Brain-computer interfacing under distraction: an evaluation study

    DEFF Research Database (Denmark)

    Brandl, Stephanie; Frølich, Laura; Höhne, Johannes

    2016-01-01

    Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach...

  2. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    Science.gov (United States)

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study on the acceptance of computer systems in surgeries. 11,000 returned questionnaires from surgeons, users and nonusers, were analysed. We found that most of the surgeons used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there were differences between men and women, West and East, and young and old. In this study we also analysed the computer-use behaviour of gynaecologic surgeons. Two-thirds of all nonusers did not intend to utilise a computer in the future.

  3. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  4. Consolidation of cloud computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall; Giordano, Domenico

    2017-01-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in resp...

  5. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less well-studied application for computer-based procedures: field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study in which affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups

  6. Using Computational and Mechanical Models to Study Animal Locomotion

    OpenAIRE

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locom...

  7. Factors affecting the adoption of cloud computing: an exploratory study

    OpenAIRE

    Morgan, Lorraine; Conboy, Kieran

    2013-01-01

    Peer-reviewed. While it is widely acknowledged that cloud computing has the potential to transform a large part of the IT industry, issues surrounding the adoption of cloud computing have received relatively little attention. Drawing on three case studies of service providers and their customers, this study will contribute to the existing cloud technologies literature that does not address the complex and multifaceted nature of adoption. The findings are analyzed using the adoption of innov...

  8. Decoding Computer Games: Studying “Special Operation 85”

    Directory of Open Access Journals (Sweden)

    Bahareh Jalalzadeh

    2009-11-01

    Full Text Available Like other media, computer games convey messages which have two features: explicit and implicit. Studying computer games semiologically and comparing them with narrative structures, the present study attempts to discover the messages they convey. We have therefore studied and decoded “Special Operation 85” as a semiological text. Results show that the game’s features, such as naming, the interests and motivations of the people engaged, and the events narrated, all serve the producers’ goal of introducing and publicizing Iranian-Islamic cultural values. Although this feature makes “Special Operation 85” a unique game, it fails in its attempt to produce a mythical personage in the Iranian-Islamic cultural context.

  9. Electromagnetic computation methods for lightning surge protection studies

    CERN Document Server

    Baba, Yoshihiro

    2016-01-01

    This book is the first to consolidate current research and to examine the theories of electromagnetic computation methods in relation to lightning surge protection. The authors introduce and compare existing electromagnetic computation methods such as the method of moments (MOM), the partial element equivalent circuit (PEEC), the finite element method (FEM), the transmission-line modeling (TLM) method, and the finite-difference time-domain (FDTD) method. The application of FDTD method to lightning protection studies is a topic that has matured through many practical applications in the past decade, and the authors explain the derivation of Maxwell's equations required by the FDTD, and modeling of various electrical components needed in computing lightning electromagnetic fields and surges with the FDTD method. The book describes the application of FDTD method to current and emerging problems of lightning surge protection of continuously more complex installations, particularly in critical infrastructures of e...
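
    The core of the FDTD method described above is a leapfrog update of interleaved electric and magnetic field samples on a Yee grid. A minimal one-dimensional free-space sketch is shown below (normalized units and unit Courant number; the grid size, step count and Gaussian source location are arbitrary choices, and no lightning-specific modeling is included):

```python
import math

def fdtd_1d(nz=200, nt=250, src=100):
    """Leapfrog update of E and H on a 1-D grid with a soft Gaussian source."""
    ez = [0.0] * nz          # electric field samples
    hy = [0.0] * nz          # magnetic field samples (staggered half a cell)
    for t in range(nt):
        # H update from the spatial difference of E (normalized units)
        for k in range(nz - 1):
            hy[k] += ez[k + 1] - ez[k]
        # E update from the spatial difference of H
        for k in range(1, nz):
            ez[k] += hy[k] - hy[k - 1]
        # soft Gaussian source injected at one grid point
        ez[src] += math.exp(-((t - 30.0) / 10.0) ** 2)
    return ez

fields = fdtd_1d()
```

    The two interleaved loops are the discrete forms of the two curl equations; real surge studies add material parameters, absorbing boundaries and three-dimensional conductor models on top of this same skeleton.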

  10. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    Science.gov (United States)

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…

  11. Is Cup Positioning Challenged in Hips Previously Treated With Periacetabular Osteotomy?

    DEFF Research Database (Denmark)

    Hartig-Andreasen, Charlotte; Stilling, Maiken; Søballe, Kjeld

    2014-01-01

    After periacetabular osteotomy (PAO), some patients develop osteoarthritis and need a total hip arthroplasty (THA). We evaluated the outcome of THA following PAO and explored factors associated with inferior cup position and increased polyethylene wear. Follow-up was performed 4 to 10 years after THA in 34 patients (38 hips) with previous PAO. Computer analysis evaluated cup position and wear rates. No patient had dislocations or revision surgery. Median scores were: Harris hip 96, Oxford hip 38 and WOMAC 78. Mean cup anteversion and abduction angles were 22° (range 7°-43°) and 45° (range 28°-65°). Outliers of cup abduction were associated with persisting dysplasia (CE...

  12. Polychlorinated biphenyl exposure, diabetes and endogenous hormones: a cross-sectional study in men previously employed at a capacitor manufacturing plant.

    Science.gov (United States)

    Persky, Victoria; Piorkowski, Julie; Turyk, Mary; Freels, Sally; Chatterton, Robert; Dimos, John; Bradlow, H Leon; Chary, Lin Kaatz; Burse, Virlyn; Unterman, Terry; Sepkovic, Daniel W; McCann, Kenneth

    2012-08-29

    Studies have shown associations of diabetes and endogenous hormones with exposure to a wide variety of organochlorines. We have previously reported positive associations of polychlorinated biphenyls (PCBs) and inverse associations of selected steroid hormones with diabetes in postmenopausal women previously employed in a capacitor manufacturing plant. This paper examines associations of PCBs with diabetes and endogenous hormones in 63 men previously employed at the same plant who in 1996 underwent surveys of their exposure and medical history and collection of blood and urine samples for measurements of PCBs, lipids, liver function, hematologic markers and endogenous hormones. PCB exposure was positively associated with diabetes and age and inversely associated with thyroid stimulating hormone and triiodothyronine uptake. History of diabetes was significantly related to total PCBs and all PCB functional groupings, but not to quarters worked and job score, after control for potential confounders. None of the exposures were related to insulin resistance (HOMA-IR) in non-diabetic men. Associations of PCBs with specific endogenous hormones differ in some respects from previous findings in postmenopausal women employed at the capacitor plant. Results from this study, however, do confirm previous reports relating PCB exposure to diabetes and suggest that these associations are not mediated by measured endogenous hormones.

  13. Defragging Computer/Videogame Implementation and Assessment in the Social Studies

    Science.gov (United States)

    McBride, Holly

    2014-01-01

    Students in this post-industrial technological age require opportunities for the acquisition of new skills, especially in the marketplace of innovation. A pedagogical strategy that is becoming more and more popular within social studies classrooms is the use of computer and video games as enhancements to everyday lesson plans. Computer/video games…

  14. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report aimed at designing the architecture and measuring the performance of a parallel computing environment for Monte Carlo simulation in particle therapy planning, using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed approximately 28 times faster execution than single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)
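
    Speedups of this kind are possible because Monte Carlo simulation is embarrassingly parallel: particle histories are independent, so the workload splits into chunks with separate random seeds whose tallies are simply combined. A minimal sketch of the pattern follows (pi estimation stands in for the dose calculation; the chunk counts and seeds are arbitrary, and the chunks run sequentially here rather than on separate cloud nodes):

```python
import random

def mc_chunk(n, seed):
    """One worker's share: n independent samples drawn from its own RNG."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

def parallel_style_pi(n_total=40000, n_workers=4):
    """Split the sample budget across independent chunks and merge tallies."""
    per = n_total // n_workers
    # in a real HPC/cloud deployment each mc_chunk call would run on a
    # separate node; the merge step is a trivial sum either way
    hits = sum(mc_chunk(per, seed) for seed in range(n_workers))
    return 4.0 * hits / (per * n_workers)
```

    Because each chunk has its own seeded generator, results are reproducible regardless of how the chunks are scheduled, which is what makes near-linear scaling across cloud instances straightforward for this class of simulation.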

  15. Computer codes for RF cavity design

    International Nuclear Information System (INIS)

    Ko, K.

    1992-08-01

    In RF cavity design, numerical modeling is assuming an increasingly important role with the help of sophisticated computer codes and powerful yet affordable computers. A description of the cavity codes in use in the accelerator community has been given previously. The present paper will address the latest developments and discuss their applications to cavity tuning and matching problems

  16. Model Infrastruktur dan Manajemen Platform Server Berbasis Cloud Computing [Infrastructure model and management of a cloud computing-based server platform]

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Full Text Available Cloud computing is a new technology that is still growing very rapidly. This technology makes the Internet the main medium for managing data and applications remotely. Cloud computing allows users to run an application without having to think about infrastructure and platforms. Other technical aspects, such as memory, storage, and backup and restore, can be handled very easily. This research models the infrastructure and management of the server platform in the computer network of the Faculty of Engineering, University of Jenderal Soedirman. The first stage of the research is a literature study, identifying implementation models from previous research. The results are then combined with a new approach to existing resources and implemented directly on the existing server network. The results showed that the implementation of cloud computing technology is able to replace the existing platform network.

  17. Prospective pilot study of a tablet computer in an Emergency Department.

    Science.gov (United States)

    Horng, Steven; Goss, Foster R; Chen, Richard S; Nathanson, Larry A

    2012-05-01

    The recent availability of low-cost tablet computers can facilitate bedside information retrieval by clinicians. To evaluate the effect of physician tablet use in the Emergency Department. Prospective cohort study comparing physician workstation usage with and without a tablet. 55,000 visits/year Level 1 Emergency Department at a tertiary academic teaching hospital. 13 emergency physicians (7 Attendings, 4 EM3s, and 2 EM1s) worked a total of 168 scheduled shifts (130 without and 38 with tablets) during the study period. Physician use of a tablet computer while delivering direct patient care in the Emergency Department. The primary outcome measure was the time spent using the Emergency Department Information System (EDIS) at a computer workstation per shift. The secondary outcome measure was the number of EDIS logins at a computer workstation per shift. Clinician use of a tablet was associated with a 38 min (17-59) decrease in time spent per shift using the EDIS at a computer workstation. Use of a tablet computer was also associated with a reduction in the number of times physicians logged into a computer workstation and a reduction in the amount of time they spent there using the EDIS. The presumed benefit is that decreasing time at a computer workstation increases physician availability at the bedside. However, this association will require further investigation. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  18. Consolidation of cloud computing in ATLAS

    Science.gov (United States)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  19. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    Science.gov (United States)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.

  20. Comparative study of auxetic geometries by means of computer-aided design and engineering

    International Nuclear Information System (INIS)

    Álvarez Elipe, Juan Carlos; Díaz Lantada, Andrés

    2012-01-01

    Auxetic materials (or metamaterials) are those with a negative Poisson ratio (NPR) and display the unexpected property of lateral expansion when stretched, as well as an equal and opposing densification when compressed. Such geometries are being progressively employed in the development of novel products, especially in the fields of intelligent expandable actuators, shape morphing structures and minimally invasive implantable devices. Although several auxetic and potentially auxetic geometries have been summarized in previous reviews and research, precise information regarding relevant properties for design tasks is not always provided. In this study we present a comparative study of two-dimensional and three-dimensional auxetic geometries carried out by means of computer-aided design and engineering tools (from now on CAD–CAE). The first part of the study is focused on the development of a CAD library of auxetics. Once the library is developed we simulate the behavior of the different auxetic geometries and elaborate a systematic comparison, considering relevant properties of these geometries, such as Poisson ratio(s), maximum volume or area reductions attainable and equivalent Young’s modulus, hoping it may provide useful information for future designs of devices based on these interesting structures. (paper)
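
    The defining quantity here, Poisson's ratio, is simply the negative ratio of transverse to axial strain; a sign flip in the transverse strain is what makes a geometry auxetic. A small sketch follows (the strain values are illustrative, not taken from the paper):

```python
def poisson_ratio(eps_axial, eps_transverse):
    """nu = -eps_transverse / eps_axial for a uniaxial test."""
    return -eps_transverse / eps_axial

# conventional material: stretched 1% axially, it narrows 0.3% laterally
conventional = poisson_ratio(0.010, -0.003)   # nu = +0.3

# auxetic material: stretched 1% axially, it widens 0.4% laterally
auxetic = poisson_ratio(0.010, 0.004)         # nu = -0.4
```

    The CAD-CAE comparison in the study amounts to simulating such uniaxial loads on each library geometry and extracting these strain ratios, along with equivalent stiffness and attainable volume change.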

  1. A computational study of high entropy alloys

    Science.gov (United States)

    Wang, Yang; Gao, Michael; Widom, Michael; Hawk, Jeff

    2013-03-01

    As a new class of advanced materials, high-entropy alloys (HEAs) exhibit a wide variety of excellent materials properties, including high strength, reasonable ductility with appreciable work-hardening, corrosion and oxidation resistance, wear resistance, and outstanding diffusion-barrier performance, especially at elevated and high temperatures. In this talk, we will explain our computational approach to the study of HEAs that employs the Korringa-Kohn-Rostoker coherent potential approximation (KKR-CPA) method. The KKR-CPA method uses Green's function technique within the framework of multiple scattering theory and is uniquely designed for the theoretical investigation of random alloys from the first principles. The application of the KKR-CPA method will be discussed as it pertains to the study of structural and mechanical properties of HEAs. In particular, computational results will be presented for AlxCoCrCuFeNi (x = 0, 0.3, 0.5, 0.8, 1.0, 1.3, 2.0, 2.8, and 3.0), and these results will be compared with experimental information from the literature.
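
    The "high entropy" in the name refers to the ideal configurational entropy of mixing, S_mix = -R * sum(c_i ln c_i), which is maximized for equimolar compositions (R ln N for N components); alloys with roughly five or more principal elements, such as the equimolar CoCrCuFeNi case (x = 0) above, exceed the commonly used S_mix >= 1.5R threshold. A quick check:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(concentrations):
    """Ideal configurational entropy of mixing, -R * sum(c * ln c)."""
    assert abs(sum(concentrations) - 1.0) < 1e-9
    return -R * sum(c * math.log(c) for c in concentrations if c > 0)

# equimolar quinary alloy such as CoCrCuFeNi (the x = 0 case above)
s5 = mixing_entropy([0.2] * 5)   # equals R * ln 5, about 1.61 R
```

    This ideal-mixing estimate is only the naming convention; the KKR-CPA calculations in the study go well beyond it by treating the electronic structure of the disordered solid solution from first principles.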

  2. A comparative study: use of a Brain-computer Interface (BCI) device by people with cerebral palsy in interaction with computers.

    Science.gov (United States)

    Heidrich, Regina O; Jensen, Emely; Rebelo, Francisco; Oliveira, Tiago

    2015-01-01

    This article presents a comparative study of people with cerebral palsy and healthy controls, of various ages, using a Brain-computer Interface (BCI) device. The research takes a qualitative approach based on observational case studies. People with cerebral palsy and healthy controls were evaluated in Portugal and in Brazil. The study aimed to evaluate the product and determine whether people with cerebral palsy could interact with the computer, and whether their performance is similar to that of healthy controls when using the Brain-computer Interface. Ultimately, it was found that there are no significant differences between people with cerebral palsy in the two countries, nor between the populations without cerebral palsy (healthy controls).

  3. Characteristic findings of computed tomography in cerebral metastatic malignant melanomas

    International Nuclear Information System (INIS)

    Kukita, Chikashige; Nose, Tadao; Nakagawa, Kunio; Tomono, Yuji; Enomoto, Takao; Hashikawa, Masanori; Egashira, Taihei; Maki, Yutaka

    1986-01-01

    Four cases of metastatic cerebral melanoma were studied by means of computed tomography (CT). Two cases were male and two female, with an average age of 55 years. The primary lesions were on the chest wall in two cases, around the calcaneus in one, and around the genitalia in one. All cases died within 6 months after the metastatic brain lesions were found. Necropsies were carried out in two cases. CT revealed high-density areas in all cases, and contrast studies showed enhancement of the lesions, as has previously been reported. On the other hand, the autopsied cases revealed neither fresh nor old intratumoral bleeding, such as a scattered focus of hemosiderin. These findings suggest that the high-density tumoral shadows on CT are probably not intratumoral bleeding due to a bleeding tendency of the tumors, as some authors have previously supposed. We discuss some other factors contributing to the high density of melanoma on computed tomograms. (author)

  4. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems.

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Today, owing to developing communication technologies, computer games and other audio-visual media are very attractive social phenomena and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents has raised public concern about their possibly harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. This was a descriptive-correlational study of 384 randomly chosen male guidance-school students. They were asked to answer the researchers' questionnaire about computer games and Achenbach's Youth Self-Report (YSR). The results of this study indicated a significant direct correlation (at the 95% level) between the amount of game playing among adolescents and anxiety/depression, withdrawal/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game use and physical complaints, thinking problems, or attention problems. In addition, there was a significant correlation between the students' place of living and their parents' jobs, and computer game use. Computer games are associated with anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents.

  5. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background: Today, owing to developing communication technologies, computer games and other audio-visual media are very attractive social phenomena and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents has raised public concern about their possibly harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. Methods: This was a descriptive-correlational study of 384 randomly chosen male guidance-school students. They were asked to answer the researchers' questionnaire about computer games and Achenbach's Youth Self-Report (YSR). Findings: The results of this study indicated a significant direct correlation (at the 95% level) between the amount of game playing among adolescents and anxiety/depression, withdrawal/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game use and physical complaints, thinking problems, or attention problems. In addition, there was a significant correlation between the students' place of living and their parents' jobs, and computer game use. Conclusion: Computer games are associated with anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents. PMID:24494157

  6. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration

    2017-01-01

    Higgs boson physics is one of the most important and promising fields of study in modern high-energy physics. To perform precision measurements of the Higgs boson properties, fast and efficient instruments for Monte Carlo event simulation are required. Due to the increasing amount of data and the growing complexity of the simulation software tools, the computing resources currently available for Monte Carlo simulation on the LHC GRID are not sufficient. One possibility to address this shortfall of computing resources is the use of institute computer clusters, commercial computing resources and supercomputers. In this paper, a brief description of Higgs boson physics and of Monte Carlo generation and event simulation techniques is presented. A description of modern high-performance computing systems and tests of their performance are also discussed. These studies have been performed on the Worldwide LHC Computing Grid and the Kurchatov Institute Data Processing Center, including Tier...

  7. On-the-Fly Computation of Bisimilarity Distances

    DEFF Research Database (Denmark)

    Bacci, Giorgio; Bacci, Giovanni; Larsen, Kim Guldstrand

    2017-01-01

    We propose a distance between continuous-time Markov chains (CTMCs) and study the problem of computing it by comparing three different algorithmic methodologies: iterative, linear program, and on-the-fly. In a work presented at FoSSaCS'12, Chen et al. characterized the bisimilarity distance of Desharnais et al. between discrete-time Markov chains as an optimal solution of a linear program that can be solved by using the ellipsoid method. Inspired by their result, we propose a novel linear program characterization to compute the distance in the continuous-time setting. Differently from previous proposals, ours has a number of constraints that is bounded by a polynomial in the size of the CTMC. This, in particular, proves that the distance we propose can be computed in polynomial time. Despite its theoretical importance, the proposed linear program characterization turns out to be inefficient...

  8. An Exploratory Study of Pauses in Computer-Assisted EFL Writing

    Science.gov (United States)

    Xu, Cuiqin; Ding, Yanren

    2014-01-01

    The advance of computer input log and screen-recording programs over the last two decades has greatly facilitated research into the writing process in real time. Using Inputlog 4.0 and Camtasia 6.0 to record the writing process of 24 Chinese EFL writers in an argumentative task, this study explored L2 writers' pausing patterns in computer-assisted…

  9. Musculoskeletal Problems Associated with University Students Computer Users: A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Rakhadani PB

    2017-07-01

    While several studies have examined the prevalence and correlates of musculoskeletal problems among university students, scanty information exists in the South African context. The objective of this study was to determine the prevalence, causes and consequences of musculoskeletal problems among student computer users at the University of Venda. This cross-sectional study involved 694 university students at the University of Venda. A self-designed questionnaire was used to collect information on sociodemographic characteristics, problems associated with computer use, and causes of musculoskeletal problems associated with computer use. The majority (84.6%) of the participants used the computer for the internet, word processing (20.3%), and games (18.7%). The students reported neck pain when using the computer (52.3%); shoulder (47.0%), finger (45.0%), lower back (43.1%), general body pain (42.9%), elbow (36.2%), wrist (33.7%), hip and foot (29.1%) and knee (26.2%). Reported causes of musculoskeletal pains associated with computer usage were: sitting position, low chairs, a lot of time spent on the computer, uncomfortable laboratory chairs, and stress. Eye problems (51.9%), muscle cramp (344.0%), headache (45.3%), blurred vision (38.0%), feeling of illness (39.9%) and missed lectures (29.1%) were consequences of musculoskeletal problems linked to computer use. The majority of students reported having mild pain (43.7%); others reported moderate (24.2%) and severe (8.4%) pain. Years of computer use were significantly associated with neck, shoulder and wrist pain. Using the computer for the internet was significantly associated with neck pain (OR=0.60; 95% CI 0.40-0.93); games with neck (OR=0.60; 95% CI 0.40-0.85) and hip/foot pain (OR=0.60; 95% CI 0.40-0.92); programming with elbow (OR=1.78; 95% CI 1.10-2.94) and wrist pain (OR=2.25; 95% CI 1.36-3.73); while word processing was significantly associated with lower back pain (OR=1.45; 95% CI 1.03-2.04). Undergraduate study had a significant association with elbow pain (OR=2

  10. A report on the study of algorithms to enhance Vector computer performance for the discretized one-dimensional time-dependent heat conduction equation: EPIC research, Phase 1

    International Nuclear Information System (INIS)

    Majumdar, A.; Makowitz, H.

    1987-10-01

    With the development of modern vector/parallel supercomputers and their lower-performance clones, it has become possible to increase computational performance by several orders of magnitude compared with the previous generation of scalar computers. These performance gains are not observed when production versions of current thermal-hydraulic codes are implemented on modern supercomputers. It is our belief that this is due in part to the inappropriateness of using old thermal-hydraulic algorithms with these new computer architectures. We believe that a new generation of algorithms needs to be developed for thermal-hydraulics simulation that is optimized for vector/parallel architectures, and not the scalar computers of the previous generation. We have begun a study that will investigate several approaches for designing such optimal algorithms. These approaches are based on the following concepts: minimize recursion; utilize predictor-corrector iterative methods; maximize the convergence rate of iterative methods used; use physical approximations as well as numerical means to accelerate convergence; utilize explicit (i.e., marching) methods where stability will permit. We call this approach the ''EPIC'' methodology (i.e., Explicit Predictor Iterative Corrector methods). Utilizing the above ideas, we have begun our work by investigating the one-dimensional transient heat conduction equation. We have developed several algorithms based on variations of the Hopscotch concept, which we discuss in the body of this report. 14 refs
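As an illustration of the Hopscotch family of schemes mentioned above, here is a minimal sketch of an odd-even hopscotch step for the 1D heat equation. This is not the report's actual algorithm; for clarity the parity roles are fixed here, whereas the full scheme alternates them at each time level:

```python
import numpy as np

def hopscotch_step(u, r):
    """One odd-even hopscotch time step for u_t = u_xx with r = dt/dx**2.

    Odd-indexed interior points are advanced with an explicit FTCS update;
    even-indexed points are then advanced with the backward (implicit)
    formula, which becomes explicit because their odd neighbours are
    already at the new time level. Dirichlet boundaries are held fixed.
    """
    u = u.copy()
    n = len(u)
    for i in range(1, n - 1, 2):          # explicit sweep (odd points)
        u[i] = u[i] + r * (u[i + 1] - 2.0 * u[i] + u[i - 1])
    for i in range(2, n - 1, 2):          # "implicit" sweep (even points)
        u[i] = (u[i] + r * (u[i + 1] + u[i - 1])) / (1.0 + 2.0 * r)
    return u

# Decay of a sine mode on [0, 1] with u = 0 at both ends
n, r = 21, 0.5
x = np.linspace(0.0, 1.0, n)
u = np.sin(np.pi * x)
for _ in range(50):
    u = hopscotch_step(u, r)
```

Because each sweep only reads already-computed values, both loops vectorize naturally, which is precisely the property that makes such schemes attractive on vector/parallel architectures.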

  11. Computer use and addiction in Romanian children and teenagers--an observational study.

    Science.gov (United States)

    Chiriţă, V; Chiriţă, Roxana; Stefănescu, C; Chele, Gabriela; Ilinca, M

    2006-01-01

    The computer has provided some wonderful opportunities for our children. Although research on the effects of children's computer use is still ambiguous, some initial indications of positive and negative effects are beginning to emerge. Children commonly use computers for playing games, completing school assignments, email, and connecting to the Internet. This may sometimes come at the expense of other activities such as homework or normal social interchange. Although most children seem to correct the problem naturally, parents and educators must monitor for signs of misuse. Studies of general computer users suggest that some children may experience psychological problems such as social isolation, depression, loneliness, and time mismanagement related to their computer use, as well as failure at school. The purpose of this study is to investigate issues related to computer use by school students from 11 to 18 years old. The survey included a representative sample of 439 school students aged 11 to 18. All of the students came from 3 gymnasium schools and 5 high schools of Iaşi, Romania. The students answered a questionnaire comprising 34 questions related to computer activities. The children's parents answered a second questionnaire on the same subject. Most questions asked respondents to rate on a scale the frequency of occurrence of a certain event or issue; some questions solicited an open answer or a choice from a list. These were aimed at highlighting: (1) the frequency of computer use by the students; (2) the interference of excessive use with school performance and social life; (3) the identification of a possible computer addiction. The data were processed using the SPSS statistics software, version 11.0. Results show that the school students prefer to spend a considerable amount of time, over 3 hours/day, with their computers. More than 65.7% of the students have a computer at home. More than 70% of the parents admit they do not or only occasionally

  12. [The Psychomat computer complex for psychophysiologic studies].

    Science.gov (United States)

    Matveev, E V; Nadezhdin, D S; Shemsudov, A I; Kalinin, A V

    1991-01-01

    The authors analyze the principles behind the design of a computerized psychophysiological system for universal use. They show the effectiveness of combining the universal computation and control capabilities of a personal computer with problem-oriented, specialized facilities for presenting stimuli and detecting the test subject's reactions. They define the hardware and software configuration of the microcomputer psychophysiological system "Psychomat", describe its functional capabilities and basic medico-technical characteristics, and review organizational issues concerning the maintenance of its full-scale production.

  13. Studies in Mathematics, Volume 22. Studies in Computer Science.

    Science.gov (United States)

    Pollack, Seymour V., Ed.

    The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…

  14. Partial safety factor calibration from stochastic finite element computation of welded joint with random geometries

    International Nuclear Information System (INIS)

    Schoefs, Franck; Chevreuil, Mathilde; Pasqualini, Olivier; Cazuguel, Mikaël

    2016-01-01

    Welded joints are used in various structures and infrastructures such as bridges, ships and offshore structures, and are subjected to cyclic stresses. Their fatigue behaviour is a key industrial issue and still offers original research subjects. One of the available methods relies on computing the stress concentration factor. Although some previous studies evaluated this factor for several cases of welded structures, the shape of the weld joint is generally idealized through a deterministic parametric geometry. Previous experimental work has shown, however, that this shape plays a key role in lifetime assessment. We propose in this paper a methodology for computing the stress concentration factor in the presence of random geometries of welded joints. To make the results usable by engineers, this method merges stochastic computation and semi-probabilistic analysis by computing partial safety factors with a dedicated method. - Highlights: • Numerical computation of the stress concentration factor with random weld geometry. • Real data are used for probabilistic modelling. • Identification of partial safety factors from SFEM computation in the case of random geometries.

  15. Differences in prevalence of self-reported musculoskeletal symptoms among computer and non-computer users in a Nigerian population: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Ayanniyi O

    2010-08-01

    Background: Literature abounds on the prevalent nature of Self-Reported Musculoskeletal Symptoms (SRMS) among computer users, but studies that actually compare them with non-computer users are meagre, thereby reducing the strength of the evidence. This study compared the prevalence of SRMS between computer and non-computer users and assessed the risk factors associated with SRMS. Methods: A total of 472 participants comprising equal numbers of age- and sex-matched computer and non-computer users were assessed for the presence of SRMS. Information concerning musculoskeletal symptoms and discomforts of the neck, shoulders, upper back, elbows, wrists/hands, low back, hips/thighs, knees and ankles/feet was obtained using the Standardized Nordic questionnaire. Results: The prevalence of SRMS was significantly higher in the computer users than in the non-computer users, both over the past 7 days (χ2 = 39.11, p = 0.001) and during the past 12 months (χ2 = 53.56, p = 0.001). The odds of reporting musculoskeletal symptoms were lowest for participants above the age of 40 years (OR = 0.42, 95% CI 0.31-0.64 over the past 7 days; OR = 0.61, 95% CI 0.47-0.77 during the past 12 months) and were also reduced in female participants. Increasing daily hours and accumulated years of computer use, and tasks of data processing and designs/graphics, were significant (p Conclusion: The prevalence of SRMS was significantly higher in the computer users than in the non-computer users, and younger age, being male, working longer hours daily, increasing years of computer use, data entry tasks and computer designs/graphics were the significant risk factors for reporting musculoskeletal symptoms among the computer users. Computer use may explain the increase in prevalence of SRMS among the computer users.
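The odds ratios and confidence intervals quoted above follow the standard 2×2-table formula; a minimal sketch with hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:

                  symptoms   no symptoms
        exposed       a           b
        unexposed     c           d
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(120, 116, 90, 146)
```

An OR below 1 with a CI that excludes 1 (as for the over-40 group above) indicates significantly reduced odds of reporting symptoms.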

  16. Communication: Minimum in the thermal conductivity of supercooled water: A computer simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Bresme, F., E-mail: f.bresme@imperial.ac.uk [Chemical Physics Section, Department of Chemistry, Imperial College, London SW7 2AZ, United Kingdom and Department of Chemistry, Norwegian University of Science and Technology, Trondheim 7491 (Norway); Biddle, J. W.; Sengers, J. V.; Anisimov, M. A. [Institute for Physical Science and Technology, and Department of Chemical and Biomolecular Engineering, University of Maryland, College Park, Maryland 20742 (United States)

    2014-04-28

    We report the results of a computer simulation study of the thermodynamic properties and the thermal conductivity of supercooled water as a function of pressure and temperature using the TIP4P-2005 water model. The thermodynamic properties can be represented by a two-structure equation of state consistent with the presence of a liquid-liquid critical point in the supercooled region. Our simulations confirm the presence of a minimum in the thermal conductivity, not only at atmospheric pressure, as previously found for the TIP5P water model, but also at elevated pressures. This anomalous behavior of the thermal conductivity of supercooled water appears to be related to the maximum of the isothermal compressibility or the minimum of the speed of sound. However, the magnitudes of the simulated thermal conductivities are sensitive to the water model adopted and appear to be significantly larger than the experimental thermal conductivities of real water at low temperatures.

  17. Communication: Minimum in the thermal conductivity of supercooled water: A computer simulation study

    International Nuclear Information System (INIS)

    Bresme, F.; Biddle, J. W.; Sengers, J. V.; Anisimov, M. A.

    2014-01-01

    We report the results of a computer simulation study of the thermodynamic properties and the thermal conductivity of supercooled water as a function of pressure and temperature using the TIP4P-2005 water model. The thermodynamic properties can be represented by a two-structure equation of state consistent with the presence of a liquid-liquid critical point in the supercooled region. Our simulations confirm the presence of a minimum in the thermal conductivity, not only at atmospheric pressure, as previously found for the TIP5P water model, but also at elevated pressures. This anomalous behavior of the thermal conductivity of supercooled water appears to be related to the maximum of the isothermal compressibility or the minimum of the speed of sound. However, the magnitudes of the simulated thermal conductivities are sensitive to the water model adopted and appear to be significantly larger than the experimental thermal conductivities of real water at low temperatures

  18. Computer codes for RF cavity design

    International Nuclear Information System (INIS)

    Ko, K.

    1992-01-01

    In RF cavity design, numerical modeling is assuming an increasingly important role with the help of sophisticated computer codes and powerful yet affordable computers. A description of the cavity codes in use in the accelerator community has been given previously. The present paper will address the latest developments and discuss their applications to cavity tuning and matching problems. (Author) 8 refs., 10 figs

  19. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
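The "computer calculus" used by GRESS corresponds to what is now called forward-mode automatic differentiation: derivatives are propagated through the code alongside values rather than estimated by finite differences. A minimal dual-number sketch (illustrative only, not the GRESS/FORTRAN implementation):

```python
class Dual:
    """Dual number (value, derivative) for forward-mode differentiation."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def derivative(f, x):
    """df/dx at x, exact to machine precision (no finite differences)."""
    return f(Dual(x, 1.0)).der

# Sensitivity of f(x) = 3*x*x + 2*x at x = 2: f'(x) = 6x + 2 = 14
sens = derivative(lambda x: 3 * x * x + 2 * x, 2.0)
```

Each arithmetic operation carries its derivative forward, which is exactly the kind of transformation a derivative-generating compiler performs on the source code itself.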

  20. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  1. Computational atomic and nuclear physics

    International Nuclear Information System (INIS)

    Bottcher, C.; Strayer, M.R.; McGrory, J.B.

    1990-01-01

    The evolution of parallel processor supercomputers in recent years provides opportunities to investigate in detail many complex problems, in many branches of physics, which were considered to be intractable only a few years ago. But to take advantage of these new machines, one must have a better understanding of how the computers organize their work than was necessary with previous single processor machines. Equally important, the scientist must have this understanding as well as a good understanding of the structure of the physics problem under study. In brief, a new field of computational physics is evolving, which will be led by investigators who are highly literate both computationally and physically. A Center for Computationally Intensive Problems has been established with the collaboration of the University of Tennessee Science Alliance, Vanderbilt University, and the Oak Ridge National Laboratory. The objective of this Center is to carry out forefront research in computationally intensive areas of atomic, nuclear, particle, and condensed matter physics. An important part of this effort is the appropriate training of students. An early effort of this Center was to conduct a Summer School of Computational Atomic and Nuclear Physics. A distinguished faculty of scientists in atomic, nuclear, and particle physics gave lectures on the status of present understanding of a number of topics at the leading edge in these fields, and emphasized those areas where computational physics was in a position to make a major contribution. In addition, there were lectures on numerical techniques which are particularly appropriate for implementation on parallel processor computers and which are of wide applicability in many branches of science

  2. Conventional versus computer-navigated TKA: a prospective randomized study.

    Science.gov (United States)

    Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques

    2017-06-01

    The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in one group of patients (NAV) and the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer-navigated surgery would improve implant alignment, functional scores and survival of the implant compared with the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent Score (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery knee score and the Western Ontario and McMaster Universities (WOMAC) index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI or side of implantation. Three patients of the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed statistically significantly better KSS scores, a wider ROM, and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle and tibial slope in the post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staph. aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100%. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significant

  3. Computational Actuator Disc Models for Wind and Tidal Applications

    Directory of Open Access Journals (Sweden)

    B. Johnson

    2014-01-01

    This paper details a computational fluid dynamics (CFD) study of a constantly loaded actuator disc model featuring different boundary conditions; these boundary conditions were defined to represent a channel flow and a duct flow. The simulations were carried out using the commercially available CFD software ANSYS-CFX. The data produced were compared with the one-dimensional (1D) momentum equation as well as previous numerical and experimental studies featuring porous discs in a channel flow. The actuator disc was modelled as a momentum loss using a resistance coefficient related to the thrust coefficient (CT). The model showed good agreement with 1D momentum theory in terms of the velocity and pressure profiles. Less agreement was demonstrated when compared with previous numerical and empirical data in terms of velocity and turbulence characteristics in the far field; those studies predicted a far larger velocity deficit and a turbulence peak further downstream. This study therefore demonstrates the usefulness of the duct boundary condition (for computational ease) in representing open channel flow when simulating far-field effects, as well as the importance of the turbulence definition at the inlet.
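The 1D momentum result used as the comparison baseline can be stated compactly: for a disc with thrust coefficient CT, the axial induction factor a satisfies CT = 4a(1 - a). A minimal sketch of these textbook relations (not the paper's CFD set-up):

```python
import math

def momentum_theory(U, CT):
    """Classic 1D actuator-disc momentum theory (valid for CT < 1).

    CT = 4a(1 - a)  ->  a = (1 - sqrt(1 - CT)) / 2
    Disc-plane velocity:  u_d = U (1 - a)
    Far-wake velocity:    u_w = U (1 - 2a)
    Power coefficient:    CP = 4a(1 - a)**2 (Betz limit 16/27 at a = 1/3)
    """
    a = 0.5 * (1.0 - math.sqrt(1.0 - CT))
    u_d = U * (1.0 - a)
    u_w = U * (1.0 - 2.0 * a)
    CP = 4.0 * a * (1.0 - a) ** 2
    return a, u_d, u_w, CP

# Betz-optimal loading: CT = 8/9 gives a = 1/3 and CP = 16/27
a, u_d, u_w, CP = momentum_theory(U=2.0, CT=8.0 / 9.0)
```

The far-wake velocity deficit U - u_w is the quantity against which the channel- and duct-boundary CFD results are typically benchmarked.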

  4. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  5. The biomechanics of running in athletes with previous hamstring injury: A case-control study.

    Science.gov (United States)

    Daly, C; Persson, U McCarthy; Twycross-Lewis, R; Woledge, R C; Morrissey, D

    2016-04-01

    Hamstring injury is prevalent with persistently high reinjury rates. We aim to inform hamstring rehabilitation by exploring the electromyographic and kinematic characteristics of running in athletes with previous hamstring injury. Nine elite male Gaelic games athletes who had returned to sport after hamstring injury and eight closely matched controls sprinted while lower limb kinematics and muscle activity of the previously injured biceps femoris, bilateral gluteus maximus, lumbar erector spinae, rectus femoris, and external oblique were recorded. Intergroup comparisons of muscle activation ratios and kinematics were performed. Previously injured athletes demonstrated significantly reduced biceps femoris muscle activation ratios with respect to ipsilateral gluteus maximus (maximum difference -12.5%, P = 0.03), ipsilateral erector spinae (maximum difference -12.5%, P = 0.01), ipsilateral external oblique (maximum difference -23%, P = 0.01), and contralateral rectus femoris (maximum difference -22%, P = 0.02) in the late swing phase. We also detected sagittal asymmetry in hip flexion (maximum 8°, P = 0.01), pelvic tilt (maximum 4°, P = 0.02), and medial rotation of the knee (maximum 6°, P = 0.03) effectively putting the hamstrings in a lengthened position just before heel strike. Previous hamstring injury is associated with altered biceps femoris associated muscle activity and potentially injurious kinematics. These deficits should be considered and addressed during rehabilitation. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. A parametric study of a solar calcinator using computational fluid dynamics

    International Nuclear Information System (INIS)

    Fidaros, D.K.; Baxevanou, C.A.; Vlachos, N.S.

    2007-01-01

    In this work a horizontal rotating solar calcinator is studied numerically using computational fluid dynamics. The specific solar reactor is a 10 kW model designed and used for efficiency studies. The numerical model is based on the solution of the Navier-Stokes equations for the gas flow, and on Lagrangian dynamics for the discrete particles. All necessary mathematical models were developed and incorporated into a computational fluid dynamics model with the influence of turbulence simulated by a two-equation (RNG k-ε) model. The efficiency of the reactor was calculated for different thermal inputs, feed rates, rotational speeds and particle diameters. The numerically computed degrees of calcination compared well with equivalent experimental results.

  7. Non-unitary probabilistic quantum computing circuit and method

    Science.gov (United States)

    Williams, Colin P. (Inventor); Gingrich, Robert M. (Inventor)

    2009-01-01

    A quantum circuit performing quantum computation in a quantum computer. A chosen transformation of an initial n-qubit state is probabilistically obtained. The circuit comprises a unitary quantum operator obtained from a non-unitary quantum operator, operating on an n-qubit state and an ancilla state. When operation on the ancilla state provides a success condition, computation is stopped. When operation on the ancilla state provides a failure condition, computation is performed again on the ancilla state and the n-qubit state obtained in the previous computation, until a success condition is obtained.
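The ancilla-based scheme described above can be illustrated with a small linear-algebra sketch: embed a (suitably normalized) non-unitary operator M into a larger unitary via the standard unitary dilation, apply it to the joint state of the n-qubit register and an ancilla prepared in |0⟩, and read the ancilla to decide success. This is the generic textbook dilation, not the specific circuit of the patent; the operator M and state psi below are hypothetical examples.

```python
import numpy as np

def herm_sqrt(a):
    """Matrix square root of a positive semidefinite Hermitian matrix."""
    w, v = np.linalg.eigh(a)
    return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.conj().T

def dilate(m):
    """Embed a contraction M (operator norm <= 1) into a unitary acting on
    the system plus one ancilla qubit:
        U = [[M,              sqrt(I - M M+)],
             [sqrt(I - M+ M), -M+          ]]
    """
    n = m.shape[0]
    eye = np.eye(n)
    mh = m.conj().T
    return np.block([[m, herm_sqrt(eye - m @ mh)],
                     [herm_sqrt(eye - mh @ m), -mh]])

# Hypothetical non-unitary (but contractive) operator and input state
M = np.array([[1.0, 0.0],
              [0.0, 0.5]])
psi = np.array([1.0, 1.0]) / np.sqrt(2)

U = dilate(M)
out = U @ np.concatenate([psi, np.zeros(2)])   # ancilla prepared in |0>
success_block = out[:2]                        # branch where the ancilla reads "success"
p_success = float(np.sum(np.abs(success_block) ** 2))
# On success the renormalized system state is M|psi> / ||M|psi>||;
# on failure the computation is repeated, as in the abstract.
```

The success probability is ||M|ψ⟩||², which is why the circuit is probabilistic and may need to be rerun.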

  8. Computational complexity of the landscape II-Cosmological considerations

    Science.gov (United States)

    Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire

    2018-05-01

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.

  9. Dexamethasone intravitreal implant in previously treated patients with diabetic macular edema : Subgroup analysis of the MEAD study

    OpenAIRE

    Augustin, A.J.; Kuppermann, B.D.; Lanzetta, P.; Loewenstein, A.; Li, X.; Cui, H.; Hashad, Y.; Whitcup, S.M.; Abujamra, S.; Acton, J.; Ali, F.; Antoszyk, A.; Awh, C.C.; Barak, A.; Bartz-Schmidt, K.U.

    2015-01-01

    Background Dexamethasone intravitreal implant 0.7 mg (DEX 0.7) was approved for treatment of diabetic macular edema (DME) after demonstration of its efficacy and safety in the MEAD registration trials. We performed subgroup analysis of MEAD study results to evaluate the efficacy and safety of DEX 0.7 treatment in patients with previously treated DME. Methods Three-year, randomized, sham-controlled phase 3 study in patients with DME, best-corrected visual acuity (BCVA) of 34-68 Early Treatment...

  10. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  11. Computational techniques in gamma-ray skyshine analysis

    International Nuclear Information System (INIS)

    George, D.L.

    1988-12-01

    Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs

  12. Cone beam x-ray luminescence computed tomography: a feasibility study.

    Science.gov (United States)

    Chen, Dongmei; Zhu, Shouping; Yi, Huangjian; Zhang, Xianghan; Chen, Duofang; Liang, Jimin; Tian, Jie

    2013-03-01

    The appearance of x-ray luminescence computed tomography (XLCT) opens new possibilities to perform molecular imaging by x ray. In the previous XLCT system, the sample was irradiated by a sequence of narrow x-ray beams and the x-ray luminescence was measured by a highly sensitive charge coupled device (CCD) camera. This resulted in a relatively long sampling time and relatively low utilization of the x-ray beam. In this paper, a novel cone beam x-ray luminescence computed tomography strategy is proposed, which can fully utilize the x-ray dose and shorten the scanning time. The imaging model and reconstruction method are described. The validity of the imaging strategy has been studied in this paper. In the cone beam XLCT system, the cone beam x ray was adopted to illuminate the sample and a highly sensitive CCD camera was utilized to acquire luminescent photons emitted from the sample. Photons scattering in biological tissues makes it an ill-posed problem to reconstruct the 3D distribution of the x-ray luminescent sample in the cone beam XLCT. In order to overcome this issue, the authors used the diffusion approximation model to describe the photon propagation in tissues, and employed the sparse regularization method for reconstruction. An incomplete variables truncated conjugate gradient method and permissible region strategy were used for reconstruction. Meanwhile, traditional x-ray CT imaging could also be performed in this system. The x-ray attenuation effect has been considered in their imaging model, which is helpful in improving the reconstruction accuracy. First, simulation experiments with cylinder phantoms were carried out to illustrate the validity of the proposed compensated method. The experimental results showed that the location error of the compensated algorithm was smaller than that of the uncompensated method. The permissible region strategy was applied and reduced the reconstruction error to less than 2 mm. The robustness and stability were then
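The reconstruction step described above is a sparsity-regularized inverse problem. As an illustration only (the paper uses an incomplete variables truncated conjugate gradient method with a permissible region strategy, not this algorithm), a minimal ISTA solver for min_x ½‖Ax − b‖² + λ‖x‖₁ looks like:

```python
import numpy as np

def ista(A, b, lam, step, iters):
    """Iterative shrinkage-thresholding for 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - b))                    # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold (prox of l1)
    return x
```

A quick sanity check: with A = I and step = 1 the iteration reduces to soft-thresholding of b, which is the exact l1-regularized solution in that case.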

  13. Using Volunteer Computing to Study Some Features of Diagonal Latin Squares

    Science.gov (United States)

    Vatutin, Eduard; Zaikin, Oleg; Kochemazov, Stepan; Valyaev, Sergey

    2017-12-01

    This study concerns several features of diagonal Latin squares (DLSs) of small order. The authors suggest an algorithm for computing the minimal and maximal numbers of transversals of DLSs. According to this algorithm, all DLSs of a particular order are generated, and for each square all its transversals and diagonal transversals are constructed. The algorithm was implemented and applied to DLSs of order at most 7 on a personal computer. The experiment for order 8 was performed in the volunteer computing project Gerasim@home. In addition, the problem of finding pairs of orthogonal DLSs of order 10 was considered and reduced to the Boolean satisfiability problem. The resulting problem turned out to be very hard, so it was decomposed into a family of subproblems. To solve it, the volunteer computing project SAT@home was used. As a result, several dozen pairs of the described kind were found.
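The objects involved are concrete enough to sketch directly. The following brute-force check (feasible only for small orders, unlike the project's optimized code) verifies the diagonal Latin square property and counts transversals; the order-4 example square is illustrative:

```python
from itertools import permutations

def is_diagonal_latin(sq):
    """Rows, columns, and both main diagonals each contain every symbol exactly once."""
    n = len(sq)
    symbols = set(range(n))
    rows = all(set(row) == symbols for row in sq)
    cols = all({sq[i][j] for i in range(n)} == symbols for j in range(n))
    diag = {sq[i][i] for i in range(n)} == symbols
    anti = {sq[i][n - 1 - i] for i in range(n)} == symbols
    return rows and cols and diag and anti

def count_transversals(sq):
    """A transversal picks one cell per row and column with all symbols distinct."""
    n = len(sq)
    return sum(
        len({sq[i][p[i]] for i in range(n)}) == n
        for p in permutations(range(n))
    )

dls4 = [[0, 1, 2, 3],
        [2, 3, 0, 1],
        [3, 2, 1, 0],
        [1, 0, 3, 2]]
```

Enumerating all n! column choices is exponential, which is why the abstract's experiments beyond order 7 required volunteer computing.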

  14. CHPS IN CLOUD COMPUTING ENVIRONMENT

    OpenAIRE

    K.L.Giridas; A.Shajin Nargunam

    2012-01-01

    Workflows have been used to characterize various forms of applications with high processing and storage-space demands. To make the cloud computing environment more eco-friendly, our research project aims at reducing the e-waste accumulated by computers. In a hybrid cloud, the user has the flexibility offered by public cloud resources, which can be combined with the private resource pool as required. Our previous work described the process of combining the low range and mid range proce...

  15. A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class

    Science.gov (United States)

    Yuen, Timothy T.; Robbins, Kay A.

    2014-01-01

    Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…

  16. Study on computer-aided diagnosis of hepatic MR imaging and mammography

    International Nuclear Information System (INIS)

    Zhang Xuejun

    2005-01-01

    It is well known that the liver is an organ easily attacked by disease. The purpose of this study is to develop a computer-aided diagnosis (CAD) scheme to help radiologists differentiate hepatic diseases more efficiently. Our software, named LIVERANN, integrates magnetic resonance (MR) imaging findings from different pulse sequences to classify five categories of hepatic disease using the artificial neural network (ANN) method. The intensity and homogeneity within a region of interest (ROI) delineated by a radiologist were automatically calculated by the program to provide numerical input signals to the ANN. Outputs were the five pathological categories of hepatic disease (hepatic cyst, hepatocellular carcinoma, dysplasia in cirrhosis, cavernous hemangioma, and metastasis). The experiment demonstrated a testing accuracy of 93% on 80 patients. To differentiate cirrhosis from normal liver, the volume ratio of the left lobe to the whole liver (LTW) was proposed to quantify the degree of cirrhosis by three-dimensional (3D) volume analysis. The liver region was first extracted from computed tomography (CT) or MR slices using edge-detection algorithms, and then separated into left and right lobes by the hepatic umbilical fissure. The LTW ratio differed significantly between cirrhotic livers (25.6% ± 4.3%) and normal livers (16.4% ± 5.4%), markedly improving differentiation performance. In addition, the application of the ANN method for detecting clustered microcalcifications in masses on mammograms is described here as well. A new structural ANN, the so-called shift-invariant artificial neural network (SIANN), was integrated with our triple-ring filter (TRF) method in our CAD system. As a result, the sensitivity of detecting clusters was improved from 90% with our previous TRF method to 95% using both SIANN and TRF.
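The ANN inputs mentioned above (intensity and homogeneity within an ROI) are simple to compute. The exact feature definitions used by LIVERANN are not given in the abstract, so the variance-based homogeneity score below is one plausible, hypothetical choice:

```python
import numpy as np

def roi_features(image, mask):
    """Mean intensity and a homogeneity score in (0, 1] for the pixels
    selected by a boolean ROI mask. Homogeneity here is 1/(1 + variance),
    an assumed illustrative definition, equal to 1.0 for a uniform ROI."""
    vals = image[mask].astype(float)
    mean_intensity = float(vals.mean())
    homogeneity = 1.0 / (1.0 + float(vals.var()))
    return mean_intensity, homogeneity
```

These two scalars per pulse sequence would then form the numerical input vector fed to the classifier.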

  17. Computer-aided cleanup

    International Nuclear Information System (INIS)

    Williams, J.; Jones, B.

    1994-01-01

    In late 1992, the remedial investigation of operable unit 2 at the Department of Energy (DOE) Superfund site in Fernald, Ohio was in trouble. Despite years of effort--including an EPA-approved field-investigation work plan, 123 soil borings, 51 ground-water-monitoring wells, analysis of more than 650 soil and ground-water samples, and preparation of a draft remedial-investigation (RI) report--it was not possible to conclude if contaminated material in the unit was related to ground-water contamination previously detected beneath and beyond the site boundary. Compounding the problem, the schedule for the RI, feasibility study and record of decision for operable unit 2 was governed by a DOE-EPA consent agreement stipulating penalties of up to $10,000 per week for not meeting scheduled milestones--and time was running out. An advanced three-dimensional computer model confirmed that radioactive wastes dumped at the Fernald, Ohio Superfund site had contaminated ground water, after years of previous testing has been inconclusive. The system is now being used to aid feasibility and design work on the more-than-$1 billion remediation project

  18. Computational studies on energetic properties of nitrogen-rich ...

    Indian Academy of Sciences (India)

    Computational studies on energetic properties of nitrogen-rich energetic materials with ditetrazoles. LI XIAO-HONG and ZHANG RUI-ZHOU. College of Physics and Engineering, Henan University of Science and Technology, Luoyang 471 003, China; Luoyang Key Laboratory of Photoelectric Functional Materials, ...

  19. Temporal trends in compliance with appropriateness criteria for stress single-photon emission computed tomography sestamibi studies in an academic medical center.

    Science.gov (United States)

    Gibbons, Raymond J; Askew, J Wells; Hodge, David; Miller, Todd D

    2010-03-01

    The purpose of this study was to apply published appropriateness criteria for single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) in a single academic medical center to determine if the percentage of inappropriate studies was changing over time. In a previous study, we applied the American College of Cardiology Foundation/American Society of Nuclear Cardiology (ASNC) appropriateness criteria for stress SPECT MPI and reported that 14% of stress SPECT studies were performed for inappropriate reasons. Using similar methodology, we retrospectively examined 284 patients who underwent stress SPECT MPI in October 2006 and compared the findings to the previous cohort of 284 patients who underwent stress SPECT MPI in May 2005. The indications for testing in the 2 cohorts were very similar. The overall level of agreement in characterizing categories of appropriateness between 2 experienced cardiovascular nurse abstractors was good (kappa = 0.68), which represented an improvement from our previous study (kappa = 0.56). There was a significant change between May 2005 and October 2006 in the overall classification of categories for appropriateness (P = .024 by the chi-square statistic). There were modest, but insignificant, increases in the number of patients who were unclassified (15% in the current study vs 11% previously), appropriate (66% vs 64%), and uncertain (12% vs 11%). Only 7% of the studies in the current study were inappropriate, which represented a significant (P = .004) decrease from the 14% reported in the 2005 cohort. In the absence of any specific intervention, there was a significant change in the overall classification of SPECT appropriateness in an academic medical center over 17 months. The only significant difference in individual categories was a decrease in inappropriate studies. Additional measurements over time will be required to determine if this trend is sustainable or generalizable.
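The agreement statistic reported above, Cohen's kappa, corrects observed agreement for agreement expected by chance: κ = (p_o − p_e)/(1 − p_e). A minimal implementation, exercised on made-up labels rather than the study's data:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters' category labels."""
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n          # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1.keys() | c2.keys()) / n**2  # chance agreement
    return (p_o - p_e) / (1 - p_e)
```

A kappa of 0.68, as in the current cohort, sits in the range conventionally described as good agreement.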

  20. Understanding initial undergraduate expectations and identity in computing studies

    Science.gov (United States)

    Kinnunen, Päivi; Butler, Matthew; Morgan, Michael; Nylen, Aletta; Peters, Anne-Kathrin; Sinclair, Jane; Kalvala, Sara; Pesonen, Erkki

    2018-03-01

    There is growing appreciation of the importance of understanding the student perspective in Higher Education (HE) at both institutional and international levels. This is particularly important in Science, Technology, Engineering and Mathematics subjects such as Computer Science (CS) and Engineering, in which industry needs are high but so are student dropout rates. An important factor to consider is the management of students' initial expectations of university study and career. This paper reports on a study of CS first-year students' expectations across three European countries using qualitative data from student surveys and essays. Expectation is examined from both short-term (topics to be studied) and long-term (career goals) perspectives. Tackling these issues will help paint a picture of computing education through students' eyes and explore their vision of computing's role, and of their own role, in society. It will also help educators prepare students more effectively for university study and improve the student experience.

  1. Computer simulation of radiographic images sharpness in several system of image record

    International Nuclear Information System (INIS)

    Silva, Marcia Aparecida; Schiable, Homero; Frere, Annie France; Marques, Paulo M.A.; Oliveira, Henrique J.Q. de; Alves, Fatima F.R.; Medeiros, Regina B.

    1996-01-01

    A method for predicting, by computer simulation, the influence of the recording system on radiographic image sharpness is studied. The method is intended to show in advance the image that would be obtained with each type of film or screen-film combination used during the exposure.

  2. Efficient computation method of Jacobian matrix

    International Nuclear Information System (INIS)

    Sasaki, Shinobu

    1995-05-01

    As is well known, the elements of the Jacobian matrix are complex trigonometric functions of the joint angles, resulting in a matrix of staggering complexity when written out in full. This article shows how these difficulties are overcome by using a velocity representation. The main point is that its recursive algorithm and computer-algebra technologies allow analytical formulations to be derived with no human intervention. In particular, compared with previous results, the elements are greatly simplified through the effective use of frame transformations. Furthermore, in the case of a spherical wrist, the present approach is shown to be computationally the most efficient. Owing to these advantages, the proposed method is useful in studying kinematically peculiar properties such as singularity problems. (author)
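The velocity-representation idea can be illustrated for a planar revolute chain, where each geometric Jacobian column is simply ẑ × (p_end − p_i), with p_i the position of joint i, rather than an expanded trigonometric expression. This is a simplified planar sketch of the general principle, not the paper's recursive algorithm, validated here against numerical differentiation of the forward kinematics:

```python
import numpy as np

def joint_positions(thetas, lengths):
    """Positions of each joint and the end effector of a planar revolute chain."""
    x = y = phi = 0.0
    pts = [np.zeros(2)]
    for t, l in zip(thetas, lengths):
        phi += t
        x += l * np.cos(phi)
        y += l * np.sin(phi)
        pts.append(np.array([x, y]))
    return pts

def jacobian(thetas, lengths):
    """Geometric Jacobian: column i is z_hat x (p_end - p_joint_i)."""
    pts = joint_positions(thetas, lengths)
    p_end = pts[-1]
    cols = []
    for i in range(len(thetas)):
        r = p_end - pts[i]
        cols.append(np.array([-r[1], r[0]]))  # planar cross product with z_hat
    return np.column_stack(cols)
```

Each column costs only a subtraction and a swap of components, which is the kind of simplification the velocity representation delivers over symbolic expansion.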

  3. Challenging previous conceptions of vegetarianism and eating disorders.

    Science.gov (United States)

    Fisak, B; Peterson, R D; Tantleff-Dunn, S; Molnar, J M

    2006-12-01

    The purpose of this study was to replicate and expand upon previous research that has examined the potential association between vegetarianism and disordered eating. Limitations of previous research studies are addressed, including possible low reliability of measures of eating pathology within vegetarian samples, use of only a few dietary restraint measures, and a paucity of research examining potential differences in body image and food choice motives of vegetarians versus nonvegetarians. Two hundred and fifty-six college students completed a number of measures of eating pathology and body image, and a food choice motives questionnaire. Interestingly, no significant differences were found between vegetarians and nonvegetarians in measures of eating pathology or body image. However, significant differences in food choice motives were found. Implications for both researchers and clinicians are discussed.

  4. A portable grid-enabled computing system for a nuclear material study

    International Nuclear Information System (INIS)

    Tsujita, Yuichi; Arima, Tatsumi; Takekawa, Takayuki; Suzuki, Yoshio

    2010-01-01

    We have built a portable grid-enabled computing system specialized for our molecular dynamics (MD) simulation program to study Pu materials easily. Experimental approaches to revealing the properties of Pu materials are often accompanied by difficulties such as the radiotoxicity of actinides. Since a computational approach reveals new aspects to researchers without such radioactive facilities, we address an MD computation. In order to obtain more realistic results about, e.g., the melting point or thermal conductivity, we need large-scale parallel computations. Most application users who do not have supercomputers at their institutes must use a remote supercomputer. For such users, we have developed the portable and secured grid-enabled computing system to utilize the grid computing infrastructure provided by the Information Technology Based Laboratory (ITBL). This system enables us to access remote supercomputers in the ITBL system seamlessly from a client PC through its graphical user interface (GUI). Typically it enables seamless file access from the GUI. Furthermore, monitoring of standard output or standard error is available to follow the progress of an executing program. Since the system provides useful functionalities for parallel computing on a remote supercomputer, application users can concentrate on their research. (author)

  5. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    Science.gov (United States)

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  6. Computers and internet in dental education system of Kerala, South India: A multicentre study

    Directory of Open Access Journals (Sweden)

    Kanakath Harikumar

    2015-01-01

    Full Text Available Computers and internet have exerted a tremendous effect on dental education programs all over the world. A multicenter study was done to assess trends in computer and internet usage among the dental students and faculty members across the South Indian state, Kerala. A total of 347 subjects participated in the study. All participants were highly competent with the use of computers and internet. 72.3% of the study subjects preferred hard copy textbooks to PDF format books. 81.3% of the study subjects thought that internet was a useful adjunct to dental education. 73.8% of the study subjects opined that computers and internet could never be a replacement to conventional classroom teaching. Efforts should be made to provide greater infrastructure with regard to computers and internet such as Wi-Fi, free, unlimited internet access to all students and faculty members.

  7. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    Integration of case study approach, project design and computer modeling in managerial accounting education ... Journal of Fundamental and Applied Sciences ... in the Laboratory of Management Accounting and Controlling Systems at the ...

  8. Two Studies Examining Argumentation in Asynchronous Computer Mediated Communication

    Science.gov (United States)

    Joiner, Richard; Jones, Sarah; Doherty, John

    2008-01-01

    Asynchronous computer mediated communication (CMC) would seem to be an ideal medium for supporting development in student argumentation. This paper investigates this assumption through two studies. The first study compared asynchronous CMC with face-to-face discussions. The transactional and strategic level of the argumentation (i.e. measures of…

  9. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  10. A previous hamstring injury affects kicking mechanics in soccer players.

    Science.gov (United States)

    Navandar, Archit; Veiga, Santiago; Torres, Gonzalo; Chorro, David; Navarro, Enrique

    2018-01-10

    Although the kicking skill is influenced by limb dominance and sex, how a previous hamstring injury affects kicking has not been studied in detail. Thus, the objective of this study was to evaluate the effect of sex and limb dominance on kicking in limbs with and without a previous hamstring injury. 45 professional players (males: n=19, previously injured players=4, age=21.16 ± 2.00 years; females: n=19, previously injured players=10, age=22.15 ± 4.50 years) performed 5 kicks each with their preferred and non-preferred limb at a target 7 m away, which were recorded with a three-dimensional motion capture system. Kinematic and kinetic variables were extracted for the backswing, leg cocking, leg acceleration and follow through phases. A shorter backswing (20.20 ± 3.49% vs 25.64 ± 4.57%), and differences in knee flexion angle (58 ± 10° vs 72 ± 14°) and hip flexion velocity (8 ± 0 rad/s vs 10 ± 2 rad/s) were observed in previously injured, non-preferred limb kicks for females. A lower peak hip linear velocity (3.50 ± 0.84 m/s vs 4.10 ± 0.45 m/s) was observed in previously injured, preferred limb kicks of females. These differences occurred in the backswing and leg-cocking phases where the hamstring muscles were the most active. A variation in the functioning of the hamstring muscles and that of the gluteus maximus and iliopsoas in the case of a previous injury could account for the differences observed in the kicking pattern. Therefore, the effects of a previous hamstring injury must be considered while designing rehabilitation programs to re-educate kicking movement.

  11. Emerging Trends in Heart Valve Engineering: Part IV. Computational Modeling and Experimental Studies.

    Science.gov (United States)

    Kheradvar, Arash; Groves, Elliott M; Falahatpisheh, Ahmad; Mofrad, Mohammad K; Hamed Alavi, S; Tranquillo, Robert; Dasi, Lakshmi P; Simmons, Craig A; Jane Grande-Allen, K; Goergen, Craig J; Baaijens, Frank; Little, Stephen H; Canic, Suncica; Griffith, Boyce

    2015-10-01

    In this final portion of an extensive review of heart valve engineering, we focus on the computational methods and experimental studies related to heart valves. The discussion begins with a thorough review of computational modeling and the governing equations of fluid and structural interaction. We then move onto multiscale and disease specific modeling. Finally, advanced methods related to in vitro testing of the heart valves are reviewed. This section of the review series is intended to illustrate application of computational methods and experimental studies and their interrelation for studying heart valves.

  12. Filtration theory using computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Corey, I. [Lawrence Livermore National Lab., CA (United States)

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one, 2-D and two, 3-D fiber matrix elements. 5 refs., 11 figs.
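The superposition step described above — integrating a particle equation of motion over a precomputed velocity field — can be sketched with a simple Euler-Maruyama scheme. The drag time tau and noise strength sigma below are illustrative parameters of a generic Langevin model, not values or equations taken from the study:

```python
import numpy as np

def track(u_field, x0, v0, tau, sigma, dt, steps, rng):
    """Langevin particle tracking over a prescribed 2-D flow field:
    Stokes drag relaxes the particle velocity toward the local fluid
    velocity (inertia + interception), while a Brownian forcing term
    adds diffusion; integrated with the Euler-Maruyama scheme."""
    x, v = np.array(x0, dtype=float), np.array(v0, dtype=float)
    for _ in range(steps):
        drag = (u_field(x) - v) / tau                        # drag toward fluid velocity
        kick = sigma * np.sqrt(dt) * rng.standard_normal(2)  # Brownian increment
        v = v + drag * dt + kick
        x = x + v * dt
    return x, v
```

Combining drag and Brownian forcing in one update is the point made in the abstract: the capture mechanisms are handled by a single equation rather than treated separately. With sigma = 0 and a uniform flow, the particle velocity should relax to the fluid velocity, which makes a convenient sanity check.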

  13. Pulmonary artery aneurysm in Behcet's disease: helical computed tomography study

    International Nuclear Information System (INIS)

    Munoz, J.; Caballero, P.; Olivera, M. J.; Cajal, M. L.; Caniego, J. L.

    2000-01-01

    Behcet's disease is a vasculitis of unknown etiology that affects arteries and veins of different sizes and can be associated with pulmonary artery aneurysms. We report the case of a patient with Behcet's disease and a pulmonary artery aneurysm who was studied by means of plain chest X ray, helical computed tomography and pulmonary arteriography. Helical computed tomography is a reliable technique for the diagnosis and follow-up of these patients. (Author) 9 refs

  14. Using Computational and Mechanical Models to Study Animal Locomotion

    Science.gov (United States)

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  15. Doctors' experience with handheld computers in clinical practice: qualitative study.

    Science.gov (United States)

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-05-15

    To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. 54 doctors who did or did not use handheld computers. Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care.

  16. Students and Taxes: a Privacy-Preserving Study Using Secure Computation

    Directory of Open Access Journals (Sweden)

    Bogdanov Dan

    2016-07-01

    We describe the use of secure multi-party computation for performing a large-scale privacy-preserving statistical study on real government data. In 2015, statisticians from the Estonian Center of Applied Research (CentAR) conducted a big data study to look for correlations between working during university studies and failing to graduate in time. The study was conducted by linking the database of individual tax payments from the Estonian Tax and Customs Board and the database of higher education events from the Ministry of Education and Research. Data collection, preparation and analysis were conducted using the Sharemind secure multi-party computation system, which provided end-to-end cryptographic protection to the analysis. Using ten million tax records and half a million education records, this is the largest cryptographically private statistical study ever conducted on real data.
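
The record gives no implementation detail, but the core primitive behind secret-sharing-based multi-party computation systems such as Sharemind can be illustrated with a toy additive-sharing sketch (an illustration of the general idea only, not the actual Sharemind protocol; the modulus, party count, and data are arbitrary choices):

```python
import random

MOD = 2**61 - 1  # arbitrary large prime modulus for this sketch

def share(secret, n_parties=3):
    """Split a secret into n additive shares that sum to it modulo MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    return sum(shares) % MOD

# Toy "tax records": each value is split so no single party ever sees it.
incomes = [1200, 800, 1500]
per_party = [share(x) for x in incomes]

# Each party locally adds the shares it holds; only the combined sum is opened.
party_sums = [sum(col) % MOD for col in zip(*per_party)]
total = reconstruct(party_sums)
assert total == sum(incomes)  # aggregate revealed, individual incomes are not
```

Addition is "free" under this scheme because shares can be summed locally; multiplications and database joins require interactive protocols, which is where systems like Sharemind do the heavy lifting.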

  17. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: A cross-sectional study

    Science.gov (United States)

    2010-01-01

    Background The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Methods Mailed survey with nationally representative sample of 12- to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. Results To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position and 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds. Conclusions Ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability. PMID:20064250
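
As a minimal illustration of the kind of logistic regression analysis mentioned above, the following sketch fits a one-predictor model by stochastic gradient descent on invented data (the numbers are fabricated for illustration and bear no relation to the study's dataset):

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """One-predictor logistic regression: P(y=1) = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            # Gradient ascent on the log-likelihood, one sample at a time.
            w += lr * (y - p) * x
            b += lr * (y - p)
    return w, b

# Fabricated data: more daily computer hours -> more likely to report complaints.
hours      = [0, 1, 1, 2, 3, 4, 5, 6]
complaints = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(hours, complaints)
assert w > 0                        # positive association on this toy data
assert sigmoid(w * 6 + b) > 0.8     # heavy users predicted to report complaints
```

Real survey analyses like the one above would use multiple predictors and report odds ratios with confidence intervals, but the fitting principle is the same.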

  18. Computational study of noise in a large signal transduction network

    Directory of Open Access Journals (Sweden)

    Ruohonen Keijo

    2011-06-01

    Background Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. Results We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. Conclusions We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies.
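
The exact Gillespie stochastic simulation algorithm used in this study can be sketched for the simplest possible system, a single first-order degradation reaction A → ∅ (a toy example; the study's network couples many such reactions and species, but each step follows the same draw-time-then-fire logic):

```python
import random

def gillespie_decay(n0, k, t_end, seed=42):
    """Exact stochastic simulation of A -> 0 with rate constant k."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    trajectory = [(t, n)]
    while n > 0 and t < t_end:
        a = k * n                   # total propensity of the single reaction
        t += rng.expovariate(a)     # waiting time to next event ~ Exp(a)
        n -= 1                      # fire the only possible reaction
        trajectory.append((t, n))
    return trajectory

traj = gillespie_decay(n0=100, k=0.1, t_end=50.0)
# Sanity checks: time advances and each event removes exactly one molecule.
assert all(t2 > t1 for (t1, _), (t2, _) in zip(traj, traj[1:]))
assert all(n2 == n1 - 1 for (_, n1), (_, n2) in zip(traj, traj[1:]))
```

With multiple reactions, the algorithm additionally picks which reaction fires with probability proportional to its propensity; running many independent replicates (as the study does across 300 volumes) is what parallel computing makes practical.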

  19. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with the computer's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  20. No discrimination against previous mates in a sexually cannibalistic spider

    Science.gov (United States)

    Fromhage, Lutz; Schneider, Jutta M.

    2005-09-01

    In several animal species, females discriminate against previous mates in subsequent mating decisions, increasing the potential for multiple paternity. In spiders, female choice may take the form of selective sexual cannibalism, which has been shown to bias paternity in favor of particular males. If cannibalistic attacks function to restrict a male's paternity, females may have little interest to remate with males having survived such an attack. We therefore studied the possibility of female discrimination against previous mates in sexually cannibalistic Argiope bruennichi, where females almost always attack their mate at the onset of copulation. We compared mating latency and copulation duration of males having experienced a previous copulation either with the same or with a different female, but found no evidence for discrimination against previous mates. However, males copulated significantly shorter when inserting into a used, compared to a previously unused, genital pore of the female.

  1. Abiraterone in metastatic prostate cancer without previous chemotherapy

    NARCIS (Netherlands)

    Ryan, Charles J.; Smith, Matthew R.; de Bono, Johann S.; Molina, Arturo; Logothetis, Christopher J.; de Souza, Paul; Fizazi, Karim; Mainwaring, Paul; Piulats, Josep M.; Ng, Siobhan; Carles, Joan; Mulders, Peter F. A.; Basch, Ethan; Small, Eric J.; Saad, Fred; Schrijvers, Dirk; van Poppel, Hendrik; Mukherjee, Som D.; Suttmann, Henrik; Gerritsen, Winald R.; Flaig, Thomas W.; George, Daniel J.; Yu, Evan Y.; Efstathiou, Eleni; Pantuck, Allan; Winquist, Eric; Higano, Celestia S.; Taplin, Mary-Ellen; Park, Youn; Kheoh, Thian; Griffin, Thomas; Scher, Howard I.; Rathkopf, Dana E.; Boyce, A.; Costello, A.; Davis, I.; Ganju, V.; Horvath, L.; Lynch, R.; Marx, G.; Parnis, F.; Shapiro, J.; Singhal, N.; Slancar, M.; van Hazel, G.; Wong, S.; Yip, D.; Carpentier, P.; Luyten, D.; de Reijke, T.

    2013-01-01

    Abiraterone acetate, an androgen biosynthesis inhibitor, improves overall survival in patients with metastatic castration-resistant prostate cancer after chemotherapy. We evaluated this agent in patients who had not received previous chemotherapy. In this double-blind study, we randomly assigned

  2. Secondary recurrent miscarriage is associated with previous male birth.

    LENUS (Irish Health Repository)

    Ooi, Poh Veh

    2012-01-31

    Secondary recurrent miscarriage (RM) is defined as three or more consecutive pregnancy losses after delivery of a viable infant. Previous reports suggest that a firstborn male child is associated with less favourable subsequent reproductive potential, possibly due to maternal immunisation against male-specific minor histocompatibility antigens. In a retrospective cohort study of 85 cases of secondary RM we aimed to determine if secondary RM was associated with (i) gender of previous child, maternal age, or duration of miscarriage history, and (ii) increased risk of pregnancy complications. Fifty-three women (62.0%; 53/85) gave birth to a male child prior to RM compared to 32 (38.0%; 32/85) who gave birth to a female child (p=0.002). The majority (91.7%; 78/85) had uncomplicated, term deliveries and normal birth weight neonates, with one quarter of the women previously delivered by Caesarean section. All had routine RM investigations and 19.0% (16/85) had an abnormal result. Fifty-seven women conceived again and 33.3% (19/57) miscarried, but there was no significant difference in failure rates between those with a previous male or female child (13/32 vs. 6/25, p=0.2). When patients with abnormal results were excluded, or when women with only one previous child were considered, there was still no difference in these rates. A previous male birth may be associated with an increased risk of secondary RM but numbers preclude concluding whether this increases recurrence risk. The suggested association with previous male birth provides a basis for further investigations at a molecular level.

  4. Computer vision syndrome: a study of knowledge and practices in university students.

    Science.gov (United States)

    Reddy, S C; Low, C K; Lim, Y P; Low, L L; Mardina, F; Nursaleha, M P

    2013-01-01

    Computer vision syndrome (CVS) is a condition in which a person experiences one or more eye symptoms as a result of prolonged work on a computer. To determine the prevalence of CVS symptoms, knowledge and practices of computer use in students studying in different universities in Malaysia, and to evaluate the association of various factors in computer use with the occurrence of symptoms. In a cross-sectional questionnaire survey study, data were collected from college students regarding demography, use of spectacles, duration of daily continuous use of computer, symptoms of CVS, preventive measures taken to reduce the symptoms, use of radiation filter on the computer screen, and lighting in the room. A total of 795 students, aged between 18 and 25 years, from five universities in Malaysia were surveyed. The prevalence of symptoms of CVS (one or more) was found to be 89.9%; the most disturbing symptom was headache (19.7%) followed by eye strain (16.4%). Students who used a computer for more than 2 hours per day experienced significantly more symptoms of CVS (p=0.0001). Looking at far objects in between work was significantly (p=0.0008) associated with a lower frequency of CVS symptoms. The use of a radiation filter on the screen (p=0.6777) did not help in reducing the CVS symptoms. Ninety percent of university students in Malaysia experienced symptoms related to CVS, which was seen more often in those who used a computer for more than 2 hours continuously per day.

  5. Computed tomography evaluation of the sacroiliac joints in Crohn disease

    International Nuclear Information System (INIS)

    Scott, W.W. Jr.; Fishman, E.K.; Kuhlman, J.E.; Caskey, C.I.; O'Brien, J.J.; Walia, G.S.; Bayless, T.M.

    1990-01-01

    Computed tomography (CT) was used in a prospective study of the sacroiliac joints in 86 patients with Crohn disease to determine the type and frequency of sacroiliac joint abnormalities present in this population. The CT findings were correlated with review of the clinical history in 64 patients. Computed tomography demonstrated changes of sacroiliitis in 29% of the study group. This high prevalence of sacroiliac joint abnormality was found even in those under 30 years of age. It exceeds the 11-19% previously reported from plain film examination, reflecting the greater sensitivity of CT. In the subgroup of 64 patients studied clinically, 19 (30%) had abnormal sacroiliac joints on CT, but only 2 (3%) reported symptoms related to the sacroiliac joints. (orig.)

  6. Biased ART: a neural architecture that shifts attention toward previously disregarded features following an incorrect prediction.

    Science.gov (United States)

    Carpenter, Gail A; Gaddam, Sai Chaitanya

    2010-04-01

    Memories in Adaptive Resonance Theory (ART) networks are based on matched patterns that focus attention on those portions of bottom-up inputs that match active top-down expectations. While this learning strategy has proved successful for both brain models and applications, computational examples show that attention to early critical features may later distort memory representations during online fast learning. For supervised learning, biased ARTMAP (bARTMAP) solves the problem of over-emphasis on early critical features by directing attention away from previously attended features after the system makes a predictive error. Small-scale, hand-computed analog and binary examples illustrate key model dynamics. Two-dimensional simulation examples demonstrate the evolution of bARTMAP memories as they are learned online. Benchmark simulations show that featural biasing also improves performance on large-scale examples. One example, which predicts movie genres and is based, in part, on the Netflix Prize database, was developed for this project. Both first principles and consistent performance improvements on all simulation studies suggest that featural biasing should be incorporated by default in all ARTMAP systems. Benchmark datasets and bARTMAP code are available from the CNS Technology Lab Website: http://techlab.bu.edu/bART/. Copyright 2009 Elsevier Ltd. All rights reserved.

  7. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.

  8. An Introduction to Quantum Computing, Without the Physics

    OpenAIRE

    Nannicini, Giacomo

    2017-01-01

    This paper is a gentle but rigorous introduction to quantum computing intended for discrete mathematicians. Starting from a small set of assumptions on the behavior of quantum computing devices, we analyze their main characteristics, stressing the differences with classical computers, and finally describe two well-known algorithms (Simon's algorithm and Grover's algorithm) using the formalism developed in previous sections. This paper does not touch on the physics of the devices, and therefore...
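
As a flavor of one of the algorithms the paper covers, the amplitude bookkeeping of Grover's algorithm for a four-item search can be simulated classically (a sketch for intuition only: this mirrors the linear algebra on a plain array of amplitudes, not actual quantum hardware, and for four items a single iteration finds the marked element with certainty):

```python
def grover_search(n_items, marked):
    """Classically simulate the amplitudes of one Grover iteration."""
    amp = [1 / n_items**0.5] * n_items   # uniform superposition over all items
    # Oracle: flip the sign of the marked item's amplitude.
    amp[marked] = -amp[marked]
    # Diffusion operator: reflect every amplitude about the mean amplitude.
    mean = sum(amp) / n_items
    amp = [2 * mean - a for a in amp]
    return amp

amp = grover_search(4, marked=2)
assert abs(amp[2] - 1.0) < 1e-9                    # marked item has probability 1
assert all(abs(a) < 1e-9 for a in amp[:2] + amp[3:])
```

For general n, roughly (π/4)·√n iterations of the same oracle-then-diffusion pair are needed, which is the source of Grover's quadratic speedup over classical search.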

  9. Toward Confirming a Framework for Securing the Virtual Machine Image in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Raid Khalid Hussein

    2017-04-01

    The concept of cloud computing has arisen thanks to academic work in the fields of utility computing, distributed computing, virtualisation, and web services. By using cloud computing, which can be accessed from anywhere, newly-launched businesses can minimise their start-up costs. Among the most important notions when it comes to the construction of cloud computing is virtualisation. While this concept brings its own security risks, these risks are not necessarily related to the cloud. The main disadvantage of using cloud computing is linked to safety and security. This is because anyone who chooses to employ cloud computing will use someone else's hard disk and CPU in order to sort and store data. In cloud environments, a great deal of importance is placed on guaranteeing that the virtual machine image is safe and secure. Indeed, a previous study has put forth a framework with which to protect the virtual machine image in cloud computing. As such, the present study is primarily concerned with confirming this theoretical framework so as to ultimately secure the virtual machine image in cloud computing. This will be achieved by carrying out interviews with experts in the field of cloud security.

  10. Proposing Hybrid Architecture to Implement Cloud Computing in Higher Education Institutions Using a Meta-synthesis Approach

    Directory of Open Access Journals (Sweden)

    hamid reza bazi

    2017-12-01

    Cloud computing is a new technology that considerably helps Higher Education Institutions (HEIs) to develop and create competitive advantage with inherent characteristics such as flexibility, scalability, accessibility, reliability, fault tolerance and economic efficiency. Due to the numerous advantages of cloud computing, and in order to take advantage of cloud computing infrastructure, services of universities and HEIs need to migrate to the cloud. However, this transition involves many challenges, one of which is the lack or shortage of appropriate architecture for migration to the technology. Using a reliable architecture for migration helps managers mitigate risks in the cloud computing technology. Therefore, organizations always search for suitable cloud computing architecture. In previous studies, these important features have received little attention and have not been addressed in a comprehensive way. The aim of this study is to use a meta-synthesis method for the first time to analyze the previously published studies and to suggest an appropriate hybrid cloud migration architecture (IUHEC). We reviewed many papers from relevant journals and conference proceedings. The concepts extracted from these papers are classified into related categories and sub-categories. Then, we developed our proposed hybrid architecture based on these concepts and categories. The proposed architecture was validated by a panel of experts and Lawshe’s model was used to determine the content validity. Due to its innovative yet user-friendly nature, comprehensiveness, and high security, this architecture can help HEIs have an effective migration to a cloud computing environment.

  11. How Computer Music Modeling and Retrieval Interface with Music-And-Meaning Studies

    DEFF Research Database (Denmark)

    Grund, Cynthia M.

    2007-01-01

    Inspired by the interest generated as the result of a panel discussion dealing with cross-disciplinarity and computer music modeling and retrieval (CMMR) at CMMR 2005 - "Play!" - in Pisa, a panel discussion on current issues of a cross-disciplinary character has been organized for ICMC07/CMMR 2007. Eight panelists will be dealing with two questions: a) What are current issues within music-and-meaning studies, the examination of which mandates development of new techniques within computer music modeling and retrieval? and b) Do current techniques within computer music modeling and retrieval give rise to new questions within music-and-meaning studies?

  12. A longitudinal study of plasma insulin and glucagon in women with previous gestational diabetes

    DEFF Research Database (Denmark)

    Damm, P; Kühl, C; Hornnes, P

    1995-01-01

    OBJECTIVE: To investigate whether plasma insulin or glucagon predicts later development of diabetes in women with gestational diabetes mellitus (GDM). RESEARCH DESIGN AND METHODS: The subjects studied were 91 women with diet-treated GDM and 33 healthy women. Plasma insulin and glucagon during a 50 […] at follow-up (2 had insulin-dependent diabetes mellitus, 13 had non-insulin-dependent diabetes mellitus, and 12 had impaired glucose tolerance). Compared with the control subjects, women with previous GDM had relatively impaired insulin secretion (decreased insulinogenic index and delayed peak insulin […] for subsequent development of overt diabetes (logistic regression analysis). CONCLUSIONS: Women who develop GDM have a relative insulin secretion deficiency, the severity of which is predictive for later development of diabetes. Furthermore, our data indicate that their relatively reduced beta-cell function may […]

  13. A checkpoint compression study for high-performance computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Ibtesham, Dewan [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Computer Science; Ferreira, Kurt B. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Scalable System Software Dept.; Arnold, Dorian [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Computer Science

    2015-02-17

    As high-performance computing systems continue to increase in size and complexity, higher failure rates and increased overheads for checkpoint/restart (CR) protocols have raised concerns about the practical viability of CR protocols for future systems. Previously, compression has proven to be a viable approach for reducing checkpoint data volumes and, thereby, reducing CR protocol overhead, leading to improved application performance. In this article, we further explore compression-based CR optimization by examining its baseline performance and scaling properties, evaluating whether improved compression algorithms might lead to even better application performance, and comparing checkpoint compression against and alongside other software- and hardware-based optimizations. Our key findings are: (1) compression is a very viable CR optimization; (2) generic, text-based compression algorithms appear to perform near optimally for checkpoint data compression, and faster compression algorithms will not lead to better application performance; (3) compression-based optimizations fare well against and alongside other software-based optimizations; and (4) while hardware-based optimizations outperform software-based ones, they are not as cost effective.
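
The effect the authors measure can be reproduced in miniature with a generic, text-based compressor from the standard library (zlib here; this sketch makes no claim about the specific algorithms or checkpoint formats the article evaluated, and the "checkpoint" data is fabricated):

```python
import zlib

def checkpoint_ratio(data, level=6):
    """Return original-size / compressed-size for a checkpoint byte blob."""
    compressed = zlib.compress(data, level)
    return len(data) / len(compressed)

# Simulated checkpoint: application state with regular structure compresses
# dramatically, which is what makes checkpoint compression worthwhile.
regular = bytes(range(256)) * 4096          # 1 MiB of highly repetitive data
ratio = checkpoint_ratio(regular)
assert ratio > 10                           # structured state shrinks a lot
```

The trade-off the article quantifies is whether the CPU time spent compressing is repaid by the reduced I/O volume; for checkpoint data with real redundancy, even a modest ratio can make that a net win.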

  14. Computer access security code system

    Science.gov (United States)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
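
The challenge-response scheme in this record can be sketched as follows (a simplified toy: the matrix size, subset length, and random selection here are assumptions, and the patent's bookkeeping that retires used subsets after one use is omitted):

```python
import random
import string

SIZE = 5
rng = random.Random(7)
# Matrix of random alpha-numeric character subsets, one subset per cell.
matrix = [[''.join(rng.sample(string.ascii_uppercase + string.digits, 3))
           for _ in range(SIZE)] for _ in range(SIZE)]

def challenge(rng):
    """Pick two cells in different rows AND different columns: opposite
    corners of a theoretical rectangle in the matrix."""
    r1, r2 = rng.sample(range(SIZE), 2)
    c1, c2 = rng.sample(range(SIZE), 2)
    return (r1, c1), (r2, c2)

def correct_response(corner_a, corner_b):
    """A valid reply transmits the subsets at the rectangle's other two corners."""
    (r1, c1), (r2, c2) = corner_a, corner_b
    return [matrix[r1][c2], matrix[r2][c1]]

a, b = challenge(rng)
reply = correct_response(a, b)
assert len(reply) == 2 and all(len(s) == 3 for s in reply)
```

Because the two challenge corners and the two response corners together define the rectangle, an eavesdropper who records one exchange learns only those cells, and never reusing a subset (as the record specifies) defeats replay of observed responses.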

  15. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy

    OpenAIRE

    Morimoto, Satoshi; Remijn, Gerard B.; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects ...

  16. DUBNA-GRAN SASSO: Satellite computer link

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    In April a 64 kbit/s computer communication link was set up between the Joint Institute for Nuclear Research (JINR), Dubna (Russia) and Gran Sasso (Italy) Laboratories via nearby ground satellite stations using the INTELSAT V satellite. Previously the international community of Dubna's experimentalists and theorists (high energy physics, condensed matter physics, low energy nuclear and neutron physics, accelerator and applied nuclear physics) had no effective computer links with scientific centres worldwide

  17. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers in implementing Computer Science curricula in classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  18. Computer-aided classification of lung nodules on computed tomography images via deep learning technique

    Directory of Open Access Journals (Sweden)

    Hua KL

    2015-08-01

    Kai-Lung Hua, Che-Hao Hsu, Shintami Chusnul Hidayati, Wen-Huang Cheng, Yu-Jen Chen (Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology; Research Center for Information Technology Innovation, Academia Sinica; Department of Radiation Oncology, MacKay Memorial Hospital, Taipei, Taiwan) Abstract: Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scans is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and tuning of performance in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. Keywords: nodule classification, deep learning, deep belief network, convolutional neural network

  19. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

    Full Text Available Effective implementation of a pallet pooling system needs a strong information platform to support it. Through an analysis of existing pallet pooling information platforms (PPIP), the paper pointed out that existing studies of PPIP are mainly based on traditional IT infrastructures and technologies, which impose software, hardware, resource-utilization, and process restrictions. Because the advantages of cloud computing technology, such as strong computing power, high flexibility, and low cost, meet the requirements of a PPIP well, this paper gave a two-part PPIP architecture based on cloud computing: the user client and the cloud services. The cloud services include three layers: IaaS, PaaS, and SaaS. Finally, a method for deploying the PPIP on cloud computing is proposed.

  20. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Daniel [University of Iowa; Berzins, Martin [University of Utah; Pennington, Robert; Sarkar, Vivek [Rice University; Taylor, Valerie [Texas A&M University

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  1. Asymmetric energy flow in liquid alkylbenzenes: A computational study

    International Nuclear Information System (INIS)

    Leitner, David M.; Pandey, Hari Datt

    2015-01-01

    Ultrafast IR-Raman experiments on substituted benzenes [B. C. Pein et al., J. Phys. Chem. B 117, 10898–10904 (2013)] reveal that energy can flow more efficiently in one direction along a molecule than in others. We carry out a computational study of energy flow in the three alkyl benzenes, toluene, isopropylbenzene, and t-butylbenzene, studied in these experiments, and find an asymmetry in the flow of vibrational energy between the two chemical groups of the molecule due to quantum mechanical vibrational relaxation bottlenecks, which give rise to a preferred direction of energy flow. We compare energy flow computed for all modes of the three alkylbenzenes over the relaxation time into the liquid with energy flow through the subset of modes monitored in the time-resolved Raman experiments and find qualitatively similar results when using the subset compared to all the modes

  2. A study about the interest and previous contact of high school students with Astronomy

    Science.gov (United States)

    Carvalho, C. L.; Zanitti, M. H. R.; Felicidade, B. L.; Gomes, A. D. T.; Dias, E. W.; Coelho, F. O.

    2016-04-01

    The current problems in Astronomy teaching in Brazilian Basic Education contrast with the space and popularity that astronomical themes enjoy in various media in the country. In this work, we present the results of a study about the interest in, and previous contact with, Astronomy among high school students from a public school in the city of "São João del-Rei"/MG. The study and the pedagogical intervention were carried out by students of the PIBID/CAPES/UFSJ. The intervention was performed through an oral exposition with the students' participation, followed by the use of the Stellarium program. The results suggest that the majority of students surveyed are interested in Astronomy and have had some contact with the area. However, some inconsistencies in their responses were identified and examined. The implications for research and for Astronomy Education are discussed. We also make some considerations about the relationship between the lack of specific knowledge and misinformation as one possible reason for the students' limited interest in various areas of Science.

  3. Strictly contractive quantum channels and physically realizable quantum computers

    International Nuclear Information System (INIS)

    Raginsky, Maxim

    2002-01-01

    We study the robustness of quantum computers under the influence of errors modeled by strictly contractive channels. A channel T is defined to be strictly contractive if, for any pair of density operators ρ, σ in its domain, ||Tρ - Tσ||₁ ≤ k||ρ - σ||₁ for some 0 ≤ k < 1, where ||·||₁ denotes the trace norm. In other words, strictly contractive channels render the states of the computer less distinguishable in the sense of quantum detection theory. Starting from the premise that all experimental procedures can be carried out with finite precision, we argue that there exists a physically meaningful connection between strictly contractive channels and errors in physically realizable quantum computers. We show that, in the absence of error correction, sensitivity of quantum memories and computers to strictly contractive errors grows exponentially with storage time and computation time, respectively, and depends only on the constant k and the measurement precision. We prove that strict contractivity rules out the possibility of perfect error correction, and give an argument that approximate error correction, which covers previous work on fault-tolerant quantum computation as a special case, is possible
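
    The contraction inequality can be checked numerically for the simplest example of a strictly contractive channel, the qubit depolarizing channel, for which k = 1 - p. The sketch below (NumPy, a hypothetical p = 0.3) also shows the exponential loss of distinguishability under repeated application that the abstract describes.

```python
import numpy as np

def trace_norm(A):
    # ||A||_1 = sum of singular values of A
    return np.linalg.svd(A, compute_uv=False).sum()

def depolarize(rho, p):
    # qubit depolarizing channel: T(rho) = (1 - p) rho + p I/2
    return (1 - p) * rho + p * np.eye(2) / 2

rho   = np.array([[1, 0], [0, 0]], dtype=float)   # |0><0|
sigma = np.array([[0, 0], [0, 1]], dtype=float)   # |1><1|
p = 0.3

d = trace_norm(rho - sigma)   # 2.0: the states start perfectly distinguishable
for n in range(1, 4):
    rho, sigma = depolarize(rho, p), depolarize(sigma, p)
    # distinguishability shrinks by k = 1 - p at every step: 2 * (1 - p)**n
    print(n, trace_norm(rho - sigma))
```

    Since the identity parts cancel in the difference, each application multiplies ||ρ - σ||₁ by exactly 1 - p, so after n steps the trace distance is 2(1 - p)ⁿ, the exponential sensitivity the abstract refers to.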

  4. Ifosfamide in previously untreated disseminated neuroblastoma. Results of Study 3A of the European Neuroblastoma Study Group.

    Science.gov (United States)

    Kellie, S J; De Kraker, J; Lilleyman, J S; Bowman, A; Pritchard, J

    1988-05-01

    A prospective study of the effectiveness of ifosfamide as a single agent in the management of previously untreated patients with Evans stage IV neuroblastoma was undertaken. Eighteen children aged more than 1 year were treated with ifosfamide (IFX) 3 g/m2 daily for 2 days immediately after diagnosis and 3 weeks later. Treatment was continued with combination chemotherapy using vincristine, cyclophosphamide, cisplatinum and etoposide (OPEC) or a variant. Mesna (2-mercaptoethane sulphonate) was given to all patients during IFX treatment to prevent urotoxicity. Eight of the 18 patients (44%) responded to IFX. Nine had greater than 66% reduction in baseline tumor volume. Of 15 evaluable patients with raised pre-treatment urinary catecholamine excretion, six (40%) achieved greater than 50% reduction in pretreatment levels. Two of 10 patients evaluable for bone marrow response had complete clearance. Toxicity was mild in all patients. Upon completing 'first line' therapy, only four patients (22%) achieved a good partial remission (GPR) or complete response (CR). Median survival was 11 months. There was a lower rate of attaining GPR and shortened median survival in patients receiving phase II IFX before OPEC or variant, compared to patients with similar pre-treatment characteristics treated with OPEC from diagnosis in an earlier study.

  5. Reverse logistics system planning for recycling computers hardware: A case study

    Science.gov (United States)

    Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar

    2014-09-01

    This paper describes modeling and simulation of reverse logistics networks for collection of used computers at one of the companies in Selangor. The study focuses on design of a reverse logistics network for a used-computer recycling operation. Simulation modeling, presented in this work, allows the user to analyze the future performance of the network and to understand the complex relationship between the parties involved. The findings from the simulation suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed using the Arena simulation package.
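
    Arena is a commercial package, but the core of such a simulation, entities flowing through a capacity-constrained station, can be sketched with a Lindley-style recursion in plain Python. The single inspection station, arrival rate, and service rate below are hypothetical, not the company's actual network.

```python
import random

random.seed(42)

def simulate(n_items, mean_arrival=10.0, mean_service=7.0):
    """FIFO single-station model: used computer arrives -> waits -> is processed."""
    t_arrive, server_free = 0.0, 0.0
    busy, flow_times = 0.0, []
    for _ in range(n_items):
        t_arrive += random.expovariate(1 / mean_arrival)   # next arrival (min)
        start = max(t_arrive, server_free)                 # wait if station busy
        service = random.expovariate(1 / mean_service)
        server_free = start + service
        busy += service
        flow_times.append(server_free - t_arrive)          # time in system
    makespan = server_free
    return sum(flow_times) / n_items, busy / makespan

avg_flow, utilization = simulate(500)
print(f"avg processing time {avg_flow:.1f} min, utilization {utilization:.0%}")
```

    This is the same pair of outputs the abstract mentions (processing time and resource utilization); a full network model would chain several such stations for collection, disassembly, and recycling.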

  6. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described

  7. FDG-PET and CT patterns of bone metastases and their relationship to previously administered anti-cancer therapy

    International Nuclear Information System (INIS)

    Israel, Ora; Bar-Shalom, Rachel; Keidar, Zohar; Goldberg, Anat; Nachtigal, Alicia; Militianu, Daniela; Fogelman, Ignac

    2006-01-01

    To assess 18F-fluorodeoxyglucose (FDG) uptake in bone metastases in patients with and without previous treatment, and compare positive positron emission tomography (PET) with osteolytic or osteoblastic changes on computed tomography (CT). One hundred and thirty-one FDG-PET/CT studies were reviewed for bone metastases. A total of 294 lesions were found in 76 patients, 81 in untreated patients and 213 in previously treated patients. PET was assessed for abnormal FDG uptake localised by PET/CT to the skeleton. CT was evaluated for bone metastases and for blastic or lytic pattern. The relationship between the presence and pattern of bone metastases on PET and CT, and prior treatment was statistically analysed using the chi-square test. PET identified 174 (59%) metastases, while CT detected 280 (95%). FDG-avid metastases included 74/81 (91%) untreated and 100/213 (47%) treated lesions (p<0.001). On CT there were 76/81 (94%) untreated and 204/213 (96%) treated metastases (p NS). In untreated patients, 85% of lesions were seen on both PET and CT (26 blastic, 43 lytic). In treated patients, 53% of lesions were seen only on CT (95 blastic, 18 lytic). Of the osteoblastic metastases, 65/174 (37%) were PET positive and 98/120 (82%), PET negative (p<0.001). The results of the present study indicate that when imaging bone metastases, prior treatment can alter the relationship between PET and CT findings. Most untreated bone metastases are PET positive and lytic on CT, while in previously treated patients most lesions are PET negative and blastic on CT. PET and CT therefore appear to be complementary in the assessment of bone metastases. (orig.)
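
    The chi-square comparison reported in the abstract can be reproduced from the lesion counts it gives (untreated: 74 of 81 PET-positive; treated: 100 of 213 PET-positive). A minimal Pearson chi-square for the resulting 2x2 table:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2, col1, col2 = a + b, c + d, a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# rows: untreated / treated lesions; columns: PET-positive / PET-negative
chi2 = chi_square_2x2(74, 7, 100, 113)
print(round(chi2, 1))
# the statistic far exceeds 10.83, the 1-df critical value for p < 0.001,
# consistent with the p<0.001 reported for treated vs untreated FDG avidity
```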

  8. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome consists of noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations since the function of enhancers is clarified, but their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate more on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  9. HIGH-PERFORMANCE COMPUTING FOR THE STUDY OF EARTH AND ENVIRONMENTAL SCIENCE MATERIALS USING SYNCHROTRON X-RAY COMPUTED MICROTOMOGRAPHY

    International Nuclear Information System (INIS)

    FENG, H.; JONES, K.W.; MCGUIGAN, M.; SMITH, G.J.; SPILETIC, J.

    2001-01-01

    Synchrotron x-ray computed microtomography (CMT) is a non-destructive method for examination of rock, soil, and other types of samples studied in the earth and environmental sciences. The high x-ray intensities of the synchrotron source make possible the acquisition of tomographic volumes at a high rate that requires the application of high-performance computing techniques for data reconstruction to produce the three-dimensional volumes, for their visualization, and for data analysis. These problems are exacerbated by the need to share information between collaborators at widely separated locations over both local and wide-area networks. A summary of the CMT technique and examples of applications are given here together with a discussion of the applications of high-performance computing methods to improve the experimental techniques and analysis of the data

  10. HIGH-PERFORMANCE COMPUTING FOR THE STUDY OF EARTH AND ENVIRONMENTAL SCIENCE MATERIALS USING SYNCHROTRON X-RAY COMPUTED MICROTOMOGRAPHY.

    Energy Technology Data Exchange (ETDEWEB)

    FENG,H.; JONES,K.W.; MCGUIGAN,M.; SMITH,G.J.; SPILETIC,J.

    2001-10-12

    Synchrotron x-ray computed microtomography (CMT) is a non-destructive method for examination of rock, soil, and other types of samples studied in the earth and environmental sciences. The high x-ray intensities of the synchrotron source make possible the acquisition of tomographic volumes at a high rate that requires the application of high-performance computing techniques for data reconstruction to produce the three-dimensional volumes, for their visualization, and for data analysis. These problems are exacerbated by the need to share information between collaborators at widely separated locations over both local and wide-area networks. A summary of the CMT technique and examples of applications are given here together with a discussion of the applications of high-performance computing methods to improve the experimental techniques and analysis of the data.

  11. Computational Study of Magic-Size CdSe Clusters with Complementary Passivation by Carboxylic and Amine Ligands

    KAUST Repository

    Voznyy, Oleksandr; Mokkath, Junais Habeeb; Jain, Ankit; Sargent, Edward H.; Schwingenschlögl, Udo

    2016-01-01

    The electronic and optical properties of tetrahedral CdSe magic clusters (average diameter.5 nm) protected by carboxyl and amine ligands, which correspond to previously reported experimental structures, are studied using density functional theory. We find extreme ligand packing densities, capping every single dangling bond of the inorganic core, strong dependence of the Z-type metal carboxylate binding on the amount of excess amine, and potential for improved photoluminescence upon replacing phenyl ligands with alkanes. The computed absorption spectra of the Cd35Se20 cluster agree well with experiments, resolving the 0.2 eV splitting of the first exciton peak due to spin-orbit coupling. We discuss the origin of the significant broadening of the optical spectra as due to phonons and structural variations in the ligand configurations and inorganic core apexes. © 2016 American Chemical Society.
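
    The interplay the abstract describes between the 0.2 eV exciton splitting and phonon/structural broadening can be illustrated by convolving a two-line stick spectrum with Gaussians of different widths. The line positions, strengths, and widths below are hypothetical, not the computed Cd35Se20 values.

```python
import numpy as np

def spectrum(energies, strengths, sigma, grid):
    """Sum of Gaussian lines: a crude stand-in for phonon/structural broadening."""
    return sum(f * np.exp(-((grid - e) ** 2) / (2 * sigma ** 2))
               for e, f in zip(energies, strengths))

grid = np.linspace(2.5, 3.5, 2001)          # energy axis, eV
lines = [3.0, 3.2]                          # hypothetical doublet split by 0.2 eV
A_sharp = spectrum(lines, [1.0, 0.8], 0.03, grid)   # weak broadening
A_broad = spectrum(lines, [1.0, 0.8], 0.15, grid)   # strong broadening

def n_peaks(y):
    """Count strict interior local maxima."""
    return int(np.sum((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])))

print(n_peaks(A_sharp), n_peaks(A_broad))   # 2 resolved peaks vs 1 merged band
```

    With a linewidth well below the splitting the doublet is resolved; once the broadening approaches the 0.2 eV separation the two lines merge into a single band, which is why resolving the spin-orbit splitting in experiment depends on how strongly phonons and ligand-configuration disorder broaden the spectrum.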

  12. Computational Study of Magic-Size CdSe Clusters with Complementary Passivation by Carboxylic and Amine Ligands

    KAUST Repository

    Voznyy, Oleksandr

    2016-04-28

    The electronic and optical properties of tetrahedral CdSe magic clusters (average diameter.5 nm) protected by carboxyl and amine ligands, which correspond to previously reported experimental structures, are studied using density functional theory. We find extreme ligand packing densities, capping every single dangling bond of the inorganic core, strong dependence of the Z-type metal carboxylate binding on the amount of excess amine, and potential for improved photoluminescence upon replacing phenyl ligands with alkanes. The computed absorption spectra of the Cd35Se20 cluster agree well with experiments, resolving the 0.2 eV splitting of the first exciton peak due to spin-orbit coupling. We discuss the origin of the significant broadening of the optical spectra as due to phonons and structural variations in the ligand configurations and inorganic core apexes. © 2016 American Chemical Society.

  13. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration; Velikhov, Vasily; Konoplich, Rostislav

    2017-01-01

    The Higgs boson physics is one of the most important and promising fields of study in modern high energy physics. It is important to note that Grid computing resources are becoming strictly limited due to the increasing amount of statistics required for physics analyses and the unprecedented LHC performance. One of the possibilities to address the shortfall of computing resources is the usage of computer institutes' clusters, commercial computing resources and supercomputers. To perform precision measurements of the Higgs boson properties under these conditions, effective instruments to simulate kinematic distributions of signal events are also highly required. In this talk we give a brief description of the modern distribution reconstruction method called Morphing and perform a few efficiency tests to demonstrate its potential. These studies have been performed on the WLCG and the Kurchatov Institute's Data Processing Center, including a Tier-1 GRID site and a supercomputer as well. We also analyze the CPU efficienc...
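
    The Morphing method referenced in the talk is a specific ATLAS technique; the sketch below shows only the simplest ingredient of any such approach, linear vertical interpolation between two binned templates, with hypothetical bin contents and parameter values.

```python
import numpy as np

def morph(h0, h1, theta, theta0, theta1):
    """Linear (vertical) template morphing between histograms at theta0 and theta1."""
    a = (theta - theta0) / (theta1 - theta0)
    return (1 - a) * h0 + a * h1

# hypothetical binned signal templates simulated at two coupling values
h_a = np.array([120., 340., 510., 330., 100.])   # theta = 1.0
h_b = np.array([ 90., 300., 560., 380., 140.])   # theta = 2.0

h_mid = morph(h_a, h_b, 1.5, 1.0, 2.0)
print(h_mid)   # bin-by-bin halfway between the two templates
```

    The appeal of such interpolation is precisely the resource argument made in the abstract: kinematic distributions at intermediate parameter values are obtained from a few simulated templates instead of a full Monte Carlo production for every point.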

  14. Computer-Based Job and Occupational Data Collection Methods: Feasibility Study

    National Research Council Canada - National Science Library

    Mitchell, Judith I

    1998-01-01

    .... The feasibility study was conducted to assess the operational and logistical problems involved with the development, implementation, and evaluation of computer-based job and occupational data collection methods...

  15. Youth suicide: an insight into previous hospitalisation for injury and sociodemographic conditions from a nationwide cohort study.

    Science.gov (United States)

    Zambon, Francesco; Laflamme, Lucie; Spolaore, Paolo; Visentin, Cristiana; Hasselberg, Marie

    2011-06-01

    This study investigates the degree to which a previous hospitalisation for injury of any intent is a risk of subsequent youth suicide and whether this association is influenced by family socioeconomic status or economic stress. A nationwide register-based cohort study was conducted covering all Swedish subjects born between January 1977 and December 1991 (N=1,616,342, male/female ratio=1.05). The cohort subjects were followed up from January 1998 to December 2003, when aged 7-26 years. Poisson regression and the likelihood ratio test (95% CI) were used to assess the age-adjusted effect of hospitalisation for injuries of various intents on youth suicide and its effect once adjusted for family sociodemographic and social circumstances. Each set of exposures was associated independently and significantly with suicide mortality. Being hospitalised for self-inflicted injuries or injuries of undetermined intent was associated with a risk of suicide 36 and 47 times, respectively, that of subjects never hospitalised in the period under study (95% CI 28.36 to 45.58 and 26.67 to 83.87 for self-inflicted injuries and for events of undetermined intent, respectively; overall p<0.001). Hospitalisation for injuries of other intents was also associated with suicide (RR 3.08; 95% CI 2.26 to 4.19). These effects were robust and not substantially altered after adjustment for family demographic and socioeconomic circumstances. A strong association exists between previous hospitalisation for injury of any intent and youth suicide. The association is robust and unaltered by family socioeconomic circumstances.
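
    The rate ratios quoted above come from Poisson regression on register data; the arithmetic behind an unadjusted incidence rate ratio and its Wald-type 95% CI can be sketched as follows (all counts and person-years below are hypothetical, not the Swedish cohort's):

```python
import math

def rate_ratio(events_exposed, py_exposed, events_ref, py_ref):
    """Incidence rate ratio with a 95% Wald confidence interval on the log scale."""
    rr = (events_exposed / py_exposed) / (events_ref / py_ref)
    se_log = math.sqrt(1 / events_exposed + 1 / events_ref)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# entirely hypothetical counts, for illustration of the calculation only
rr, lo, hi = rate_ratio(40, 20_000, 200, 4_000_000)
print(f"RR = {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```

    A fitted Poisson model generalizes this by estimating the log rate ratio jointly with adjustment covariates (age, family socioeconomic circumstances), which is how the age-adjusted estimates in the study are obtained.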

  16. Computers in nuclear medicine - current trends and future directions

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Previously, a decision to purchase computing equipment for nuclear medicine usually required evaluation of 'local' needs. With the advent of PACS and state-of-the-art computer techniques for image acquisition and manipulation, purchase and subsequent application will become much more complex. Some of the current trends and future possibilities which may influence the choice and operation of computers within and outside the nuclear medicine environment are discussed. (author)

  17. A Knowledge Engineering Approach to Developing Educational Computer Games for Improving Students' Differentiating Knowledge

    Science.gov (United States)

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Yang, Li-Hsueh; Huang, Iwen

    2013-01-01

    Educational computer games have been recognized as being a promising approach for motivating students to learn. Nevertheless, previous studies have shown that without proper learning strategies or supportive models, the learning achievement of students might not be as good as expected. In this study, a knowledge engineering approach is proposed…

  18. SAR: A fast computer for Camac data acquisition

    International Nuclear Information System (INIS)

    Bricaud, B.; Faivre, J.C.; Pain, J.

    1979-01-01

    This paper describes a special data acquisition and processing facility developed for Nuclear Physics experiments at intermediate energy, installed at SATURNE (France) and at CERN (Geneva, Switzerland). Previously, we used a PDP 11/45 computer which was connected to the experiments through a Camac branch highway. In a typical experiment (340 words per event), the computer limited the data acquisition rate to 4 μs for each 16-bit transfer and the on-line data reduction to only 20 events per second. The initial goal of this project was to improve these two performances. Previously known acquisition processors were limited by the memory capacity these systems could support; most of the time the data reduction was done on the host minicomputer. Higher memory sizes can be designed with new fast RAMs (Intel 2147) and the data processing can now take place on the front-end processor
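
    The bottleneck the authors describe can be made concrete with a little arithmetic from the figures in the abstract: at 4 μs per 16-bit Camac transfer and 340 words per event, readout alone takes about 1.4 ms per event, while on-line reduction on the PDP 11/45 managed only about 20 events per second. A quick check:

```python
WORDS_PER_EVENT = 340
TRANSFER_US = 4                 # microseconds per 16-bit Camac transfer

readout_ms = WORDS_PER_EVENT * TRANSFER_US / 1000
max_acq_rate = 1000 / readout_ms     # events/s if readout were the only limit
print(f"{readout_ms:.2f} ms per event -> at most {max_acq_rate:.0f} events/s")
# readout alone caps the rate near 735 events/s, yet on-line reduction on the
# host reached only ~20 events/s -- the gap the front-end processor closes
```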

  19. iPod™ technology for teaching patients about anticoagulation: a pilot study of mobile computer-assisted patient education.

    Science.gov (United States)

    Denizard-Thompson, Nancy R; Singh, Sonal; Stevens, Sheila R; Miller, David P; Wofford, James L

    2012-01-01

    To determine whether an educational strategy using a handheld, multimedia computer (iPod™) is practical and sustainable for routine office-based patient educational tasks. With the limited amount of time allotted to the office encounter and the growing number of patient educational tasks, new strategies are needed to improve the efficiency of patient education. Education of patients anticoagulated with warfarin is considered critical to preventing complications. Despite the dangers associated with the use of warfarin, educational practices are variable and often haphazard. During a four-month period, we examined the implementation of a three-part series of iPod™-based patient educational modules delivered to anticoagulated patients at the time of routine INR (International Normalized Ratio) blood tests for outpatients on the anticoagulation registry at an urban community health center. A total of 141 computer module presentations were delivered to 91 patients during the four-month period. In all, 44 patients on the registry had no INR checkups, and thus no opportunity to view the modules, and 32 patients had at least three INR checkups but no modules were documented. Of the 130 patients with at least one INR performed during the study period, 22 (16.9%) patients completed all three modules, 91 (70.0%) patients received at least one module, and nine (7.6%) patients refused to view at least one module. Neither of the two handheld computers was lost or stolen, and no physician time was used in this routine educational activity. Patients reported that the audio and visual quality was very good (9.0/10); the educational experience was helpful (7.4/10) compared with the patient's previous warfarin education (6.3/10); and the computer strategy extended the INR visit duration by 1-5 min at most. The computer-assisted patient educational strategy was well received by patients, and uptake of the intervention by the clinic was successful and durable. The i

  20. Interfacing external quantum devices to a universal quantum computer.

    Directory of Open Access Journals (Sweden)

    Antonio A Lagana

    Full Text Available We present a scheme for using external quantum devices with the universal quantum computer previously constructed. We thereby show how the universal quantum computer can utilize networked quantum information resources to carry out local computations. Such information may come from specialized quantum devices or even from remote universal quantum computers. We show how to accomplish this by devising universal quantum computer programs that implement well-known oracle-based quantum algorithms, namely the Deutsch, Deutsch-Jozsa, and Grover algorithms, using external black-box quantum oracle devices. In the process, we demonstrate a method to map existing quantum algorithms onto the universal quantum computer.
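
    One of the oracle algorithms named above, Deutsch's algorithm, is small enough to simulate directly with matrices. The sketch below uses the standard two-qubit formulation (NumPy, not the authors' universal-quantum-computer programs): a single query to the black-box oracle decides whether a one-bit function is constant or balanced.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

def oracle(f):
    """U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    return U

def deutsch(f):
    state = np.kron([1.0, 0.0], [0.0, 1.0])   # start in |0>|1>
    state = np.kron(H, H) @ state             # superpose query, prepare phase target
    state = oracle(f) @ state                 # single black-box oracle query
    state = np.kron(H, I) @ state             # interfere on the query qubit
    p1 = state[2] ** 2 + state[3] ** 2        # P(first qubit measured as 1)
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0), deutsch(lambda x: x))   # prints: constant balanced
```

    The oracle enters only as an opaque unitary, which is exactly the interface property that lets such a device sit outside the universal quantum computer as an external networked resource.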

  1. Computer simulation studies in condensed-matter physics 5. Proceedings

    International Nuclear Information System (INIS)

    Landau, D.P.; Mon, K.K.; Schuettler, H.B.

    1993-01-01

    As the role of computer simulations began to increase in importance, we sensed a need for a "meeting place" for both experienced simulators and neophytes to discuss new techniques and results in an environment which promotes extended discussion. As a consequence of these concerns, the Center for Simulational Physics established an annual workshop on Recent Developments in Computer Simulation Studies in Condensed-Matter Physics. This year's workshop was the fifth in this series, and the interest which the scientific community has shown demonstrates quite clearly the useful purpose which the series has served. The workshop was held at the University of Georgia, February 17-21, 1992, and these proceedings form a record of the workshop which is published with the goal of timely dissemination of the papers to a wider audience. The proceedings are divided into four parts. The first part contains invited papers which deal with simulational studies of classical systems and includes an introduction to some new simulation techniques and special purpose computers as well. A separate section of the proceedings is devoted to invited papers on quantum systems including new results for strongly correlated electron and quantum spin models. The third section is comprised of a single, invited description of a newly developed software shell designed for running parallel programs. The contributed presentations comprise the final chapter. (orig.). 79 figs

  2. Highlights from the previous volumes

    Science.gov (United States)

    Vergini, Eduardo G.; Pan, Y.; Vardi, R., et al.; Akkermans, Eric, et al.; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time; Superconductivity and magnetic order in the half-Heusler compound ErPdBi; An experimental evidence-based computational paradigm for new logic-gates in neuronal activity; Universality in the symmetric exclusion process and diffusive systems

  3. Consolidation of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Di Girolamo, Alessandro; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall

    2016-01-01

    Throughout the first year of LHC Run 2, ATLAS Cloud Computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS Cloud Computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vac resources, streamlined usage of the High Level Trigger cloud for simulation and reconstruction, extreme scaling on Amazon EC2, and procurement of commercial cloud capacity in Europe. Building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems. ...

  4. Extreme Scale Computing Studies

    Science.gov (United States)

    2010-12-01

    …systems that would fall under the Exascale rubric. In this chapter, we first discuss the attributes by which achievement of the label "Exascale" may be… …Carrington, and E. Strohmaier, "A Genetic Algorithms Approach to Modeling the Performance of Memory-bound Computations," Reno, NV, November 2007, ACM/IEEE… …genetic stochasticity (random mating, mutation, etc.). Outcomes are thus stochastic as well, and ecologists wish to ask questions like, "What is the…

  5. Outcome of trial of scar in patients with previous caesarean section

    International Nuclear Information System (INIS)

    Khan, B.; Bashir, R.; Khan, W.

    2016-01-01

    Medical evidence indicates that 60-80% of women can achieve vaginal delivery after a previous lower segment caesarean section. Proper selection of patients for trial of scar and vigilant monitoring during labour will achieve a successful maternal and perinatal outcome. The objective of our study is to establish that vaginal delivery after one caesarean section for a non-recurrent cause has a high success rate. Methods: The study was conducted in the Gynae-B Unit of Ayub Teaching Hospital, Abbottabad. All labouring patients during the five-year study period with one previous caesarean section for a non-recurrent cause and between 37 and 41 weeks of gestation were included in the study. Data were recorded on a proforma designed for the purpose. Patients who had a previous classical caesarean section, more than one caesarean section, or a previous caesarean section with severe wound infection, and those with transverse lie or placenta previa in the present pregnancy were excluded. Foetal macrosomia (wt>4 kg) and severe IUGR with compromised blood flow on Doppler in the present pregnancy were also not considered suitable for the study. Patients who had any absolute contraindication to vaginal delivery were also excluded. Results: There were 12505 deliveries during the study period. Total vaginal deliveries were 8790 and total caesarean sections were 3715, a caesarean section rate of 29.7%. Out of these 8790 patients, 764 were given a trial of scar and 535 delivered successfully vaginally (70%). Women who presented with spontaneous onset of labour were more likely to deliver vaginally (74.8%) than those in the induction group (27.1%). Conclusion: Trial of vaginal birth after caesarean (VBAC) in selected cases has great importance in the present era of rising primary caesarean section rates. (author)

  6. Sudden unexpected death in children with a previously diagnosed cardiovascular disorder

    NARCIS (Netherlands)

    Polderman, Florens N.; Cohen, Joeri; Blom, Nico A.; Delhaas, Tammo; Helbing, Wim A.; Lam, Jan; Sobotka-Plojhar, Marta A.; Temmerman, Arno M.; Sreeram, Narayanswani

    2004-01-01

    BACKGROUND: It is known that children with previously diagnosed heart defects die suddenly. The causes of death are often unknown. OBJECTIVE: The aim of the study was to identify all infants and children within the Netherlands with previously diagnosed heart disease who had a sudden unexpected death

  7. Sudden unexpected death in children with a previously diagnosed cardiovascular disorder

    NARCIS (Netherlands)

    Polderman, F.N.; Cohen, Joeri; Blom, N.A.; Delhaas, T.; Helbing, W.A.; Lam, J.; Sobotka-Plojhar, M.A.; Temmerman, Arno M.; Sreeram, N.

    2004-01-01

    Background: It is known that children with previously diagnosed heart defects die suddenly. The causes of death are often unknown. Objective: The aim of the study was to identify all infants and children within the Netherlands with previously diagnosed heart disease who had a sudden unexpected death

  8. Implications of Ubiquitous Computing for the Social Studies Curriculum

    Science.gov (United States)

    van Hover, Stephanie D.; Berson, Michael J.; Bolick, Cheryl Mason; Swan, Kathleen Owings

    2004-01-01

    In March 2002, members of the National Technology Leadership Initiative (NTLI) met in Charlottesville, Virginia to discuss the potential effects of ubiquitous computing on the field of education. Ubiquitous computing, or "on-demand availability of task-necessary computing power," involves providing every student with a handheld computer--a…

  9. Collaborative Dialogue in Synchronous Computer-Mediated Communication and Face-to-Face Communication

    Science.gov (United States)

    Zeng, Gang

    2017-01-01

Previous research has documented that collaborative dialogue promotes L2 learning in both face-to-face (F2F) and synchronous computer-mediated communication (SCMC) modalities. However, relatively little research has explored modality effects on collaborative dialogue. Thus, motivated by sociocultural theory, this study examines how F2F compares…

  10. COMPARATIVE STUDY OF TERTIARY WASTEWATER TREATMENT BY COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Stefania Iordache

    2010-01-01

Full Text Available The aim of this work is to assess the conditions for implementation of a Biological Nutrient Removal (BNR) process in the Wastewater Treatment Plant (WWTP) of Moreni city (Romania). In order to meet increasingly strict environmental regulations, the wastewater treatment plant that was studied must update and modernize its treatment process. A comparative study was undertaken of the quality of the effluents that could be obtained by implementation of biological nutrient removal processes such as A2/O (Anaerobic/Anoxic/Oxic) and VIP (Virginia Plant Initiative) as wastewater tertiary treatments. In order to assess the efficiency of the proposed treatment schemes based on the data monitored at the studied WWTP, computer models of biological nutrient removal configurations based on the A2/O and VIP processes were developed. Computer simulation was carried out using a well-known simulator, BioWin by EnviroSim Associates Ltd. The simulation process allowed obtaining data that can be used in the design of a tertiary treatment stage at the Moreni WWTP, in order to increase its operating efficiency.

  11. Differences in directional sound source behavior and perception between assorted computer room models

    DEFF Research Database (Denmark)

    Vigeant, M. C.; Wang, L. M.; Rindel, Jens Holger

    2004-01-01

Source directivity is an important input variable when using room acoustic computer modeling programs to generate auralizations. Previous research has shown that using a multichannel anechoic recording can produce a more natural sounding auralization, particularly as the number of channels … time. However, for the three other parameters evaluated (sound-pressure level, clarity index, and lateral fraction), the changing diffusivity of the room does not diminish the importance of the directivity. The study therefore shows the importance of considering source directivity when using computer…

  12. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... computer modeling used as a research method applied in the process ... conclusions discuss the benefits for students who analyzed the ... accounting education process the case study method should not .... providing travel safety information to passengers ... from literature readings with practical problems.

  13. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  14. Computer Assisted Instruction in Special Education Three Case Studies

    OpenAIRE

    İbrahim DOĞAN; Ömür AKDEMİR

    2015-01-01

    The purpose of this study is to investigate the computer use of three students attending the special education center. Students have mental retardation, hearing problem and physical handicap respectively. The maximum variation sampling is used to select the type of handicap while the convenience sampling is used to select the participants. Three widely encountered handicap types in special education are chosen to select the study participants. The multiple holistic case study design is used i...

  15. Study on computer-aided simulation procedure for multicomponent separating cascade

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro

    1982-11-01

    The present report reviews the author's study on the computer-aided simulation procedure for a multicomponent separating cascade. As a conclusion, two very powerful simulation procedures have been developed for cascades composed of separating elements whose separation factors are very large. They are applicable in cases where interstage flow rates are input variables for the calculation and stage separation factors are given either as constants or as functions of compositions of the up and down streams. As an application of the new procedure, a computer-aided simulation study has been performed for hydrogen isotope separating cascades by porous membrane method. A cascade system configuration is developed and pertinent design specifications are determined in an example case of the feed conditions and separation requirements. (author)

  16. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

…the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods … communicate their subjective opinions. Keywords: Usability Analysis; CAVE™ (Cave Automatic Virtual Environments); Human Computer Interface (HCI)…

  17. Computed tomographic study of 50 patients with hypodense hepatic injuries in childhood; Estudo de 50 casos por tomografia computadorizada de lesoes hipodensas hepaticas fundamentais na infancia

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Ines Minniti Rodrigues; Alvares, Beatriz Regina; Baracat, Jamal; Martins, Daniel Lahan [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Ciencias Medicas. Dept. de Radiologia]. E-mail: iminniti@fcm.unicamp.br; Pereira, Ricardo Minniti Rodrigues [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Ciencias Medicas

    2006-03-15

Objective: To describe the different tomographic findings in hypodense hepatic lesions in children and their differential diagnosis. Materials and methods: Computed tomographic studies were obtained from 50 patients (age range: 0-16 years) with low-density liver lesions previously diagnosed by ultrasound. Images were acquired before and after administration of intravenous contrast medium. Image findings were analyzed and then correlated with the anatomopathological diagnosis. Results: Forty-seven of the 50 cases were confirmed, 30 by anatomopathological diagnosis. Most of them were benign lesions, with hemangioma in 20%. Such lesions presented homogeneous contrast absorption, mainly in the delayed phase, differing from malignant lesions. Metastasis was the most frequently found malignant lesion (18%). Conclusion: The computed tomographic study is of great value in complementing the diagnosis of hypodense hepatic lesions in children, and should routinely follow ultrasound diagnosis. (author)

  18. Chest X ray effective doses estimation in computed radiography

    International Nuclear Information System (INIS)

    Abdalla, Esra Abdalrhman Dfaalla

    2013-06-01

Conventional chest radiography is technically difficult because of the wide range of tissue attenuations in the chest and the limitations of screen-film systems. Computed radiography (CR) offers a different approach utilizing a photostimulable phosphor; photostimulable phosphors overcome some image quality limitations of chest imaging. The objective of this study was to estimate the effective dose in computed radiography at three hospitals in Khartoum. The study was conducted in the radiography departments of three centres: Advanced Diagnostic Center, Nilain Diagnostic Center, and Modern Diagnostic Center. Entrance surface dose (ESD) measurement was conducted for quality control of the X-ray machines, together with a survey of the operators' experimental techniques. The ESDs were measured with an UNFORS dosimeter, and mathematical equations were used to estimate patient doses during chest X rays. A total of 120 patients were examined in the three centres, among them 62 males and 58 females. The overall mean and range of patient dose was 0.073±0.037 (0.014-0.16) mGy per procedure, while the effective dose was 3.4±1.7 (0.6-7.0) mSv per procedure. This study compared radiation doses to patients in radiographic examinations of the chest using computed radiology. The radiation dose was measured in three centres in Khartoum, Sudan. The results showed that the effective dose in chest radiography was lower in computed radiography compared to previous studies. (Author)

  19. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.

    Science.gov (United States)

    Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.
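The kind of stochastic modeling described can be illustrated with a minimal first-order (bigram) chord-transition model. The actual models compared in the study are not reproduced here; the function names and the chord sequences below are purely illustrative.

```python
from collections import defaultdict

def train_bigram(sequences):
    """Estimate first-order chord-transition probabilities from example
    sequences -- a minimal stand-in for the stochastic models compared
    in the study (all data here is illustrative)."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # normalize counts into conditional probabilities P(next | current)
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def expectancy(model, context, chord):
    """Probability of the upcoming chord given the last chord heard."""
    return model.get(context[-1], {}).get(chord, 0.0)
```

For example, trained on the toy sequences `[["C", "G", "C"], ["C", "G", "Am"]]`, the model assigns probability 1.0 to G following C, and splits expectancy evenly between C and Am after G.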

  20. Comparative study between computed radiography and conventional radiography

    International Nuclear Information System (INIS)

    Noorhazleena Azaman; Khairul Anuar Mohd Salleh; Sapizah Rahim; Shaharudin Sayuti; Arshad Yassin; Abdul Razak Hamzah

    2010-01-01

In industrial radiography, there are many criteria that need to be considered, based on established standards, to accept or reject a radiographic film. For conventional radiography, we need to consider the optical density, measured with a densitometer, when viewing the film on the viewer. In computed radiography (CR), however, we need to evaluate and analyse the quality of the digital image through its grey values. Many factors affect digital image quality. One of the factors affecting digital image quality in image processing is the grey value, which is related to contrast resolution. In this work, we performed grey value measurements on a digital radiography system and compared them with exposed films in conventional radiography. The test sample was a steel step wedge. We found that contrast resolution is higher in computed radiography than in conventional radiography. (author)

  1. Computer literacy among first year medical students in a developing country: A cross sectional study

    Science.gov (United States)

    2012-01-01

Background The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. The study evaluates computer literacy among first year medical students in Sri Lanka. Methods The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August-September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in 6 domains: common software packages, operating systems, database management and the usage of internet and E-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and other independent covariates. Results The sample size was 181 (response rate 95.3%); 49.7% were males. The majority of the students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge by a formal training programme (64.1%), self learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of the students (75.7%) expressed their willingness for a formal computer training programme at the faculty. The mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). There were 47.9% of students who scored less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province, and students owning a computer had a significantly higher mean score in comparison to other students. Previous formal computer training was the strongest predictor of computer literacy (β = 13.034), followed by using

  2. Finding New Math Identities by Computer

    Science.gov (United States)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

Recently a number of interesting new mathematical identities have been discovered by means of numerical searches on high performance computers, using some newly discovered algorithms. These include the following:

pi = Sum_{k=0..inf} (1/16)^k [4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)],

(17 pi^4)/360 = Sum_{k=1..inf} (1 + 1/2 + 1/3 + ... + 1/k)^2 k^(-2),

zeta(3,1,3,1,...,3,1) = 2 pi^(4m) / (4m+2)!, where m is the number of (3,1) pairs,

and where zeta(n1,n2,...,nr) = Sum_{k1 > k2 > ... > kr >= 1} 1/(k1^n1 k2^n2 ... kr^nr). The first identity is remarkable in that it permits one to compute the n-th binary or hexadecimal digit of pi directly, without computing any of the previous digits, and without using multiple precision arithmetic. Recently the ten billionth hexadecimal digit of pi was computed using this formula. The third identity has connections to quantum field theory. (The first and second of these have been formally established; the third is affirmed by numerical evidence only.) The background and results of this work will be described, including an overview of the algorithms and computer techniques used in these studies.
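The digit-extraction property of the first identity can be sketched in a few lines: the hex digit at position d+1 of pi depends only on the fractional part of 16^d·pi, and modular exponentiation keeps the finite sums small. This is an illustrative implementation of the standard BBP digit-extraction scheme, not the code used in the record's computation.

```python
def bbp_pi_hex_digit(d):
    """Hexadecimal digit of pi at position d+1 after the point, computed
    directly via the Bailey-Borwein-Plouffe identity, without any of the
    preceding digits and without multiple-precision arithmetic."""
    def s(j):
        # fractional contribution of 16^d * sum_k 1/(16^k (8k+j)):
        # three-argument pow() does the modular exponentiation
        total = 0.0
        for k in range(d + 1):
            total += pow(16, d - k, 8 * k + j) / (8 * k + j)
        # a few terms of the rapidly vanishing tail (k > d)
        k = d + 1
        while (term := 16.0 ** (d - k) / (8 * k + j)) > 1e-17:
            total += term
            k += 1
        return total
    x = 4 * s(1) - 2 * s(4) - s(5) - s(6)
    return int((x % 1.0) * 16)  # Python's % maps negatives into [0, 1)
```

The first fractional hex digits produced are 2, 4, 3, F, matching pi = 3.243F… in hexadecimal; floating-point precision limits this simple version to modest positions.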

  3. Everolimus for Previously Treated Advanced Gastric Cancer: Results of the Randomized, Double-Blind, Phase III GRANITE-1 Study

    Science.gov (United States)

    Ohtsu, Atsushi; Ajani, Jaffer A.; Bai, Yu-Xian; Bang, Yung-Jue; Chung, Hyun-Cheol; Pan, Hong-Ming; Sahmoud, Tarek; Shen, Lin; Yeh, Kun-Huei; Chin, Keisho; Muro, Kei; Kim, Yeul Hong; Ferry, David; Tebbutt, Niall C.; Al-Batran, Salah-Eddin; Smith, Heind; Costantini, Chiara; Rizvi, Syed; Lebwohl, David; Van Cutsem, Eric

    2013-01-01

    Purpose The oral mammalian target of rapamycin inhibitor everolimus demonstrated promising efficacy in a phase II study of pretreated advanced gastric cancer. This international, double-blind, phase III study compared everolimus efficacy and safety with that of best supportive care (BSC) in previously treated advanced gastric cancer. Patients and Methods Patients with advanced gastric cancer that progressed after one or two lines of systemic chemotherapy were randomly assigned to everolimus 10 mg/d (assignment schedule: 2:1) or matching placebo, both given with BSC. Randomization was stratified by previous chemotherapy lines (one v two) and region (Asia v rest of the world [ROW]). Treatment continued until disease progression or intolerable toxicity. Primary end point was overall survival (OS). Secondary end points included progression-free survival (PFS), overall response rate, and safety. Results Six hundred fifty-six patients (median age, 62.0 years; 73.6% male) were enrolled. Median OS was 5.4 months with everolimus and 4.3 months with placebo (hazard ratio, 0.90; 95% CI, 0.75 to 1.08; P = .124). Median PFS was 1.7 months and 1.4 months in the everolimus and placebo arms, respectively (hazard ratio, 0.66; 95% CI, 0.56 to 0.78). Common grade 3/4 adverse events included anemia, decreased appetite, and fatigue. The safety profile was similar in patients enrolled in Asia versus ROW. Conclusion Compared with BSC, everolimus did not significantly improve overall survival for advanced gastric cancer that progressed after one or two lines of previous systemic chemotherapy. The safety profile observed for everolimus was consistent with that observed for everolimus in other cancers. PMID:24043745

  4. Computational Models for Calcium-Mediated Astrocyte Functions

    Directory of Open Access Journals (Sweden)

    Tiina Manninen

    2018-04-01

    Full Text Available The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. 
Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop

  5. Computational Models for Calcium-Mediated Astrocyte Functions.

    Science.gov (United States)

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop the models. Thus

  6. Computer-Assisted Instruction: A Case Study of Two Charter Schools

    Science.gov (United States)

    Keengwe, Jared; Hussein, Farhan

    2013-01-01

    The purpose of this study was to examine the relationship in achievement gap between English language learners (ELLs) utilizing computer-assisted instruction (CAI) in the classroom, and ELLs relying solely on traditional classroom instruction. The study findings showed that students using CAI to supplement traditional lectures performed better…

  7. Estimation of aortic time-enhancement curve in pharmacokinetic analysis. Dynamic study by multi-detector row computed tomography

    International Nuclear Information System (INIS)

    Yamaguchi, Isao; Kidoya, Eiji; Higashimura, Kyoji; Hayashi, Hiroyuki; Suzuki, Masayuki

    2007-01-01

    This paper presents an introduction to the development of software that provides a physiologic model of contrast medium enhancement by incorporating available physiologic data and contrast medium pharmacokinetics to predict an organ-specific aortic time-enhancement curve (TEC) in computed tomography (CT) with various contrast medium injection protocols in patients of various heights, weights, cardiac output levels, and so on. The physiologic model of contrast medium enhancement was composed of six compartments for early contrast enhancement pharmacokinetics. Contrast medium is injected via the antecubital vein and distributed to the right side of the heart, the pulmonary compartment, the left side of the heart, and the aorta. It then circulates back to the right side of the heart via the systemic circulation. A computer-based, compartmental model of the aortic system was generated using human physiologic parameters and six differential equations to describe the transport of contrast medium. Aortic TEC generated by the computer-based physiologic model of contrast medium enhancement showed validity and agreement with clinical data and findings published previously. A computer-based physiologic model that may help predict organ-specific CT contrast medium enhancement for different injection protocols was developed. Such a physiologic model may have multiple clinical applications. (author)
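The compartmental approach described above can be sketched with a much-reduced model: a forward-Euler integration of a two-compartment system with a bolus input. The six-compartment structure, parameter values, and equations of the actual software are not reproduced here; the function and all numbers below are illustrative only.

```python
def simulate_contrast(dose_mg, k12, k21, k10, v1_l, t_end_s, dt=0.01):
    """Forward-Euler simulation of a toy two-compartment pharmacokinetic
    model: contrast is injected as a bolus into the central compartment
    (plasma, volume v1_l litres) and exchanges with one peripheral
    compartment; k10 is the elimination rate from the central compartment.
    Returns (times in s, central concentrations in mg/L)."""
    a1 = dose_mg          # amount in central compartment (mg)
    a2 = 0.0              # amount in peripheral compartment (mg)
    times, conc = [], []
    t = 0.0
    while t <= t_end_s:
        times.append(t)
        conc.append(a1 / v1_l)
        # mass transfer between compartments plus first-order elimination
        da1 = (-k12 * a1 + k21 * a2 - k10 * a1) * dt
        da2 = (k12 * a1 - k21 * a2) * dt
        a1 += da1
        a2 += da2
        t += dt
    return times, conc
```

A full model of the kind the paper describes would chain several such compartments (right heart, lungs, left heart, aorta, systemic return) and drive the input with the injection protocol rather than an instantaneous bolus.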

  8. Response to health insurance by previously uninsured rural children.

    Science.gov (United States)

    Tilford, J M; Robbins, J M; Shema, S J; Farmer, F L

    1999-08-01

To examine the healthcare utilization and costs of previously uninsured rural children. Four years of claims data from a school-based health insurance program located in the Mississippi Delta. All children who were not Medicaid-eligible or were uninsured were eligible for limited benefits under the program. The 1987 National Medical Expenditure Survey (NMES) was used to compare utilization of services. The study represents a natural experiment in the provision of insurance benefits to a previously uninsured population. Premiums for the claims cost were set with little or no information on expected use of services. Claims from the insurer were used to form a panel data set. Mixed-model logistic and linear regressions were estimated to determine the response to insurance for several categories of health services. The use of services increased over time and approached the level of utilization in the NMES. Conditional medical expenditures also increased over time. Actuarial estimates of claims cost greatly exceeded actual claims cost. The provision of a limited medical, dental, and optical benefit package cost approximately $20-$24 per member per month in claims paid. An important uncertainty in providing health insurance to previously uninsured populations is whether a pent-up demand exists for health services. The existence of a pent-up demand for medical services was not supported in this study of rural school-age children. States considering partnerships with private insurers to implement the State Children's Health Insurance Program could lower premium costs by assembling basic data on previously uninsured children.

  9. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    Science.gov (United States)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors called "Computer Science 1" or CS-I. The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered which strengthens theories related to prior experience and the perception that computer science has a culture which is hostile to females. Two unanticipated themes related to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males which adversely affects the females' ability to succeed in CS-I.

  10. Can Designing Self-Representations through Creative Computing Promote an Incremental View of Intelligence and Enhance Creativity among At-Risk Youth?

    Science.gov (United States)

    Blau, Ina; Benolol, Nurit

    2016-01-01

    Creative computing is one of the rapidly growing educational trends around the world. Previous studies have shown that creative computing can empower disadvantaged children and youth. At-risk youth tend to hold a negative view of self and perceive their abilities as inferior compared to "normative" pupils. The Implicit Theories of…

  11. [Computed tomography with computer-assisted detection of pulmonary nodules in dogs and cats].

    Science.gov (United States)

    Niesterok, C; Piesnack, S; Köhler, C; Ludewig, E; Alef, M; Kiefer, I

    2015-01-01

The aim of this study was to assess the potential benefit of computer-assisted detection (CAD) of pulmonary nodules in veterinary medicine. The CAD detection rate was therefore compared to the detection rates of two individual examiners in terms of sensitivity and false-positive findings. We included 51 dogs and 16 cats with pulmonary nodules previously diagnosed by computed tomography. First, the number of nodules ≥ 3 mm was recorded for each patient by two independent examiners. Subsequently, each examiner used the CAD software for automated nodule detection. With knowledge of the CAD results, a final consensus decision on the number of nodules was reached. The software used was a commercially available CAD program. The sensitivity of examiner 1 was 89.2%, while that of examiner 2 reached 87.4%. CAD had a sensitivity of 69.4%. With CAD, the sensitivity of examiner 1 increased to 94.7% and that of examiner 2 to 90.8%. The CAD system used in our study had a moderate sensitivity of 69.4%. Despite its severe limitations, with a high level of false-positive and false-negative results, CAD increased the examiners' sensitivity. Its supportive role in diagnostics therefore appears to be evident.
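Sensitivity here is the standard ratio of detected nodules to all true nodules. The abstract does not give the underlying counts, so the numbers below are hypothetical, chosen only so the arithmetic reproduces the reported CAD figure of 69.4%.

```python
def sensitivity(true_positives, false_negatives):
    """Fraction of actual nodules that were detected: TP / (TP + FN)."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical counts for illustration only: 75 detected and 33 missed
# nodules yield a sensitivity matching the study's reported 69.4% for CAD.
cad_sensitivity = sensitivity(75, 33)
```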

  12. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  13. Computer-Aided Prototyping Systems (CAPS) within the software acquisition process: a case study

    OpenAIRE

    Ellis, Mary Kay

    1993-01-01

    Approved for public release; distribution is unlimited. This thesis provides a case study which examines the benefits derived from the practice of computer-aided prototyping within the software acquisition process. An experimental prototyping system currently under research is the Computer Aided Prototyping System (CAPS) managed under the Computer Science department of the Naval Postgraduate School, Monterey, California. This thesis determines the qualitative value which may be realized by ...

  14. Autumn Algorithm-Computation of Hybridization Networks for Realistic Phylogenetic Trees.

    Science.gov (United States)

    Huson, Daniel H; Linz, Simone

    2018-01-01

    A minimum hybridization network is a rooted phylogenetic network that displays two given rooted phylogenetic trees using a minimum number of reticulations. Previous mathematical work on their calculation has usually assumed the input trees to be bifurcating, correctly rooted, or that they both contain the same taxa. These assumptions do not hold in biological studies and "realistic" trees have multifurcations, are difficult to root, and rarely contain the same taxa. We present a new algorithm for computing minimum hybridization networks for a given pair of "realistic" rooted phylogenetic trees. We also describe how the algorithm might be used to improve the rooting of the input trees. We introduce the concept of "autumn trees", a nice framework for the formulation of algorithms based on the mathematics of "maximum acyclic agreement forests". While the main computational problem is hard, the run-time depends mainly on how different the given input trees are. In biological studies, where the trees are reasonably similar, our parallel implementation performs well in practice. The algorithm is available in our open source program Dendroscope 3, providing a platform for biologists to explore rooted phylogenetic networks. We demonstrate the utility of the algorithm using several previously studied data sets.

  15. Steam generator transient studies using a simplified two-fluid computer code

    International Nuclear Information System (INIS)

    Munshi, P.; Bhatnagar, R.; Ram, K.S.

    1985-01-01

    A simplified two-fluid computer code has been used to simulate reactor-side (or primary-side) transients in a PWR steam generator. The disturbances are modelled as ramp inputs for pressure, internal energy and mass flow-rate for the primary fluid. The CPU time for a transient duration of 4 s is approx. 10 min on a DEC-1090 computer system. The results are thermodynamically consistent and encouraging for further studies. (author)

  16. A detailed experimental study of a DNA computer with two endonucleases.

    Science.gov (United States)

    Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz

    2017-07-14

    Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is its complexity - increasing the number of states of biomolecular automata. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We have presented a detailed experimental verification of its feasibility. We described the effect of the number of states, the length of input data, and the nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths such as ab, aab, aaab, ababa, and of an unaccepted word ba. Moreover, this article presents the reaction optimization and the methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.

  17. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron correlation in molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  18. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE … In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE …

  19. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database …

  20. In-cylinder diesel spray combustion simulations using parallel computation: A performance benchmarking study

    International Nuclear Information System (INIS)

    Pang, Kar Mun; Ng, Hoon Kiat; Gan, Suyin

    2012-01-01

    Highlights: ► A performance benchmarking exercise is conducted for diesel combustion simulations. ► The reduced chemical mechanism shows its advantages over base and skeletal models. ► High efficiency and great reduction of CPU runtime are achieved through 4-node solver. ► Increasing ISAT memory from 0.1 to 2 GB reduces the CPU runtime by almost 35%. ► Combustion and soot processes are predicted well with minimal computational cost. - Abstract: In the present study, in-cylinder diesel combustion simulation was performed with parallel processing on an Intel Xeon Quad-Core platform to allow both fluid dynamics and chemical kinetics of the surrogate diesel fuel model to be solved simultaneously on multiple processors. Here, Cartesian Z-Coordinate was selected as the most appropriate partitioning algorithm since it computationally bisects the domain such that the dynamic load associated with fuel particle tracking was evenly distributed during parallel computations. Other variables examined included number of compute nodes, chemistry sizes and in situ adaptive tabulation (ISAT) parameters. Based on the performance benchmarking test conducted, parallel configuration of 4-compute node was found to reduce the computational runtime most efficiently whereby a parallel efficiency of up to 75.4% was achieved. The simulation results also indicated that accuracy level was insensitive to the number of partitions or the partitioning algorithms. The effect of reducing the number of species on computational runtime was observed to be more significant than reducing the number of reactions. Besides, the study showed that an increase in the ISAT maximum storage of up to 2 GB reduced the computational runtime by 50%. Also, the ISAT error tolerance of 10⁻³ was chosen to strike a balance between results accuracy and computational runtime. The optimised parameters in parallel processing and ISAT, as well as the use of the in-house reduced chemistry model allowed accurate
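The 75.4% figure quoted above is the standard notion of parallel efficiency: speedup over the single-node runtime, divided by the node count. A small sketch (the timings below are made-up placeholders, not the paper's measurements):

```python
def parallel_efficiency(t_serial: float, t_parallel: float, n_nodes: int) -> float:
    """Speedup relative to a single node, normalized by the node count."""
    speedup = t_serial / t_parallel
    return speedup / n_nodes

# Placeholder timings: a 100 h single-node run finishing in 33.2 h
# on 4 nodes lands in the ~75% efficiency regime reported above.
print(round(parallel_efficiency(100.0, 33.2, 4), 3))
```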

  1. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. University center for memory disorders. Fifty-two patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, predictive values, positive and negative, were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group, with unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
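The predictive values reported above follow from sensitivity, specificity, and prevalence via Bayes' rule. A sketch using the reported figures (the paper's NPV of 0.96 differs slightly from the idealized formula below, presumably because it was derived from the empirical data rather than from these rounded rates):

```python
def predictive_values(sens: float, spec: float, prevalence: float):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sens * prevalence              # true positives per person screened
    fp = (1 - spec) * (1 - prevalence)  # false positives
    tn = spec * (1 - prevalence)        # true negatives
    fn = (1 - sens) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

ppv, npv = predictive_values(0.83, 0.96, 0.10)
print(round(ppv, 2))  # -> 0.7, matching the reported PPV
```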

  2. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    To replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computation mode: cloud computing. Resource scheduling strategy is a key technology in cloud computing. Based on a study of the cloud computing system structure and its mode of operation, the key research addresses the work scheduling process and resource allocation problems in cloud computing based on the ant colony algorithm, with detailed analysis and design of the...
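Since the abstract is compressed, a concrete illustration may help: the sketch below (our own construction, not the paper's algorithm; all names and parameters are invented) applies a bare-bones ant colony loop to task-to-VM assignment, minimizing makespan.

```python
import random

def aco_schedule(task_lengths, vm_speeds, n_ants=20, n_iters=50,
                 evaporation=0.5, seed=0):
    """Minimal ant-colony sketch: assign each task to a VM, minimizing makespan."""
    rng = random.Random(seed)
    n_tasks, n_vms = len(task_lengths), len(vm_speeds)
    # pheromone[t][v]: learned desirability of placing task t on VM v
    pheromone = [[1.0] * n_vms for _ in range(n_tasks)]
    best_assign, best_makespan = None, float("inf")
    for _ in range(n_iters):
        for _ant in range(n_ants):
            # each ant builds an assignment, sampling VMs by pheromone weight
            assign = [rng.choices(range(n_vms), weights=pheromone[t])[0]
                      for t in range(n_tasks)]
            load = [0.0] * n_vms
            for t, v in enumerate(assign):
                load[v] += task_lengths[t] / vm_speeds[v]
            makespan = max(load)
            if makespan < best_makespan:
                best_assign, best_makespan = assign, makespan
        # evaporate, then reinforce the best assignment found so far
        for t in range(n_tasks):
            for v in range(n_vms):
                pheromone[t][v] *= (1 - evaporation)
            pheromone[t][best_assign[t]] += 1.0 / best_makespan
    return best_assign, best_makespan
```

Real ACO variants add heuristic desirability terms and pheromone bounds; this sketch keeps only the probabilistic construction / evaporation / reinforcement cycle that the abstract refers to.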

  3. Computed tomography scanner applied to soil compaction studies

    International Nuclear Information System (INIS)

    Vaz, C.M.P.

    1989-11-01

    The soil compaction problem was studied using a first-generation computed tomography scanner (CT). This apparatus acquires images of soil cross-section samples with a resolution of a few millimeters. We performed the following laboratory and field experiments: basic experiments of equipment calibration and resolution studies; measurements of compacted thin soil layers; measurements of soil compaction caused by agricultural tools; stress-strain modelling in confined soil samples at several moisture levels; characterization of the soil bulk density profile with samples collected in a hole (trench), compared with a cone penetrometer technique. (author)

  4. Monte Carlo simulations of adult and pediatric computed tomography exams: Validation studies of organ doses with physical phantoms

    International Nuclear Information System (INIS)

    Long, Daniel J.; Lee, Choonsik; Tien, Christopher; Fisher, Ryan; Hoerner, Matthew R.; Hintenlang, David; Bolch, Wesley E.

    2013-01-01

    Purpose: To validate the accuracy of a Monte Carlo source model of the Siemens SOMATOM Sensation 16 CT scanner using organ doses measured in physical anthropomorphic phantoms. Methods: The x-ray output of the Siemens SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code, MCNPX version 2.6. The resulting source model was able to perform various simulated axial and helical computed tomographic (CT) scans of varying scan parameters, including beam energy, filtration, pitch, and beam collimation. Two custom-built anthropomorphic phantoms were used to take dose measurements on the CT scanner: an adult male and a 9-month-old. The adult male is a physical replica of University of Florida reference adult male hybrid computational phantom, while the 9-month-old is a replica of University of Florida Series B 9-month-old voxel computational phantom. Each phantom underwent a series of axial and helical CT scans, during which organ doses were measured using fiber-optic coupled plastic scintillator dosimeters developed at University of Florida. The physical setup was reproduced and simulated in MCNPX using the CT source model and the computational phantoms upon which the anthropomorphic phantoms were constructed. Average organ doses were then calculated based upon these MCNPX results. Results: For all CT scans, good agreement was seen between measured and simulated organ doses. For the adult male, the percent differences were within 16% for axial scans, and within 18% for helical scans. For the 9-month-old, the percent differences were all within 15% for both the axial and helical scans. These results are comparable to previously published validation studies using GE scanners and commercially available anthropomorphic phantoms. Conclusions: Overall results of this study show that the Monte Carlo source model can be used to accurately and reliably calculate organ doses for patients undergoing a variety of axial or helical CT scans.
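The agreement figures quoted (within 15-18%) are plain percent differences of the simulated dose from the measured one. A trivial sketch (the dose values are hypothetical):

```python
def percent_difference(measured: float, simulated: float) -> float:
    """Unsigned percent difference of a simulated dose from a measured one."""
    return abs(simulated - measured) / measured * 100.0

# Hypothetical organ doses in mGy, for illustration only.
print(round(percent_difference(12.0, 13.5), 1))  # -> 12.5
```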

  5. A comparative study of three-dimensional reconstructive images of temporomandibular joint using computed tomogram

    International Nuclear Information System (INIS)

    Lim, Suk Young; Koh, Kwang Joon

    1993-01-01

    The purpose of this study was to clarify the spatial relationship of the temporomandibular joint and to aid in the diagnosis of temporomandibular disorder. For this study, three-dimensional images of normal temporomandibular joints were reconstructed by a computer image analysis system and the three-dimensional reconstructive program integrated in computed tomography. The obtained results were as follows: 1. Two-dimensional computed tomograms had better resolution than three-dimensional computed tomograms in the evaluation of bone structure and the disk of the TMJ. 2. Direct sagittal computed tomograms and coronal computed tomograms had better resolution in the evaluation of the disk of the TMJ. 3. The positional relationship of the disk could be visualized, but the configuration of the disk could not be clearly visualized, on three-dimensional reconstructive CT images. 4. Three-dimensional reconstructive CT images had smoother margins than three-dimensional images reconstructed by the computer image analysis system, but the images of the latter had better perspective. 5. Three-dimensional reconstructive images better showed the spatial relationship of the TMJ articulation, and the joint space was more clearly visualized on dissection images.

  6. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    Science.gov (United States)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    2005-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25 percent of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  7. Computer literacy and attitudes towards e-learning among first year medical students.

    Science.gov (United States)

    Link, Thomas Michael; Marz, Richard

    2006-06-19

    At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes.

  8. Positron emission computed tomography

    International Nuclear Information System (INIS)

    Grover, M.; Schelbert, H.R.

    1985-01-01

    Regional myocardial blood flow and substrate metabolism can be non-invasively evaluated and quantified with positron emission computed tomography (Positron-CT). Tracers of exogenous glucose utilization and fatty acid metabolism are available and have been extensively tested. Specific tracer kinetic models have been developed or are being tested so that glucose and fatty acid metabolism can be measured quantitatively by Positron-CT. Tracers of amino acid and oxygen metabolism are utilized in Positron-CT studies of the brain, and the development of such tracers for cardiac studies is in progress. Methods to quantify regional myocardial blood flow are also being developed. Previous studies have demonstrated the ability of Positron-CT to document myocardial infarction. Experimental and clinical studies have begun to identify metabolic markers of reversibly ischemic myocardium. The potential of Positron-CT to reliably detect potentially salvageable myocardium and, hence, to identify appropriate therapeutic interventions is one of the most exciting applications of the technique.

  9. The traveling salesman problem a computational study

    CERN Document Server

    Applegate, David L; Chvatal, Vasek; Cook, William J

    2006-01-01

    This book presents the latest findings on one of the most intensely investigated subjects in computational mathematics--the traveling salesman problem. It sounds simple enough: given a set of cities and the cost of travel between each pair of them, the problem challenges you to find the cheapest route by which to visit all the cities and return home to where you began. Though seemingly modest, this exercise has inspired studies by mathematicians, chemists, and physicists. Teachers use it in the classroom. It has practical applications in genetics, telecommunications, and neuroscience.
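To make the problem statement concrete, here is a minimal exact solver by exhaustive enumeration (a toy sketch with made-up distances; the book's whole point is that realistic instances demand vastly more sophisticated methods such as cutting planes and branch-and-cut):

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exact TSP by enumeration: cheapest tour starting and ending at city 0.

    Feasible only for a handful of cities -- exactly why large instances
    require the advanced techniques this book describes.
    """
    n = len(dist)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

# A tiny symmetric 4-city instance (made-up distances).
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(tsp_brute_force(d))
```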

  10. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    Full Text Available The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria covers a wide variety of characteristics including, education, experience, training, professional certifications, or other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship with the characteristics of the examiners to their quality responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  11. Learning Universal Computations with Spikes

    Science.gov (United States)

    Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin

    2016-01-01

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves to substrates of powerful general purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows learning even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them. PMID:27309381

  12. Morphological Computation: Synergy of Body and Brain

    Directory of Open Access Journals (Sweden)

    Keyan Ghazi-Zahedi

    2017-08-01

    Full Text Available There are numerous examples that show how the exploitation of the body’s physical properties can lift the burden of the brain. Examples include grasping, swimming, locomotion, and motion detection. The term Morphological Computation was originally coined to describe processes in the body that would otherwise have to be conducted by the brain. In this paper, we argue for a synergistic perspective, and by that we mean that Morphological Computation is a process which requires a close interaction of body and brain. Based on a model of the sensorimotor loop, we study a new measure of synergistic information and show that it is more reliable in cases in which there is no synergistic information, compared to previous results. Furthermore, we discuss an algorithm that allows the calculation of the measure in non-trivial (non-binary) systems.

  13. Experimental and Computational Study of Ductile Fracture in Small Punch Tests

    Directory of Open Access Journals (Sweden)

    Betül Gülçimen Çakan

    2017-10-01

    Full Text Available A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  14. Experimental and Computational Study of Ductile Fracture in Small Punch Tests.

    Science.gov (United States)

    Gülçimen Çakan, Betül; Soyarslan, Celal; Bargmann, Swantje; Hähner, Peter

    2017-10-17

    A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  15. Logic as Marr's Computational Level: Four Case Studies.

    Science.gov (United States)

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.

  16. Wrist Hypothermia Related to Continuous Work with a Computer Mouse: A Digital Infrared Imaging Pilot Study

    Directory of Open Access Journals (Sweden)

    Jelena Reste

    2015-08-01

    Full Text Available Computer work is characterized by sedentary static workload with low-intensity energy metabolism. The aim of our study was to evaluate the dynamics of skin surface temperature in the hand during prolonged computer mouse work under different ergonomic setups. Digital infrared imaging of the right forearm and wrist was performed during three hours of continuous computer work (measured at the start and every 15 minutes thereafter) in a laboratory with controlled ambient conditions. Four people participated in the study. Three different ergonomic computer mouse setups were tested on three different days (horizontal computer mouse without mouse pad; horizontal computer mouse with mouse pad and padded wrist support; vertical computer mouse without mouse pad). The study revealed a significantly strong negative correlation between the temperature of the dorsal surface of the wrist and time spent working with a computer mouse. Hand skin temperature decreased markedly after one hour of continuous computer mouse work. Vertical computer mouse work preserved more stable and higher temperatures of the wrist (>30 °C), while continuous use of a horizontal mouse for more than two hours caused an extremely low temperature (<28 °C) in distal parts of the hand. The preliminary observational findings indicate the significant effect of the duration and ergonomics of computer mouse work on the development of hand hypothermia.

  17. Computer literacy among first year medical students in a developing country: A cross sectional study

    Directory of Open Access Journals (Sweden)

    Ranasinghe Priyanga

    2012-09-01

    Full Text Available Abstract Background The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. This study evaluates computer literacy among first year medical students in Sri Lanka. Methods The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August and September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in six domains: common software packages, operating systems, database management and the usage of the internet and e-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and the other covariates as independent variables. Results The sample size was 181 (response rate 95.3%); 49.7% were males. The majority of the students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge by a formal training programme (64.1%), self-learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of the students (75.7%) expressed their willingness for a formal computer training programme at the faculty. The mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). 47.9% of students scored less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province, and students owning a computer had a significantly higher mean score than other students. Conclusion Sri Lankan medical undergraduates had a low-intermediate level of computer literacy.

  18. Limits of computational white-light holography

    International Nuclear Information System (INIS)

    Mader, Sebastian; Kozacki, Tomasz; Tompkin, Wayne

    2013-01-01

    Recently, computational holograms have been used in applications where conventional holograms were previously applied. Compared to conventional holography, computational holography is based on imaging of virtual objects instead of real objects, which affords somewhat more flexibility. Here, computational holograms are calculated based on the superposition of point sources placed at the mesh vertices of arbitrary 3D models. The computed holograms have full parallax and exhibit a viewing problem that we have called "ghosting", which is linked to the viewing of computational holograms based on 3D models close to the image plane. Experimental white-light reconstruction of these holograms showed significant blurring, which is explained here based on simulations of the lateral as well as the axial resolution of a point image with respect to the source spectrum and image distance. In accordance with these simulations, an upper limit on the distance to the image plane is determined which ensures high-quality imaging.
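
    The point-source superposition described above can be sketched numerically: each mesh vertex emits a spherical wave, the waves are summed over the hologram plane, and interference with a reference wave gives the recorded intensity. The sketch below is a minimal scalar-field illustration; the grid size, pitch, wavelength, and 10° off-axis reference are assumed parameters, not values from the paper:

```python
import cmath
import math

def point_source_hologram(points, n=32, pitch=10e-6, wavelength=633e-9):
    """Hologram intensity from a superposition of spherical point sources.

    points: list of (x, y, z) source coordinates in metres (z > 0);
    the hologram plane lies at z = 0. Illustrative sketch only:
    unit source amplitudes, no occlusion handling.
    """
    k = 2 * math.pi / wavelength
    ref_angle = math.radians(10)  # assumed off-axis plane reference wave
    intensity = []
    for iy in range(n):
        y = (iy - n / 2) * pitch
        row = []
        for ix in range(n):
            x = (ix - n / 2) * pitch
            field = complex(0.0)
            for px, py, pz in points:
                r = math.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2)
                field += cmath.exp(1j * k * r) / r  # spherical wave from one vertex
            field += cmath.exp(1j * k * x * math.sin(ref_angle))  # reference wave
            row.append(abs(field) ** 2)  # recorded interference intensity
        intensity.append(row)
    return intensity
```

    Summing over all mesh vertices of a 3D model generalizes this single-plane sketch to the full-parallax holograms discussed in the record.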

  19. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    Science.gov (United States)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in Computer Science Education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  20. BOINC service for volunteer cloud computing

    International Nuclear Information System (INIS)

    Høimyr, N; Blomer, J; Buncic, P; Giovannozzi, M; Gonzalez, A; Harutyunyan, A; Jones, P L; Karneyeu, A; Marquina, M A; Mcintosh, E; Segal, B; Skands, P; Grey, F; Lombraña González, D; Zacharov, I

    2012-01-01

    For the past couple of years, a team at CERN and partners from the Citizen Cyberscience Centre (CCC) have been working on a project that enables general physics simulation programs to run in a virtual machine on volunteer PCs around the world. The project uses the Berkeley Open Infrastructure for Network Computing (BOINC) framework. Based on CERNVM and the job management framework Co-Pilot, this project was made available for public beta-testing in August 2011 with Monte Carlo simulations of LHC physics under the name “LHC at home 2.0” and the BOINC project “Test4Theory”. At the same time, CERN's efforts on volunteer computing for LHC machine studies have been intensified; this project has previously been known as LHC at home, and has been running the “Sixtrack” beam dynamics application for the LHC accelerator, using a classic BOINC framework without virtual machines. CERN-IT has set up a BOINC server cluster, and has provided and supported the BOINC infrastructure for both projects. CERN intends to evolve the setup into a generic BOINC application service that will allow scientists and engineers at CERN to profit from volunteer computing. This paper describes the experience with the two different approaches to volunteer computing as well as the status and outlook of a general BOINC service.

  1. Management of Virtual Machine as an Energy Conservation in Private Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Fauzi Akhmad

    2016-01-01

    Full Text Available Cloud computing is a service model in which pooled computing resources, placed in data centers, are accessed on demand through the Internet. Data center architectures in cloud computing environments are heterogeneous and distributed, composed of clusters of networked servers whose computing resource capacities differ across physical servers. Fluctuating demand for and availability of cloud services in the data center can be managed through abstraction with virtualization technology. A virtual machine (VM) is a representation of available computing resources that can be dynamically allocated and reallocated on demand. This study addresses VM consolidation as a means of energy conservation in private cloud computing systems, targeting optimization of the VM selection policy and VM migration within the consolidation procedure. In a cloud data center hosting a particular type of application service, each VM instance requires a different level of computing resources. Unbalanced use of computing resources by VMs on physical servers can be reduced by using live VM migration to achieve workload balancing. A practical approach was used in developing an OpenStack-based cloud computing environment, integrating the VM placement selection procedure with OpenStack Neat VM consolidation. CPU time values are sampled to obtain the average CPU utilization, in MHz, over a specific time period: the average CPU utilization of a VM is the current CPU_time minus the CPU_time from the previous data retrieval, multiplied by the maximum frequency of the CPU, and divided by the difference between the current and previous sampling times in milliseconds.
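
    The utilization calculation described at the end of the abstract can be written compactly. The sketch below assumes CPU time and sampling timestamps are both in milliseconds; the function and parameter names are illustrative, not taken from OpenStack Neat:

```python
def avg_cpu_utilization_mhz(cpu_time_now_ms, cpu_time_prev_ms,
                            t_now_ms, t_prev_ms, max_freq_mhz):
    """Average CPU utilization of a VM, in MHz, over one sampling interval.

    Follows the formula described in the abstract: CPU time consumed between
    two retrievals, scaled by the maximum CPU frequency and divided by the
    wall-clock sampling interval.
    """
    busy_ms = cpu_time_now_ms - cpu_time_prev_ms  # CPU time consumed
    interval_ms = t_now_ms - t_prev_ms            # wall-clock interval
    return busy_ms * max_freq_mhz / interval_ms
```

    For instance, a VM that consumed 500 ms of CPU time during a 1000 ms sampling interval on a 2400 MHz core averages 1200 MHz, i.e. 50% utilization of one core.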

  2. Handheld computers for self-administered sensitive data collection: A comparative study in Peru

    Directory of Open Access Journals (Sweden)

    Hughes James P

    2008-03-01

    Full Text Available Abstract Background Low-cost handheld computers (PDAs) potentially represent an efficient tool for collecting sensitive data in surveys. The goal of this study is to evaluate the quality of sexual behavior data collected with handheld computers in comparison with paper-based questionnaires. Methods A PDA-based program for data collection was developed using Open-Source tools. In two cross-sectional studies, we compared data concerning sexual behavior collected with paper forms to data collected with PDA-based forms in Ancon (Lima). Results The first study enrolled 200 participants (18–29 years). General agreement between data collected in paper format and with handheld computers was 86%. Agreement for categorical variables was between 70.5% and 98.5% (Kappa: 0.43–0.86), while agreement for numeric variables was between 57.1% and 79.8% (Spearman: 0.76–0.95). Agreement and correlation were higher in those who had completed at least high school than in those with less education. The second study enrolled 198 participants. Rates of responses to sensitive questions were similar between the two kinds of questionnaires. However, the numbers of inconsistencies (p = 0.0001) and missing values (p = 0.001) were significantly higher in paper questionnaires. Conclusion This study showed the value of handheld computers for collecting sensitive data, since a high level of agreement between paper and PDA responses was reached. In addition, fewer inconsistencies and missing values were found with the PDA-based system. This study has demonstrated that it is feasible to develop a low-cost application for handheld computers, and that PDAs are a feasible alternative for collecting field data in a developing country.
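
    The agreement statistics reported above (percent agreement and Cohen's Kappa) can be computed for paired categorical responses as follows; this is a generic sketch, not the study's actual analysis code:

```python
from collections import Counter

def cohens_kappa(responses_a, responses_b):
    """Cohen's kappa for two paired sequences of categorical responses.

    kappa = (p_observed - p_expected) / (1 - p_expected), where p_expected
    is the agreement expected by chance from each source's marginal
    category frequencies.
    """
    n = len(responses_a)
    observed = sum(a == b for a, b in zip(responses_a, responses_b)) / n
    freq_a = Counter(responses_a)
    freq_b = Counter(responses_b)
    # Chance agreement: probability both sources give the same category at random.
    expected = sum(freq_a[c] / n * freq_b[c] / n for c in freq_a)
    return (observed - expected) / (1 - expected)
```

    Perfect agreement gives kappa = 1.0; values of 0.43-0.86, as in the first study, indicate moderate to strong agreement beyond chance.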

  3. Computing in Hydraulic Engineering Education

    Science.gov (United States)

    Duan, J. G.

    2011-12-01

    Civil engineers, pioneers of our civilization, are rarely perceived as leaders and innovators in modern society because of lagging technology innovation. This crisis has resulted in a decline in the prestige of the civil engineering profession, reduced federal funding for deteriorating infrastructure, and problems with attracting the most talented high-school students. Infusing cutting-edge computer technology and stimulating creativity and innovation are therefore the critical challenges for civil engineering education. To better prepare our graduates to innovate, this paper discusses the adoption of problem-based collaborative learning techniques and the integration of civil engineering computing into a traditional civil engineering curriculum. Three interconnected courses, Open Channel Flow, Computational Hydraulics, and Sedimentation Engineering, were developed with emphasis on computational simulations. In Open Channel Flow, the focus is on principles of free-surface flow and the application of computational models. This prepares students for the second course, Computational Hydraulics, which introduces the fundamental principles of computational hydraulics, including finite difference and finite element methods. This course complements the Open Channel Flow class to provide students with an in-depth understanding of computational methods. The third course, Sedimentation Engineering, covers the fundamentals of sediment transport and river engineering, so students can apply the knowledge and programming skills gained from the previous courses to develop computational models for simulating sediment transport. These courses effectively equipped students with important skills and knowledge to complete thesis and dissertation research.
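
    As an illustration of the finite-difference methods such a Computational Hydraulics course introduces, the sketch below advances the 1D linear advection equation u_t + c·u_x = 0 with a first-order upwind scheme; the function and variable names are illustrative, not taken from the course materials:

```python
def advect_upwind(u, c, dx, dt, steps):
    """March the 1D advection equation u_t + c*u_x = 0 (c > 0) in time
    with the first-order upwind finite-difference scheme.

    u: initial values on a uniform grid; the left boundary value is held
    fixed. Stable for Courant number nu = c*dt/dx <= 1.
    """
    nu = c * dt / dx  # Courant number
    u = list(u)
    for _ in range(steps):
        # Upwind difference: information travels from the left for c > 0.
        u = [u[0]] + [u[i] - nu * (u[i] - u[i - 1]) for i in range(1, len(u))]
    return u
```

    With nu = 1 the scheme propagates the profile exactly one grid cell per step; smaller Courant numbers introduce the numerical diffusion characteristic of first-order schemes.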

  4. Heavy Lift Vehicle (HLV) Avionics Flight Computing Architecture Study

    Science.gov (United States)

    Hodson, Robert F.; Chen, Yuan; Morgan, Dwayne R.; Butler, A. Marc; Sdhuh, Joseph M.; Petelle, Jennifer K.; Gwaltney, David A.; Coe, Lisa D.; Koelbl, Terry G.; Nguyen, Hai D.

    2011-01-01

    A NASA multi-Center study team was assembled from LaRC, MSFC, KSC, JSC and WFF to examine potential flight computing architectures for a Heavy Lift Vehicle (HLV) to better understand avionics drivers. The study examined Design Reference Missions (DRMs) and vehicle requirements that could impact the vehicle's avionics. The study considered multiple self-checking and voting architectural variants and examined reliability, fault-tolerance, mass, power, and redundancy management impacts. Furthermore, a goal of the study was to develop the skills and tools needed to rapidly assess additional architectures should requirements or assumptions change.

  5. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    Science.gov (United States)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in Computer Science Education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  6. Computational study of NMDA conductance and cortical oscillations in schizophrenia

    Directory of Open Access Journals (Sweden)

    Kubra eKomek Kirli

    2014-10-01

    Full Text Available N-methyl-D-aspartate (NMDA) receptor hypofunction has been implicated in the pathophysiology of schizophrenia. The illness is also characterized by gamma oscillatory disturbances, which can be evaluated with precise frequency specificity employing auditory cortical entrainment paradigms. This computational study investigates how synaptic NMDA hypofunction may give rise to network-level oscillatory deficits as indexed by entrainment paradigms. We developed a computational model of a local cortical circuit with pyramidal cells and fast-spiking interneurons (FSIs), incorporating NMDA, α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA), and γ-aminobutyric acid (GABA) synaptic kinetics. We evaluated the effects of varying NMDA conductance onto FSIs and pyramidal cells, as well as the AMPA to NMDA ratio. We also examined the differential effects across a broad range of entrainment frequencies as a function of NMDA conductance. Varying NMDA conductance onto FSIs revealed an inverted-U relation with network gamma, whereas NMDA conductance onto the pyramidal cells had a more monotonic relationship. Varying NMDA vs. AMPA conductance onto FSIs demonstrated the necessity of AMPA in the generation of gamma, while NMDA receptors had a modulatory role. Finally, reducing NMDA conductance onto FSIs and varying the stimulus input frequency reproduced the specific reductions in the gamma range (~40 Hz) observed in schizophrenia studies. Our computational study showed that reductions in NMDA conductance onto FSIs can reproduce disturbances in entrainment to periodic stimuli within the gamma range similar to those reported in schizophrenia studies. These findings provide a mechanistic account of how specific cellular-level disturbances can give rise to circuitry-level pathophysiologic disturbances in schizophrenia.

  7. Study on the application of mobile internet cloud computing platform

    Science.gov (United States)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology promotes the application of the cloud computing platform, which is essentially the substitution and exchange of resource service models to meet users' needs for different resources after changes and adjustments in multiple aspects. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search, acquire and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization and promotion of computer technology have driven people to create digital library models, whose core idea is to strengthen the optimal management of library resource information through computers and to construct a high-performance inquiry and search platform, allowing users to access the necessary information resources at any time. Cloud computing distributes computations across a large number of distributed computers, and hence implements a connected service over multiple computers. Digital libraries, as a typical representative of cloud computing applications, can be used to carry out an analysis of the key technologies of cloud computing.

  8. Computational modelling of expressive music performance in hexaphonic guitar

    OpenAIRE

    Siquier, Marc

    2017-01-01

    Computational modelling of expressive music performance has been widely studied in the past. While previous work in this area has mainly focused on classical piano music, there has been very little work on guitar music, and such work has focused on monophonic guitar playing. In this work, we present a machine learning approach to automatically generate expressive performances from non-expressive music scores for polyphonic guitar. We treated the guitar as a hexaphonic instrument, obtaining ...

  9. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    Science.gov (United States)

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .

  10. Computational Studies of Ionic Liquids

    National Research Council Canada - National Science Library

    Boatz, Jerry

    2004-01-01

    The structures and relative energies of the six possible N-protonated structures of the 1,5-diamino-1,2,3,4-tetrazolium cation have been computed at the B3LYP(3)/6-311G(d,p) and MP2/6-311G(d,p) levels of theory...

  11. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation, a highly multi-disciplinary field with ubiquitous applications in science and engineering, is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book, but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses at different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  12. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study; that is, it demonstrates the feasibility of formal analysis through application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are essential to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog-based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria for Protection Systems for Nuclear Power Generating Stations (ANSI N42.7-1972)

  13. Advances in computer applications in radioactive tracer studies of the circulation

    International Nuclear Information System (INIS)

    Wagner, H.N. Jr.; Klingensmith, W.C. III; Knowles, L.G.; Lotter, M.G.; Natarajan, T.K.

    1977-01-01

    Advances in computer technology since the last IAEA symposium on medical radionuclide imaging have made possible a new approach to the study of physiological processes that promises to improve greatly our perception of body functions and structures. We have developed procedures, called "compressed time imaging" (CTI), that display serial images obtained over periods of minutes and hours at framing rates of approximately 16 to 60 per minute. At other times, "real" or "expanded" time imaging is used, depending on the process under study. Designed initially to study the beating heart, such multidimensional time studies are now being extended to the cerebral and other regional circulations, as well as to other organ systems. The improved imaging methods provide a new approach to space and time in the study of physiology and are supplemented by quantitative analysis of data displayed on the television screen of the computer. (author)

  14. The rheology of concentrated dispersions: structure changes and shear thickening in experiments and computer simulations

    NARCIS (Netherlands)

    Boersma, W.H.; Laven, J.; Stein, H.N.; Moldenaers, P.; Keunings, R.

    1992-01-01

    The flow-induced changes in the microstructure and rheology of very concentrated, shear-thickening dispersions are studied. Results obtained for polystyrene sphere dispersions are compared with previous data and computer simulations to give better insight into the processes occurring in the dispersions.

  15. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  16. Effectiveness of Computer-Assisted Mathematics Education (CAME) over Academic Achievement: A Meta-Analysis Study

    Science.gov (United States)

    Demir, Seda; Basol, Gülsah

    2014-01-01

    The aim of the current study is to determine the overall effects of Computer-Assisted Mathematics Education (CAME) on academic achievement. After an extensive review of the literature, studies using Turkish samples and observing the effects of Computer-Assisted Education (CAE) on mathematics achievement were examined. As a result of this…

  17. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  18. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  19. A meta-analysis of drug resistant tuberculosis in Sub-Saharan Africa: how strongly associated with previous treatment and HIV co-infection?

    Science.gov (United States)

    Berhan, Asres; Berhan, Yifru; Yizengaw, Desalegn

    2013-11-01

    In Sub-Saharan Africa, the fight against tuberculosis (TB) has encountered a great challenge because of the emergence of drug-resistant TB strains and the high prevalence of HIV infection. The aim of this meta-analysis was to determine the association of drug-resistant TB with anti-TB drug treatment history and HIV co-infection. After an electronic literature search in the Medline, HINARI, EMBASE and Cochrane Library databases, article selection and data extraction were carried out. HIV co-infection and previous history of TB treatment were used as predictors for the occurrence of resistance to any anti-TB drug or multiple-drug-resistant TB (MDR-TB). The risk ratios for each included study and for the pooled sample were computed using the random-effects model. A heterogeneity test, sensitivity analyses and funnel plots were also done. The pooled analysis showed that the risk of developing resistance to at least one anti-TB drug was about 3 times higher in individuals with a previous history of anti-TB treatment than in new TB cases. The risk of having MDR-TB in previously treated TB cases was more than 5-fold that of new TB cases. Resistance to ethambutol and rifampicin was more than fivefold higher among those previously treated with anti-TB drugs. However, HIV infection was not associated with drug-resistant TB. There was a strong association of previous anti-TB treatment with MDR-TB. Primary treatment warrants special emphasis, and screening for anti-TB drug sensitivity has to be strengthened.
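
    Random-effects pooling of risk ratios, as referred to above, is commonly done with the DerSimonian-Laird estimator. The sketch below is a generic illustration of that standard method, pooling hypothetical per-study log risk ratios and their variances, not the actual data from the included studies:

```python
import math

def dersimonian_laird_pooled_rr(log_rrs, variances):
    """Pooled risk ratio via the DerSimonian-Laird random-effects model.

    log_rrs: per-study log risk ratios; variances: their within-study
    variances. Returns the pooled risk ratio on the original scale.
    """
    k = len(log_rrs)
    w = [1.0 / v for v in variances]  # fixed-effect (inverse-variance) weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    denom = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)  # between-study variance estimate
    w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled_log = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    return math.exp(pooled_log)  # back to the risk-ratio scale
```

    When the studies agree exactly, the between-study variance estimate is zero and the pooled result reduces to the inverse-variance weighted mean.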

  20. Computer organization and design the hardware/software interface

    CERN Document Server

    Patterson, David A

    2013-01-01

The 5th edition of Computer Organization and Design moves forward into the post-PC era with new examples, exercises, and material highlighting the emergence of mobile computing and the cloud. This generational change is emphasized and explored with updated content featuring tablet computers, cloud infrastructure, and the ARM (mobile computing devices) and x86 (cloud computing) architectures. Because an understanding of modern hardware is essential to achieving good performance and energy efficiency, this edition adds a new concrete example, "Going Faster," used throughout the text to demonstrate extremely effective optimization techniques. Also new to this edition is discussion of the "Eight Great Ideas" of computer architecture. As with previous editions, a MIPS processor is the core used to present the fundamentals of hardware technologies, assembly language, computer arithmetic, pipelining, memory hierarchies and I/O. Optimization techniques are featured throughout the text, and parallelism is covered in depth with…

  1. A New Soft Computing Method for K-Harmonic Means Clustering.

    Science.gov (United States)

    Yeh, Wei-Chang; Jiang, Yunzhi; Chen, Yee-Fen; Chen, Zhe

    2016-01-01

    The K-harmonic means clustering algorithm (KHM) is a new clustering method used to group data such that the sum of the harmonic averages of the distances between each entity and all cluster centroids is minimized. Because it is less sensitive to initialization than K-means (KM), many researchers have recently been attracted to studying KHM. In this study, the proposed iSSO-KHM is based on an improved simplified swarm optimization (iSSO) and integrates a variable neighborhood search (VNS) for KHM clustering. As evidence of the utility of the proposed iSSO-KHM, we present extensive computational results on eight benchmark problems. From the computational results, the comparison appears to support the superiority of the proposed iSSO-KHM over previously developed algorithms for all experiments in the literature.
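The KHM objective described above (the sum, over all entities, of the harmonic average of distances to the K centroids) is straightforward to write down. A minimal pure-Python sketch with made-up 1-D data; the iSSO/VNS optimization itself is not reproduced here:

```python
def khm_objective(points, centroids, p=2):
    """K-harmonic means objective: for each point, the harmonic
    average of its p-th-power distances to all centroids, summed
    over all points. Smaller is better."""
    k = len(centroids)
    total = 0.0
    for x in points:
        # Harmonic average = k / sum of reciprocal distances;
        # the epsilon guards against a point sitting on a centroid.
        inv = sum(1.0 / max(abs(x - c) ** p, 1e-12) for c in centroids)
        total += k / inv
    return total

# Two tight 1-D clusters around 0.1 and 1.0; centroids placed on the
# clusters score better than centroids crowded in the middle.
points = [0.0, 0.2, 0.9, 1.1]
print(khm_objective(points, [0.1, 1.0]) < khm_objective(points, [0.5, 0.6]))  # → True
```

An optimizer such as the proposed iSSO would search centroid positions to minimize exactly this quantity; the harmonic averaging is what makes KHM less sensitive to initialization than plain K-means.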

  2. Undergraduate students’ challenges with computational modelling in physics

    Directory of Open Access Journals (Sweden)

    Simen A. Sørby

    2012-12-01

In recent years, computational perspectives have become an essential part of several of the University of Oslo's natural science programmes. In this paper we discuss some main findings from a qualitative study of the computational perspectives' impact on students' work with their first course in physics – mechanics – and their learning and meaning-making of its contents. Discussions of the students' learning of physics are based on sociocultural theory, which originates in Vygotsky and Bakhtin, and on subsequent physics education research. Results imply that the greatest challenge for students when working with computational assignments is to combine knowledge from previously known, but separate, contexts. Integrating knowledge of informatics, numerical and analytical mathematics, and conceptual understanding of physics appears as a clear challenge for the students. We also observe a lack of awareness concerning the limitations of physical modelling. The students need help with identifying the appropriate knowledge system, or "tool set", for the different tasks at hand; they need help to create a plan for their modelling and to become aware of its limits. In light of this, we propose that an instructive and dialogic text as a basis for the exercises, in which the emphasis is on specification, clarification and elaboration, would potentially be of great aid for students who are new to computational modelling.

  3. Computer users' risk factors for developing shoulder, elbow and back symptoms

    DEFF Research Database (Denmark)

    Juul-Kristensen, Birgit; Søgaard, Karen; Strøyer, Jesper

    2004-01-01

OBJECTIVES: This prospective study concentrated on determining factors of computer work that predict musculoskeletal symptoms in the shoulder, elbow, and low-back regions. METHODS: A questionnaire on ergonomics, work pauses, work techniques, and psychosocial and work factors was delivered to 5033 office workers at baseline in early 1999 (response rate 69%) and to 3361 respondents at the time of the follow-up in late 2000 (response rate 77%). An increased frequency or intensity of symptoms was the outcome variable, including only nonsymptomatic respondents from the baseline questionnaire (symptom…). … previous symptoms were a significant predictor for symptoms in all regions. Computer worktime and psychosocial dimensions were not significant predictors. CONCLUSIONS: Influence on work pauses, reduction of glare or reflection, and screen height are important factors in the design of future computer…

  4. Muoniated radical states in the group 16 elements: Computational studies

    International Nuclear Information System (INIS)

    Macrae, Roderick M.

    2009-01-01

    Recent experimental studies on positive muon implantation in silicon, selenium, and tellurium have been interpreted on the basis that the primary paramagnetic species observed is XMu (X=S, Se, or Te), the muonium-substituted analog of the appropriate diatomic chalcogen monohydride radical. However, temperature-dependent signal visibility, broadening, and hyperfine shift effects remain puzzling. The interplay of degeneracy, spin-orbit coupling, and vibrational averaging in these species makes them computationally challenging despite their small size. In this work computational studies are carried out on all hydrogen isotopomers of the series OH, SH, SeH, and TeH. Several different methodological approaches are compared, and the effects of wavefunction symmetry, spin-orbit coupling, and zero-point vibrational corrections on the isotropic and anisotropic components of the hyperfine interaction are examined. Additionally, some models of the Mu site in rhombic sulfur are briefly considered.

  5. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    Science.gov (United States)

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  6. An Exploratory Study of the Implementation of Computer Technology in an American Islamic Private School

    Science.gov (United States)

    Saleem, Mohammed M.

    2009-01-01

    This exploratory study of the implementation of computer technology in an American Islamic private school leveraged the case study methodology and ethnographic methods informed by symbolic interactionism and the framework of the Muslim Diaspora. The study focused on describing the implementation of computer technology and identifying the…

  7. Computer Architecture Techniques for Power-Efficiency

    CERN Document Server

    Kaxiras, Stefanos

    2008-01-01

In the last few years, power dissipation has become an important design constraint, on par with performance, in the design of new computer systems. Whereas in the past, the primary job of the computer architect was to translate improvements in operating frequency and transistor count into performance, now power efficiency must be taken into account at every step of the design process. While for some time, architects have been successful in delivering 40% to 50% annual improvement in processor performance, costs that were previously brushed aside eventually caught up. The most critical of these…

  8. Predictive factors for the development of diabetes in women with previous gestational diabetes mellitus

    DEFF Research Database (Denmark)

    Damm, P.; Kühl, C.; Bertelsen, Aksel

    1992-01-01

OBJECTIVES: The purpose of this study was to determine the incidence of diabetes in women with previous dietary-treated gestational diabetes mellitus and to identify predictive factors for development of diabetes. STUDY DESIGN: Two to 11 years post partum, glucose tolerance was investigated in 241 women with previous dietary-treated gestational diabetes mellitus and 57 women without previous gestational diabetes mellitus (control group). RESULTS: Diabetes developed in 42 (17.4%) women with previous gestational diabetes mellitus (3.7% insulin-dependent diabetes mellitus and 13.7% non…). … of previous patients with gestational diabetes mellitus in whom plasma insulin was measured during an oral glucose tolerance test in late pregnancy, a low insulin response at diagnosis was found to be an independent predictive factor for diabetes development. CONCLUSIONS: Women with previous dietary…

  9. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schütler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVI

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  10. A study of visual and musculoskeletal health disorders among computer professionals in NCR Delhi

    Directory of Open Access Journals (Sweden)

    Talwar Richa

    2009-01-01

Objective: To study the prevalence of health disorders among computer professionals and their association with working environment conditions. Study design: Cross-sectional. Materials and Methods: A sample of 200 computer professionals from Delhi and the NCR, including software developers, call centre workers, and data entry workers. Result: The prevalence of visual problems in the study group was 76% (152/200), and musculoskeletal problems were reported by 76.5% (153/200). Visual complaints increased gradually with the number of hours spent working on computers daily, and the same relation held for musculoskeletal problems. Visual problems were fewer in persons using an antiglare screen and in those with adequate lighting in the room. Musculoskeletal problems were significantly less common among those using cushioned chairs and soft keypads. Conclusion: A significant proportion of the computer professionals were found to have health problems, which denotes that the occupational health of people working in the computer field needs to be emphasized as a field of concern in occupational health.

  11. Concatenated codes for fault tolerant quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E.; Laflamme, R.; Zurek, W.

    1995-05-01

The application of concatenated codes to fault tolerant quantum computing is discussed. We have previously shown that for quantum memories and quantum communication, a state can be transmitted with error ε provided each gate has error at most cε. We show how this can be used with Shor's fault tolerant operations to reduce the accuracy requirements when maintaining states not currently participating in the computation. Viewing Shor's fault tolerant operations as a method for reducing the error of operations, we give a concatenated implementation which promises to propagate the reduction hierarchically. This has the potential of reducing the accuracy requirements in long computations.

  12. Exploration of cloud computing late start LDRD #149630 : Raincoat. v. 2.1.

    Energy Technology Data Exchange (ETDEWEB)

    Echeverria, Victor T.; Metral, Michael David; Leger, Michelle A.; Gabert, Kasimir Georg; Edgett, Patrick Garrett; Thai, Tan Q.

    2010-09-01

    This report contains documentation from an interoperability study conducted under the Late Start LDRD 149630, Exploration of Cloud Computing. A small late-start LDRD from last year resulted in a study (Raincoat) on using Virtual Private Networks (VPNs) to enhance security in a hybrid cloud environment. Raincoat initially explored the use of OpenVPN on IPv4 and demonstrates that it is possible to secure the communication channel between two small 'test' clouds (a few nodes each) at New Mexico Tech and Sandia. We extended the Raincoat study to add IPSec support via Vyatta routers, to interface with a public cloud (Amazon Elastic Compute Cloud (EC2)), and to be significantly more scalable than the previous iteration. The study contributed to our understanding of interoperability in a hybrid cloud.

  13. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  14. An agent-based computational model for tuberculosis spreading on age-structured populations

    Science.gov (United States)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

In this work we present an agent-based computational model to study the spreading of the tuberculosis (TB) disease in age-structured populations. The proposed model is a merge of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model for biological aging. The combination of TB with population aging reproduces the coexistence of health states seen in real populations. In addition, the universal exponential behavior of mortality curves is still preserved. Finally, the population distribution as a function of age shows the prevalence of TB mostly in elders, for high-efficacy treatments.

  15. [Computer-assisted navigation in orthognathic surgery. Application to Le Fort I osteotomy].

    Science.gov (United States)

    Benassarou, M; Benassarou, A; Meyer, C

    2013-08-05

    Computer-assisted navigation is a tool that allows the surgeon to reach intraoperatively a previously defined target. This technique can be applied to the positioning of bone fragments in orthognathic surgery. It is not used routinely yet because there are no specifically dedicated systems available on the market for this kind of surgery. The goal of our study was to describe the various systems that could be used in orthognathic surgery and to report our experience of computer-assisted surgery in the positioning of the maxilla during maxillomandibular osteotomies. Copyright © 2013. Published by Elsevier Masson SAS.

  16. Computational study of the heat transfer of an avian egg in a tray.

    Science.gov (United States)

    Eren Ozcan, S; Andriessens, S; Berckmans, D

    2010-04-01

    The development of an embryo in an avian egg depends largely on its temperature. The embryo temperature is affected by its environment and the heat produced by the egg. In this paper, eggshell temperature and the heat transfer characteristics from one egg in a tray toward its environment are studied by means of computational fluid dynamics (CFD). Computational fluid dynamics simulations have the advantage of providing extensive 3-dimensional information on velocity and eggshell temperature distribution around an egg that otherwise is not possible to obtain by experiments. However, CFD results need to be validated against experimental data. The objectives were (1) to find out whether CFD can successfully simulate eggshell temperature from one egg in a tray by comparing to previously conducted experiments, (2) to visualize air flow and air temperature distribution around the egg in a detailed way, and (3) to perform sensitivity analysis on several variables affecting heat transfer. To this end, a CFD model was validated using 2 sets of temperature measurements yielding an effective model. From these simulations, it can be concluded that CFD can effectively be used to analyze heat transfer characteristics and eggshell temperature distribution around an egg. In addition, air flow and temperature distribution around the egg are visualized. It has been observed that temperature differences up to 2.6 degrees C are possible at high heat production (285 mW) and horizontal low flow rates (0.5 m/s). Sensitivity analysis indicates that average eggshell temperature is mainly affected by the inlet air velocity and temperature, flow direction, and the metabolic heat of the embryo and less by the thermal conductivity and emissivity of the egg and thermal emissivity of the tray.

  17. Frictional lichenified dermatosis from prolonged use of a computer mouse: Case report and review of the literature of computer-related dermatoses.

    Science.gov (United States)

    Ghasri, Pedram; Feldman, Steven R

    2010-12-15

    Despite the increasing reliance on computers and the associated health risks, computer-related dermatoses remain under-represented in the literature. This term collectively refers to four groups of cutaneous pathologies: 1) allergic contact dermatitis from exposure to certain chemicals in computer accessories, 2) various friction-induced hand lesions resulting from prolonged computer use, 3) erythema ab igne from placement of the laptop on the skin, and 4) "screen dermatitis" from excessive exposure to visual display terminals (VDTs). Within this review we also present a case of a friction-induced lichenified dermatosis in the dominant wrist of a 24-year-old female that was caused by excessive use of her computer mouse. More importantly, we review the literature of all previously reported cases of computer-related dermatoses, so as to promote recognition and appropriate management by both patients and physicians.

  18. Computed Tomography Features of Benign and Malignant Calcified Thyroid Nodules: A Single-Center Study.

    Science.gov (United States)

    Kim, Donghyun; Kim, Dong Wook; Heo, Young Jin; Baek, Jin Wook; Lee, Yoo Jin; Park, Young Mi; Baek, Hye Jin; Jung, Soo Jin

    No previous studies have investigated thyroid calcification on computed tomography (CT) quantitatively by using Hounsfield unit (HU) values. This study aimed to analyze quantitative HU values of thyroid calcification on preoperative neck CT and to assess the characteristics of benign and malignant calcified thyroid nodules (CTNs). Two hundred twenty patients who underwent neck CT before thyroid surgery from January 2015 to June 2016 were included. On soft-tissue window CT images, CTNs with calcified components of 3 mm or larger in minimum diameter were included in this study. The HU values and types of CTNs were determined and analyzed. Of 61 CTNs in 49 patients, there were 42 malignant nodules and 19 benign nodules. The mean largest diameter of the calcified component was 5.3 (2.5) mm (range, 3.1-17.1 mm). A statistically significant difference was observed in the HU values of calcified portions between benign and malignant CTNs, whereas there was no significant difference in patient age or sex or in the size, location, or type of each CTN. Of the 8 CTNs with pure calcification, 3 exhibited a honeycomb pattern on bone window CT images, and these 3 CTNs were all diagnosed as papillary thyroid carcinoma on histopathological examination. Hounsfield unit values of CTNs may be helpful for differentiating malignancy from benignity.

  19. 49 CFR 173.23 - Previously authorized packaging.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Previously authorized packaging. 173.23 Section... REQUIREMENTS FOR SHIPMENTS AND PACKAGINGS Preparation of Hazardous Materials for Transportation § 173.23 Previously authorized packaging. (a) When the regulations specify a packaging with a specification marking...

  20. Effects of Belongingness and Synchronicity on Face-to-Face and Computer-Mediated Online Cooperative Pedagogy

    Science.gov (United States)

    Saltarelli, Andrew John

    2012-01-01

    Previous research suggests asynchronous online computer-mediated communication (CMC) has deleterious effects on certain cooperative learning pedagogies (e.g., constructive controversy), but the processes underlying this effect and how it may be ameliorated remain unclear. This study tests whether asynchronous CMC thwarts belongingness needs…

  1. Educational Computer Use in Leisure Contexts: A Phenomenological Study of Adolescents' Experiences at Internet Cafes

    Science.gov (United States)

    Cilesiz, Sebnem

    2009-01-01

    Computer use is a widespread leisure activity for adolescents. Leisure contexts, such as Internet cafes, constitute specific social environments for computer use and may hold significant educational potential. This article reports a phenomenological study of adolescents' experiences of educational computer use at Internet cafes in Turkey. The…

  2. Efficient 3D geometric and Zernike moments computation from unstructured surface meshes.

    Science.gov (United States)

    Pozo, José María; Villa-Uriol, Maria-Cruz; Frangi, Alejandro F

    2011-03-01

This paper introduces and evaluates a fast exact algorithm and a series of faster approximate algorithms for the computation of 3D geometric moments from an unstructured surface mesh of triangles. Being based on the object surface reduces the computational complexity of these algorithms with respect to volumetric grid-based algorithms. In contrast, it can only be applied for the computation of geometric moments of homogeneous objects. This advantage and restriction is shared with other proposed algorithms based on the object boundary. The proposed exact algorithm reduces the computational complexity for computing geometric moments up to order N with respect to previously proposed exact algorithms, from N^9 to N^6. The approximate series algorithm appears as a power series on the ratio between triangle size and object size, which can be truncated at any desired degree. The higher the number and quality of the triangles, the better the approximation. This approximate algorithm reduces the computational complexity to N^3. In addition, the paper introduces a fast algorithm for the computation of 3D Zernike moments from the computed geometric moments, with a computational complexity of N^4, while the previously proposed algorithm is of order N^6. The error introduced by the proposed approximate algorithms is evaluated on different shapes, and the cost-benefit ratio in terms of error and computational time is analyzed for different moment orders.
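As a minimal illustration of the boundary-based idea underlying these algorithms (the paper's exact and approximate schemes are considerably more involved), the zeroth-order geometric moment, i.e. the volume of a closed, consistently oriented triangle mesh, can be accumulated triangle by triangle via the divergence theorem:

```python
def mesh_moment0(triangles):
    """Zeroth geometric moment (volume) of a closed, outward-oriented
    triangle mesh: sum of the signed volumes of the tetrahedra formed
    by each triangle (a, b, c) and the origin."""
    vol = 0.0
    for (a, b, c) in triangles:
        # Signed tetra volume = scalar triple product a . (b x c) / 6
        vol += (a[0]*(b[1]*c[2] - b[2]*c[1])
              - a[1]*(b[0]*c[2] - b[2]*c[0])
              + a[2]*(b[0]*c[1] - b[1]*c[0])) / 6.0
    return vol

# Unit right tetrahedron with outward-facing triangles; volume = 1/6
O, X, Y, Z = (0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)
tets = [(O, Z, Y), (O, X, Z), (O, Y, X), (X, Y, Z)]
print(abs(mesh_moment0(tets) - 1/6) < 1e-12)  # → True
```

Higher-order moments follow the same surface-integral pattern with polynomial integrands over each triangle, which is where the N^6 exact and N^3 approximate algorithms of the paper come in.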

  3. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged ..... I am still riding the cloud … I hope it lasts. .... as a way of creating a climate and culture in schools where individuals are willing to explore.

  4. Approaching gender parity: Women in computer science at Afghanistan's Kabul University

    Science.gov (United States)

    Plane, Jandelyn

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in Afghanistan, they appear to hinder advancement to degree to a lesser extent. Women comprise at least 36% of each graduating class from KU's Computer Science Department; however, in 2007 women were 25% of the university population. In the US, women comprise over 50% of university populations while only graduating on average 25% women in undergraduate computer science programs. Representation of women in computer science in the US is 50% below the university rate, but at KU, it is 50% above the university rate. This mixed methods study of KU was conducted in the following three stages: setting up focus groups with women computer science students, distributing surveys to all students in the CS department, and conducting a series of 22 individual interviews with fourth year CS students. The analysis of the data collected and its comparison to literature on university/department retention in Science, Technology, Engineering and Mathematics gender representation and on women's education in underdeveloped Islamic countries illuminates KU's uncharacteristic representation of women in its Computer Science Department. The retention of women in STEM through the education pipeline has several characteristics in Afghanistan that differ from countries often studied in available literature. Few Afghan students have computers in their home and few have training beyond secretarial applications before considering studying CS at university. 
University students in Afghanistan are selected based on placement exams and are then assigned to an area of study, and are financially supported throughout their academic career, resulting in a low attrition rate.

  5. 22 CFR 40.91 - Certain aliens previously removed.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  6. Is email a reliable means of contacting authors of previously published papers? A study of the Emergency Medicine Journal for 2001.

    Science.gov (United States)

    O'Leary, F

    2003-07-01

    To determine whether it is possible to contact authors of previously published papers via email. A cross sectional study of the Emergency Medicine Journal for 2001. 118 articles were included in the study. The response rate from those with valid email addresses was 73%. There was no statistical difference between the type of email address used and the address being invalid (p=0.392) or between the type of article and the likelihood of a reply (p=0.197). More responses were obtained from work addresses when compared with Hotmail addresses (86% v 57%, p=0.02). Email is a valid means of contacting authors of previously published articles, particularly within the emergency medicine specialty. A work based email address may be a more valid means of contact than a Hotmail address.

  7. Efficient 2-D DCT Computation from an Image Representation Point of View

    OpenAIRE

    Papakostas, G.A.; Koulouriotis, D.E.; Karakasis, E.G.

    2009-01-01

A novel methodology that ensures the computation of 2-D DCT coefficients in gray-scale as well as binary images, at high computation rates, was presented in the previous sections. Through a new image representation scheme, called ISR (Image Slice Representation), the 2-D DCT coefficients can be computed in significantly reduced time, with the same accuracy.
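For reference, the coefficients in question are those of the standard 2-D DCT-II; a naive pure-Python implementation (not the ISR scheme, which is what accelerates this computation) makes the definition concrete:

```python
import math

def dct2(img):
    """Naive orthonormal 2-D DCT-II of a small square image.
    Illustrates the coefficients the ISR method computes; the
    direct evaluation below costs O(n^4) for an n x n image."""
    n = len(img)
    def alpha(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(img[x][y]
                    * math.cos((2*x + 1) * u * math.pi / (2*n))
                    * math.cos((2*y + 1) * v * math.pi / (2*n))
                    for x in range(n) for y in range(n))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

# For a constant (binary) image, all energy lands in the DC coefficient
coeffs = dct2([[1] * 4 for _ in range(4)])
print(round(coeffs[0][0], 6))  # DC = n * mean = 4.0
```

The ISR idea exploits the structure of slices of the image to avoid evaluating this full double sum per coefficient, while producing identical values.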

  8. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Cheng, Siu-Wing

    2014-09-01

We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomized, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε)).

  9. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Mencel, Liam A.

    2014-05-06

    We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomised, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε))

  10. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Cheng, Siu-Wing; Mencel, Liam A.; Vigneron, Antoine E.

    2014-01-01

We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomized, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε)).

  11. Defining Effectiveness Using Finite Sets A Study on Computability

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Haeusler, Edward H.; Garcia, Alex

    2016-01-01

    finite sets and uses category theory as its mathematical foundations. The model relies on the fact that every function between finite sets is computable, and that the finite composition of such functions is also computable. Our approach is an alternative to the traditional model-theoretical based works...... which rely on (ZFC) set theory as a mathematical foundation, and our approach is also novel when compared to the already existing works using category theory to approach computability results. Moreover, we show how to encode Turing machine computations in the model, thus concluding the model expresses...

  12. Does the patients' educational level and previous counseling affect their medication knowledge?

    Directory of Open Access Journals (Sweden)

    Abdulmalik M Alkatheri

    2013-01-01

    Conclusions: The education level of the patient and previous counseling are positively linked to medication knowledge. Knowledge of the medications' side effects proved to be the most difficult task for the participants in this study, requiring the highest level of education, and was improved by previous counseling.

  13. Parallel computation of rotating flows

    DEFF Research Database (Denmark)

    Lundin, Lars Kristian; Barker, Vincent A.; Sørensen, Jens Nørkær

    1999-01-01

    This paper deals with the simulation of 3‐D rotating flows based on the velocity‐vorticity formulation of the Navier‐Stokes equations in cylindrical coordinates. The governing equations are discretized by a finite difference method. The solution is advanced to a new time level by a two‐step process....... In the first step, the vorticity at the new time level is computed using the velocity at the previous time level. In the second step, the velocity at the new time level is computed using the new vorticity. We discuss here the second part which is by far the most time‐consuming. The numerical problem...

  14. Use of cone-beam computed tomography in diagnosing and treating endodontic treatment failure: A case study

    Directory of Open Access Journals (Sweden)

    Gloria Lee

    2017-01-01

    The use of cone-beam computed tomography (CBCT) as a complementary imaging modality applies to various clinical situations that may pose diagnostic challenges when conventional two-dimensional radiographs are used alone. These challenges include, but are not limited to, locating missed canals in endodontic retreatment and diagnosing lesions such as resorption, periapical bone defects, root fractures, and perforations. In this study, we present a case of asymptomatic apical periodontitis that was found incidentally on a panoramic radiograph. Analyses based on panoramic and periapical radiographs and clinical examinations were insufficient for a definitive diagnosis, which necessitated the use of CBCT. The CBCT scan allowed identification of the cause of the apical disease, an unfilled mesiolingual canal in a previously root-canal-treated left mandibular second molar, as well as the extent of the lesion. We also explore the diagnostic challenges of using traditional two-dimensional radiographs alone, the difficulty of locating root canals in mandibular second molars, and the risks and benefits of using CBCT.

  15. Longitudinal Study of Factors Impacting the Implementation of Notebook Computer Based CAD Instruction

    Science.gov (United States)

    Goosen, Richard F.

    2009-01-01

    This study provides information for higher education leaders that have or are considering conducting Computer Aided Design (CAD) instruction using student owned notebook computers. Survey data were collected during the first 8 years of a pilot program requiring engineering technology students at a four year public university to acquire a notebook…

  16. Serendipity? Are There Gender Differences in the Adoption of Computers? A Case Study.

    Science.gov (United States)

    Vernon-Gerstenfeld, Susan

    1989-01-01

    Discusses a study about the effect of learning styles of patent examiners on adoption of computers. Subjects' amount of computer use was a function of learning style, age, comfort after training, and gender. Findings indicate that women showed a greater propensity to adopt than men. Discusses implications for further research. (JS)

  17. High throughput computing: a solution for scientific analysis

    Science.gov (United States)

    O'Donnell, M.

    2011-01-01

    Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data. The additional data and the volume of computations needed to analyze it require expanded computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:

  18. Costs of cloud computing for a biometry department. A case study.

    Science.gov (United States)

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units that have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify, are not included in this comparison, but some rough guidance from our experience is given nevertheless. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in lower costs than hourly rental in the cloud, depending on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and the subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact, as it determines the actual computing hours needed per year. Taking this into account, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.
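    The comparison in this abstract boils down to a break-even calculation between amortized on-site costs and an hourly cloud rate. A minimal sketch follows; every number in it is hypothetical and chosen only for illustration, not taken from the study:

```python
# Hypothetical figures for illustration only; the study's actual numbers differ.
onsite_tco_per_year = 3_500.0  # total cost of ownership of one on-site node
hours_per_year = 8_760
cloud_rate_per_hour = 2.0      # assumed rate for a large-memory cloud instance

def onsite_cost_per_used_hour(utilization):
    """Effective hourly price of on-site hardware at a utilization in (0, 1]."""
    return onsite_tco_per_year / (hours_per_year * utilization)

# Break-even utilization: below this, pay-as-you-go cloud is cheaper;
# above it, the amortized on-site hardware wins.
break_even = onsite_tco_per_year / (hours_per_year * cloud_rate_per_hour)
print(round(break_even, 3))
print(onsite_cost_per_used_hour(0.27) < cloud_rate_per_hour)
```

    With these assumed figures, break-even sits near 20% utilization, so at the 25-30% utilization reported in the case study the on-site effective hourly rate falls below the cloud rate, consistent with the abstract's qualitative conclusion.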

  19. SONOGRAPHIC PREDICTION OF SCAR DEHISCENCE IN WOMEN WITH PREVIOUS CAESAREAN SECTION

    Directory of Open Access Journals (Sweden)

    Shubhada Suhas Jajoo

    2018-01-01

    BACKGROUND Caesarean section (sectio caesarea) is a surgical method for the completion of delivery. After various historical modifications of operative technique, the modern approach consists of a transverse incision in the anterior wall of the uterus. The rate of vaginal birth after caesarean section has fallen significantly from year to year, while the rate of repeat caesarean section has increased over the past 10 years. Scar thickness is evaluated by ultrasound, but the thickness that should serve as the guiding "cut-off value" for choosing the mode of delivery is still debated. To better assess the risk of uterine rupture, some authors have proposed sonographic measurement of lower uterine segment (LUS) thickness near term, assuming an inverse correlation between LUS thickness and the risk of uterine scar defect. This assessment may therefore increase safety during labour in women with a prior CS by selecting those with the lowest risk of uterine rupture. The aim of this study is to assess the diagnostic accuracy of sonographic measurement of LUS thickness near term in predicting uterine scar defects in women with a prior Caesarean Section (CS), and to ascertain the best cut-off values for predicting uterine rupture. MATERIALS AND METHODS 100 antenatal women with a history of one previous LSCS attending the antenatal clinic were assessed for scar thickness by transabdominal ultrasonography, and the measurements were correlated with intraoperative findings. This prospective longitudinal study was conducted over 1 year after IEC approval. The inclusion criterion was one previous LSCS; exclusion criteria were (1) a previous myomectomy scar, (2) two previous LSCS, and (3) a previous hysterotomy scar. RESULTS Our findings indicate a strong association between the degree of LUS thinning measured near term and the risk of uterine scar defect at birth. In our study, the optimal cut-off value for predicting

  20. Limits on efficient computation in the physical world

    Science.gov (United States)

    Aaronson, Scott Joel

    More than a speculative technology, quantum computing seems to challenge our most basic intuitions about how the physical world should behave. In this thesis I show that, while some intuitions from classical computer science must be jettisoned in the light of modern physics, many others emerge nearly unscathed; and I use powerful tools from computational complexity theory to help determine which are which. In the first part of the thesis, I attack the common belief that quantum computing resembles classical exponential parallelism, by showing that quantum computers would face serious limitations on a wider range of problems than was previously known. In particular, any quantum algorithm that solves the collision problem---that of deciding whether a sequence of n integers is one-to-one or two-to-one---must query the sequence Ω(n^(1/5)) times. This resolves a question that was open for years; previously no lower bound better than constant was known. A corollary is that there is no "black-box" quantum algorithm to break cryptographic hash functions or solve the Graph Isomorphism problem in polynomial time. I also show that relative to an oracle, quantum computers could not solve NP-complete problems in polynomial time, even with the help of nonuniform "quantum advice states"; and that any quantum algorithm needs Ω(2^(n/4)/n) queries to find a local minimum of a black-box function on the n-dimensional hypercube. Surprisingly, the latter result also leads to new classical lower bounds for the local search problem. Finally, I give new lower bounds on quantum one-way communication complexity, and on the quantum query complexity of total Boolean functions and recursive Fourier sampling. The second part of the thesis studies the relationship of the quantum computing model to physical reality. I first examine the arguments of Leonid Levin, Stephen Wolfram, and others who believe quantum computing to be fundamentally impossible. I find their arguments unconvincing without a "Sure

  1. Computing requirements for S.S.C. accelerator design and studies

    International Nuclear Information System (INIS)

    Dragt, A.; Talman, R.; Siemann, R.; Dell, G.F.; Leemann, B.; Leemann, C.; Nauenberg, U.; Peggs, S.; Douglas, D.

    1984-01-01

    We estimate the computational hardware resources that will be required for accelerator physics studies during the design of the Superconducting SuperCollider. It is found that both Class IV and Class VI facilities (1) will be necessary. We describe a user environment for these facilities that is desirable within the context of accelerator studies. An acquisition scenario for these facilities is presented

  2. A computer-based time study system for timber harvesting operations

    Science.gov (United States)

    Jingxin Wang; Joe McNeel; John Baumgras

    2003-01-01

    A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system resides on the MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, data transfer interface, and data storage...

  3. Study of tip loss corrections using CFD rotor computations

    DEFF Research Database (Denmark)

    Shen, Wen Zhong; Zhu, Wei Jun; Sørensen, Jens Nørkær

    2014-01-01

    Tip loss correction is known to play an important role for engineering prediction of wind turbine performance. There are two different types of tip loss corrections: tip corrections on momentum theory and tip corrections on airfoil data. In this paper, we study the latter using detailed CFD...... computations for wind turbines with sharp tip. Using the technique of determination of angle of attack and the CFD results for a NordTank 500 kW rotor, airfoil data are extracted and a new tip loss function on airfoil data is derived. To validate, BEM computations with the new tip loss function are carried out...... and compared with CFD results for the NordTank 500 kW turbine and the NREL 5 MW turbine. Comparisons show that BEM with the new tip loss function can predict correctly the loading near the blade tip....
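    The record distinguishes tip corrections on momentum theory from corrections on airfoil data but does not give the paper's new function. For orientation, the classical Prandtl tip loss factor, the standard momentum-theory correction the new approach is contrasted with, can be sketched; this is a textbook formula, not taken from the paper:

```python
import math

def prandtl_tip_loss(B, r, R, phi):
    """Classical Prandtl tip loss factor F (textbook form, not the
    paper's new airfoil-data correction).

    B: number of blades, r: local radius, R: tip radius,
    phi: local flow angle in radians.
    """
    f = B * (R - r) / (2.0 * r * math.sin(phi))
    return (2.0 / math.pi) * math.acos(math.exp(-f))

# F tends to 1 inboard and to 0 approaching the blade tip.
print(round(prandtl_tip_loss(3, 0.5, 1.0, 0.1), 3))
print(round(prandtl_tip_loss(3, 0.999, 1.0, 0.1), 3))
```

    In BEM codes this factor multiplies the momentum-theory loading; the paper's contribution is instead a loss function applied to the airfoil data, derived from CFD-extracted angles of attack.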

  4. Comparative study of scintigraphy, ultrasonography and computed tomography in the evaluation of liver tumours

    International Nuclear Information System (INIS)

    Tohyama, Junko; Ishigaki, Takeo; Ishikawa, Tsutomu

    1982-01-01

    A comparative study of scintigraphy, ultrasonography and computed tomography in 67 proven patients with clinically suspected liver tumours is reported. Scintigraphy was superior in sensitivity to ultrasonography and computed tomography; in specificity, however, it was inferior to the other two. The diagnostic efficacy of ultrasonography and computed tomography in detecting focal masses of the liver did not differ greatly, and simultaneous interpretation of the ultrasonogram and computed tomogram was more helpful than independent interpretation, so the two were considered complementary. In conclusion, scintigraphy was thought to be the initial procedure in the diagnostic approach to focal liver masses, with ultrasonography as the second procedure because it poses no radiation hazard, and computed tomography following thereafter. (author)

  5. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
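    The abstract describes ordering job submissions by predicted runtime to cut hourly-billed cloud costs. A much-simplified sketch of why submission order matters under hourly billing follows; the runtimes, instance count, and greedy policy are invented for illustration and are not Roundup's actual model:

```python
import heapq
import math

def billed_hours(jobs, workers):
    """Assign jobs (estimated runtimes in hours) in the given order to the
    least-loaded of `workers` instances; with hourly billing each instance
    is charged ceil(final load) hours."""
    loads = [0.0] * workers
    heapq.heapify(loads)
    for t in jobs:
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return sum(math.ceil(load) for load in loads)

# Invented runtime estimates (hours) for eight genome-comparison jobs.
jobs = [0.2, 3.5, 0.4, 2.9, 0.3, 1.1, 0.6, 2.2]

# Submitting the longest jobs first packs instances more evenly than an
# arbitrary order, lowering the total hourly bill.
print(billed_hours(sorted(jobs, reverse=True), 3))  # longest-first
print(billed_hours(jobs, 3))                        # original order
```

    Even on this toy instance the longest-first ordering saves a billed hour; at the scale of 245,323 genome-to-genome comparisons, such packing effects plausibly account for cost differences of the magnitude the authors report.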

  6. Persistent Neck and Shoulder Pains among Computer Office Workers: A Longitudinal Study

    Directory of Open Access Journals (Sweden)

    Farideh Sadeghian

    2012-11-01

    Please cite this article as: Sadeghian F, Raei M, Amiri M. Persistent Neck and Shoulder Pains among Computer Office Workers: A Longitudinal Study. Arch Hyg Sci 2012;1(2):33-40. Background & Aims of the Study: In developing countries, with the increasing use of computer systems, millions of computer workers are at high risk of neck and shoulder pain. The aim of this study was to assess the relationships between work-related physical and psychosocial factors and persistent neck and shoulder pain among computer office workers. Materials & Methods: This longitudinal study with 1-year follow-up was conducted among all eligible computer office workers (n=182) of Shahroud universities (northeastern Iran) in 2009-2010. The "Cultural and Psychosocial Influences on Disability (CUPID)" questionnaire was used to collect data on demographic characteristics, physical, organizational and psychosocial factors at work, and neck and shoulder symptoms. Chi-square tests and logistic regression analysis were used to analyze the data in SPSS version 16. Results: Computer office workers with a mean±SD age of 32.1±6.7 years and mean±SD weekly work hours of 47.4±8.2 participated in this study. At baseline, 39.6% of workers reported neck and shoulder pain. At one-year follow-up, 59.7% of them reported neck pain and 51.3% reported shoulder pain. Significant relationships were found between persistence of neck and shoulder pain and age, gender, and decision latitude at work. Conclusions: Although neck and shoulder pain were equally prevalent in the study group, after one year of follow-up, persistent neck pain was more common than shoulder pain. Age, gender, and decision latitude at work were identified as risk factors for both. References: 1. Buckle PW, Devereux JJ. The nature of work-related neck and upper limb musculoskeletal disorders. Appl Ergon 2002;33(3):207–17. 2. Tinubu BMS, Mbada CE, Oyeyemi AL, Fabunmi AA. Work-Related Musculoskeletal Disorders among
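    The abstract names chi-square tests among its methods. As an illustration only, the Pearson chi-square statistic for a 2x2 table can be computed by hand; the counts below are invented for the sketch and are not the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical split of 182 workers: baseline pain (yes/no) crossed with
# persistent pain at follow-up (yes/no). Numbers are invented.
stat = chi_square_2x2(50, 22, 35, 75)
print(stat > 3.84)  # 3.84 is roughly the df=1, alpha=0.05 critical value
```

    A statistic above the critical value would, in such an analysis, indicate a significant association between the two binary factors; the study pairs this with logistic regression to adjust for covariates such as age and gender.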

  7. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    OpenAIRE

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background Today, owing to developing communication technologies, computer games and other audio-visual media, as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents has raised public concern about their possible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems in male guidance-school students. Methods Th...

  8. From Parkinsonian thalamic activity to restoring thalamic relay using deep brain stimulation: new insights from computational modeling

    NARCIS (Netherlands)

    Meijer, Hil Gaétan Ellart; Krupa, M.; Cagnan, H.; Lourens, Marcel Antonius Johannes; Heida, Tjitske; Martens, H.C.F.; Bour, L.J.; van Gils, Stephanus A.

    2011-01-01

    We present a computational model of a thalamocortical relay neuron for exploring basal ganglia thalamocortical loop behavior in relation to Parkinson's disease and deep brain stimulation (DBS). Previous microelectrode, single-unit recording studies demonstrated that oscillatory interaction within

  9. Do ergonomics improvements increase computer workers' productivity?: an intervention study in a call centre.

    Science.gov (United States)

    Smith, Michael J; Bayehi, Antoinette Derjani

    2003-01-15

    This paper examines whether improving physical ergonomics working conditions affects worker productivity in a call centre with computer-intensive work. A field study was conducted at a catalogue retail service organization to explore the impact of ergonomics improvements on worker production. There were three levels of ergonomics interventions, each adding incrementally to the previous one. The first level was ergonomics training for all computer users accompanied by workstation ergonomics analysis leading to specific customized adjustments to better fit each worker (Group C). The second level added specific workstation accessories to improve the worker fit if the ergonomics analysis indicated a need for them (Group B). The third level met Group B requirements plus an improved chair (Group A). Productivity data was gathered from 72 volunteer participants who received ergonomics improvements to their workstations and 370 control subjects working in the same departments. Daily company records of production outputs for each worker were taken before ergonomics intervention (baseline) and 12 months after ergonomics intervention. Productivity improvement from baseline to 12 months post-intervention was examined across all ergonomics conditions combined, and also compared to the control group. The findings showed that worker performance increased for 50% of the ergonomics improvement participants and decreased for 50%. Overall, there was a 4.87% output increase for the ergonomics improvement group as compared to a 3.46% output decrease for the control group. The level of productivity increase varied by the type of the ergonomics improvements with Group C showing the best improvement (9.43%). Even though the average production improved, caution must be used in interpreting the findings since the ergonomics interventions were not successful for one-half of the participants.

  10. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  11. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity---the study of what can and can't be feasibly computed---has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrodinger equation. Finally, I'll discuss BosonSampling ---a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer---as well as the role of computational complexity in the black hole information puzzle.

  12. MMA-EoS: A Computational Framework for Mineralogical Thermodynamics

    Science.gov (United States)

    Chust, T. C.; Steinle-Neumann, G.; Dolejš, D.; Schuberth, B. S. A.; Bunge, H.-P.

    2017-12-01

    We present a newly developed software framework, MMA-EoS, that evaluates phase equilibria and thermodynamic properties of multicomponent systems by Gibbs energy minimization, with application to mantle petrology. The code is versatile in terms of the equation-of-state and mixing properties and allows for the computation of properties of single phases, solution phases, and multiphase aggregates. Currently, the open program distribution contains equation-of-state formulations widely used, that is, Caloric-Murnaghan, Caloric-Modified-Tait, and Birch-Murnaghan-Mie-Grüneisen-Debye models, with published databases included. Through its modular design and easily scripted database, MMA-EoS can readily be extended with new formulations of equations-of-state and changes or extensions to thermodynamic data sets. We demonstrate the application of the program by reproducing and comparing physical properties of mantle phases and assemblages with previously published work and experimental data, successively increasing complexity, up to computing phase equilibria of six-component compositions. Chemically complex systems allow us to trace the budget of minor chemical components in order to explore whether they lead to the formation of new phases or extend stability fields of existing ones. Self-consistently computed thermophysical properties for a homogeneous mantle and a mechanical mixture of slab lithologies show no discernible differences that require a heterogeneous mantle structure as has been suggested previously. Such examples illustrate how thermodynamics of mantle mineralogy can advance the study of Earth's interior.
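    To illustrate the kind of equation-of-state evaluation such a framework performs, here is the standard third-order Birch-Murnaghan isothermal pressure, the cold part of the Birch-Murnaghan-Mie-Grüneisen-Debye model named above. This is a textbook sketch, not MMA-EoS code, and the K0 = 161 GPa sample value is merely an MgO-like assumption:

```python
def birch_murnaghan_pressure(V, V0, K0, K0p):
    """Third-order Birch-Murnaghan equation of state: pressure (GPa) at
    volume V, given the zero-pressure volume V0, bulk modulus K0 (GPa),
    and its pressure derivative K0p. Isothermal (cold) term only; a full
    model adds thermal (Mie-Grueneisen-Debye) corrections."""
    x = (V0 / V) ** (1.0 / 3.0)
    return 1.5 * K0 * (x**7 - x**5) * (1.0 + 0.75 * (K0p - 4.0) * (x**2 - 1.0))

# At V = V0 the pressure vanishes; 10% compression gives a pressure of
# order tens of GPa for mantle-mineral-like moduli.
print(birch_murnaghan_pressure(1.0, 1.0, 161.0, 4.0))
print(birch_murnaghan_pressure(0.9, 1.0, 161.0, 4.0))
```

    Gibbs energy minimization frameworks evaluate many such phase-specific pressure-volume relations simultaneously and pick the assemblage with the lowest total Gibbs energy at a given pressure, temperature, and bulk composition.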

  13. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). • Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics • Emphasis on algorithmic advances that will allow re-application in other...

  14. A 3-D Computational Study of a Variable Camber Continuous Trailing Edge Flap (VCCTEF) Spanwise Segment

    Science.gov (United States)

    Kaul, Upender K.; Nguyen, Nhan T.

    2015-01-01

    Results of a computational study carried out to explore the effects of various elastomer configurations joining spanwise contiguous Variable Camber Continuous Trailing Edge Flap (VCCTEF) segments are reported here. This research is carried out as a proof-of-concept study that will seek to push the flight envelope in cruise with drag optimization as the objective. The cruise conditions can be well off design, such as caused by environmental conditions, maneuvering, etc. To handle these off-design conditions, flap deflection is used, so that when the flap is deflected in a given direction, the aircraft angle of attack changes accordingly to maintain a given lift. The angle of attack is also a design parameter along with the flap deflection. In a previous 2-D study [1], the effect of camber was investigated and the results revealed some insight into the relative merit of various camber settings of the VCCTEF. The present state of the art has not advanced sufficiently to do a full 3-D viscous analysis of the whole NASA Generic Transport Model (GTM) wing with the VCCTEF deployed with elastomers. Therefore, this study seeks to explore the local effects of three contiguous flap segments on lift and drag of a model devised here to determine possible trades among various flap deflections to achieve desired lift and drag results. Although this approach is an approximation, it provides new insights into the "local" effects of the relative deflections of the contiguous spanwise flap systems and various elastomer segment configurations. The present study is a natural extension of the 2-D study to assess these local 3-D effects. The design cruise condition at 36,000 feet at a free-stream Mach number of 0.797 and a mean aerodynamic chord (MAC) based Reynolds number of 30.734×10^6 is simulated for an angle of attack (AoA) range of 0 to 6 deg. In the previous 2-D study, the calculations revealed that the parabolic arc camber (1x2x3) and circular arc camber (VCCTEF222) offered the best L

  15. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    International Nuclear Information System (INIS)

    Kim, Jae Duck; Kim, Seung Kug

    1990-01-01

    The purpose of this study was to estimate the diagnostic availability of computer-aided diagnosis of common periapical lesions. The author used a domestic personal computer, adapted the applied program with RF (Rapid File), a program suited to the purpose of this study, and then input as basic data the findings obtained through collection, analysis and classification of the clinical and radiological features of common periapical lesions. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of patients diagnosed or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. The clinical and radiographic features of the 256 cases were then applied to the RF program for diagnosis, and the computer diagnosis was compared with the hidden final diagnosis from clinical and histopathological examination. The results were as follows: 1. For cysts, diagnosis through the computer program showed somewhat lower accuracy (80.22%) than that of the radiologists (90.1%). 2. For granulomas, diagnosis through the computer program showed somewhat higher accuracy (75.7%) than that of the radiologists (70.3%). 3. For periapical abscesses, the diagnostic accuracy was 88% in both cases. 4. Over all 256 cases, the average diagnostic accuracy of the computer program (81.2%) was somewhat lower than that of the radiologists (82.8%). 5. The basic data applied to computer-aided radiographic diagnosis of common periapical lesions were estimated to be usable.

  16. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Duck; Kim, Seung Kug [Dept. of Oral Radiology, College of Dentistry, Chosun University, Kwangju (Korea, Republic of)

    1990-08-15

    The purpose of this study was to evaluate the usefulness of computer-assisted radiographic diagnosis of common periapical lesions. The author used a domestic personal computer with the database program RF (Rapid File), adapted to the purpose of this study, and entered as baseline data the clinical and radiological features of common periapical lesions obtained through collection, analysis and classification. The 256 cases (91 cysts, 74 periapical granulomas, 91 periapical abscesses) were obtained from the chart records and radiographs of patients diagnosed with or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. The clinical and radiographic features of the 256 cases were then entered into the RF program for diagnosis, and the computer diagnosis was compared with the final diagnosis, withheld from the program, established by clinical and histopathological examination. The results were as follows: 1. For cysts, the computer program showed somewhat lower accuracy (80.22%) than the radiologists (90.1%). 2. For granulomas, the computer program showed somewhat higher accuracy (75.7%) than the radiologists (70.3%). 3. For periapical abscesses, both showed a diagnostic accuracy of 88%. 4. Over all 256 cases, the average accuracy of the computer program (81.2%) was slightly lower than that of the radiologists (82.8%). 5. The baseline data applied to computer-assisted radiographic diagnosis of common periapical lesions were judged to be useful.

  17. Towards the Selection of an Optimal Global Geopotential Model for the Computation of the Long-Wavelength Contribution: A Case Study of Ghana

    Directory of Open Access Journals (Sweden)

    Caleb Iddissah Yakubu

    2017-11-01

    Full Text Available The selection of a global geopotential model (GGM) for modeling the long-wavelength contribution in geoid computation is imperative, not only because of the plethora of GGMs available but, more importantly, because it influences the accuracy of a geoid model. In this study, we propose using the Gaussian averaging function for selecting an optimal GGM and degree and order (d/o) for the remove-compute-restore technique, as a replacement for the direct comparison of terrestrial gravity anomalies and GGM anomalies, because ground data and GGMs have different frequency content. Overall, EGM2008 performed better than all the tested GGMs, at an optimal d/o of 222. We verified the results by computing geoid models using Heck and Grüninger’s modification and validated them against GPS/trigonometric data. The results of the validation were consistent with those of the averaging process, with EGM2008 giving the smallest standard deviation of 0.457 m at d/o 222, resulting in an 8% improvement over the previous geoid model. In addition, this geoid model, the Ghanaian Gravimetric Geoid 2017 (GGG 2017), may be used to replace second-order class II leveling, with an expected error of 6.8 mm/km for baselines ranging from 20 to 225 km.
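
    The core idea above, comparing averaged rather than point values because terrestrial data and a truncated GGM carry different frequency content, can be illustrated with a toy 1D sketch. Everything below (the synthetic signals, the sample-domain Gaussian kernel) is an assumption for illustration only, not the spherical Gaussian averaging function used in the study.

```python
import math

# Toy 1D illustration: a "terrestrial" anomaly signal contains
# high-frequency content that a truncated "GGM" cannot represent.
# Gaussian-averaging the terrestrial signal before differencing
# makes the comparison fairer than a direct point-wise comparison.
N = 500
xs = [i / N for i in range(N)]
ggm = [math.sin(2 * math.pi * x) for x in xs]  # long wavelength only
terrestrial = [g + 0.3 * math.sin(2 * math.pi * 40 * x)
               for g, x in zip(ggm, xs)]       # adds short wavelength

def gaussian_average(signal, sigma_samples):
    """Circular Gaussian smoothing with a kernel truncated at 4 sigma."""
    half = int(4 * sigma_samples)
    weights = [math.exp(-0.5 * (k / sigma_samples) ** 2)
               for k in range(-half, half + 1)]
    wsum = sum(weights)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in zip(range(-half, half + 1), weights):
            acc += w * signal[(i + k) % len(signal)]
        out.append(acc / wsum)
    return out

def std_diff(a, b):
    d = [x - y for x, y in zip(a, b)]
    m = sum(d) / len(d)
    return math.sqrt(sum((x - m) ** 2 for x in d) / len(d))

raw = std_diff(terrestrial, ggm)                          # large spread
smoothed = std_diff(gaussian_average(terrestrial, 10), ggm)  # much smaller
print(raw, smoothed)
```

    Smoothing removes almost all of the short-wavelength disagreement, so the residual spread reflects the long-wavelength fit, which is the quantity one actually wants when ranking GGMs.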

  18. High-Throughput Computing on High-Performance Platforms: A Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Matteo, Turilli [Rutgers University; Angius, Alessio [Rutgers University; Oral, H Sarp [ORNL; De, K [University of Texas at Arlington; Klimentov, A [Brookhaven National Laboratory (BNL); Wells, Jack C. [ORNL; Jha, S [Rutgers University

    2017-10-01

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan -- a DOE leadership computing facility -- in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next-generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  19. A computational study of the supersonic coherent jet

    International Nuclear Information System (INIS)

    Jeong, Mi Seon; Kim, Heuy Dong

    2003-01-01

    In the steel-making process of the iron and steel industry, the purity and quality of steel depend on the amount of CO contained in the molten metal. Recently, the supersonic oxygen jet has been applied to the molten metal in the electric furnace to reduce the CO content through chemical reactions between the oxygen jet and the molten metal, leading to a better quality of steel. In this application, the supersonic oxygen jet is limited in the distance over which the supersonic velocity is maintained. In order to achieve longer supersonic jet propagation into the molten metal, the supersonic coherent jet has been suggested as one of the alternatives applicable to the electric furnace system. It has a flame around the conventional supersonic jet, which reduces the entrainment of the surrounding gas into the supersonic jet and thus lengthens its propagation. However, the gasdynamic mechanism by which the combustion surrounding the supersonic jet lengthens the jet core is not yet clarified. The present study investigates the major characteristics of the supersonic coherent jet in comparison with the conventional supersonic jet. A computational study is carried out by solving the compressible, axisymmetric Navier-Stokes equations, and the computational results for the supersonic coherent jet are compared with those for conventional supersonic jets.

  20. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  1. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    Science.gov (United States)

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs, used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and multi-physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application: we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. Looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking, and how these models may ultimately be applied to further increase our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  2. Studies of electron collisions with polyatomic molecules using distributed-memory parallel computers

    International Nuclear Information System (INIS)

    Winstead, C.; Hipes, P.G.; Lima, M.A.P.; McKoy, V.

    1991-01-01

    Elastic electron scattering cross sections from 5-30 eV are reported for the molecules C2H4, C2H6, C3H8, Si2H6, and GeH4, obtained using an implementation of the Schwinger multichannel method for distributed-memory parallel computer architectures. These results, obtained within the static-exchange approximation, are in generally good agreement with the available experimental data. These calculations demonstrate the potential of highly parallel computation in the study of collisions between low-energy electrons and polyatomic gases. The computational methodology discussed is also directly applicable to the calculation of elastic cross sections at higher levels of approximation (target polarization) and of electronic excitation cross sections.

  3. A Non-Linear Digital Computer Model Requiring Short Computation Time for Studies Concerning the Hydrodynamics of the BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, F; Vayssier, G

    1969-05-15

    This non-linear model serves as one of the blocks in a series of codes to study the transient behaviour of BWR or PWR type reactors. This program is intended to be the hydrodynamic part of the BWR core representation or the hydrodynamic part of the PWR heat exchanger secondary side representation. The equations have been prepared for the CSMP digital simulation language. By using the most suitable integration routine available, the ratio of simulation time to real time is about one on an IBM 360/75 digital computer. Use of the slightly different language DSL/40 on an IBM 7044 computer takes about four times longer. The code has been tested against the Eindhoven loop with satisfactory agreement.
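
    The block-diagram style of a CSMP-type simulation language can be mimicked in a few lines with an explicit integrator. The block below is a generic, hypothetical non-linear first-order lag, purely to illustrate the style; it is not the actual BWR hydrodynamic model described in the record.

```python
# Hypothetical non-linear block, dx/dt = (u - x**2) / tau,
# which settles at the steady state x = sqrt(u). Integrated with
# explicit Euler, playing the role of a CSMP "integration block".
def simulate(u=4.0, tau=0.5, dt=1e-3, t_end=5.0):
    x, t = 0.0, 0.0
    while t < t_end:
        dxdt = (u - x * x) / tau  # non-linear rate block
        x += dt * dxdt            # integrator block
        t += dt
    return x

print(simulate())  # settles near sqrt(4) = 2
```

    In a real simulation language the choice of integration routine dominates the simulation-time to real-time ratio, which is the trade-off the record discusses for CSMP versus DSL/40.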

  4. Gender and stereotypes in motivation to study computer programming for careers in multimedia

    Science.gov (United States)

    Doubé, Wendy; Lang, Catherine

    2012-03-01

    A multimedia university programme with relatively equal numbers of male and female students in elective programming subjects provided a rare opportunity to investigate female motivation to study and pursue computer programming in a career. The MSLQ was used to survey 85 participants. In common with research into deterrence of females from STEM domains, females displayed significantly lower self-efficacy and expectancy for success. In contrast to research into deterrence of females from STEM domains, both genders placed similar high values on computer programming and shared high extrinsic and intrinsic goal orientation. The authors propose that the stereotype associated with a creative multimedia career could attract female participation in computer programming whereas the stereotype associated with computer science could be a deterrent.

  5. Teaching programming to non-STEM novices: a didactical study of computational thinking and non-STEM computing education

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid

    research approach. Computational thinking plays a significant role in computing education but it is still unclear how it should be interpreted to best serve its purpose. Constructionism and Computational Making seem to be promising frameworks for doing this. In regard to specific teaching activities...

  6. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged school principals in North-West Province. Evans's theory of job satisfaction, morale and motivation was useful as a conceptual framework. A mixed-methods explanatory research design was important in discovering issues with ...

  7. Supplemental computational phantoms to estimate out-of-field absorbed dose in photon radiotherapy

    Science.gov (United States)

    Gallagher, Kyle J.; Tannous, Jaad; Nabha, Racile; Feghali, Joelle Ann; Ayoub, Zeina; Jalbout, Wassim; Youssef, Bassem; Taddei, Phillip J.

    2018-01-01

    The purpose of this study was to develop a straightforward method of supplementing patient anatomy and estimating out-of-field absorbed dose for a cohort of pediatric radiotherapy patients with limited recorded anatomy. A cohort of nine children, aged 2-14 years, who received 3D conformal radiotherapy for low-grade localized brain tumors (LBTs), was randomly selected for this study. The extent of these patients' computed tomography simulation image sets was cranial only. To approximate their missing anatomy, we supplemented the LBT patients' image sets with computed tomography images of patients from a previous study with larger extents, matched on sex, height, and mass, for whom contours of organs at risk for radiogenic cancer had already been delineated. Rigid fusion was performed between the LBT patients' data and that of the supplemental computational phantoms using commercial software and in-house codes. In-field dose was calculated with a clinically commissioned treatment planning system, and out-of-field dose was estimated with a previously developed analytical model that was re-fit with parameters based on new measurements for intracranial radiotherapy. Mean doses greater than 1 Gy were found in the red bone marrow, remainder, thyroid, and skin of the patients in this study. Mean organ doses between 150 mGy and 1 Gy were observed in the breast tissue of the girls and the lungs of all patients. Distant organs, i.e. prostate, bladder, uterus, and colon, received mean organ doses less than 150 mGy. The mean organ doses of the younger, smaller LBT patients (0-4 years old) were a factor of 2.4 greater than those of the older, larger patients (8-12 years old). Our findings demonstrated the feasibility of a straightforward method of applying supplemental computational phantoms and dose-calculation models to estimate absorbed dose for a set of children of various ages who received radiotherapy and for whom anatomies were largely missing in their original
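
    Analytical out-of-field models of the kind re-fit in this study are often expressed as a decaying function of distance from the field edge. The sketch below uses a hypothetical single-exponential form with made-up coefficients, purely to illustrate the shape of such a model; it is not the model or the fit used in the study.

```python
import math

# Hypothetical out-of-field model: absorbed dose falls off roughly
# exponentially with distance d_cm from the field edge, scaled by
# the prescribed dose. All coefficients are illustrative assumptions.
def out_of_field_dose(d_cm, prescribed_gy=54.0, a=0.03, mu=0.08):
    """Estimated out-of-field absorbed dose (Gy) at distance d_cm."""
    return prescribed_gy * a * math.exp(-mu * d_cm)

for d in (5, 20, 60):
    print(d, "cm:", out_of_field_dose(d), "Gy")
```

    With coefficients of this order, near-field organs receive on the order of a gray while distant organs receive well under 150 mGy, the same qualitative pattern the abstract reports.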

  8. Robotics as an integration subject in the computer science university studies. The experience of the University of Almeria

    Directory of Open Access Journals (Sweden)

    Manuela Berenguel Soria

    2012-11-01

    Full Text Available This work presents a global view of the role of robotics in computer science studies, mainly in university degrees. The main motivation for using robotics in these studies involves the following issues: robotics makes it possible to put many fundamental computer science topics into practice; it is a multidisciplinary area which completes the basic knowledge of any computer science student; it facilitates the practice and learning of the basic competences of any engineer (for instance, teamwork); and there is a wide market looking for people with robotics knowledge. These ideas are discussed from our own experience at the University of Almeria, acquired through the studies of Computer Science Technical Engineering, Computer Science Engineering, the Computer Science Degree and the Computer Science Postgraduate programme.

  9. Doxorubicin and ifosfamide combination chemotherapy in previously treated acute leukemia in adults: a Southwest Oncology Group pilot study.

    Science.gov (United States)

    Ryan, D H; Bickers, J N; Vial, R H; Hussein, K; Bottomley, R; Hewlett, J S; Wilson, H E; Stuckey, W J

    1980-01-01

    The Southwest Oncology Group did a limited institutional pilot study of the combination of doxorubicin and ifosfamide in the treatment of previously treated adult patients with acute leukemia. Thirty-four patients received one or two courses of the combination. All patients had received prior chemotherapy and 32 had received prior anthracycline chemotherapy. Three patients died before their responses could be fully evaluated. Fourteen patients achieved complete remission (41%) and one patient achieved partial remission. The complete remission rate was 27% for patients with acute myeloblastic leukemia (myelomonoblastic leukemia, monoblastic leukemia, and erythroleukemia) and 89% for patients with acute lymphocytic and undifferentiated leukemia (ALL). Toxic effects included severe hematologic reactions in 33 of 34 patients, hematuria in six patients, altered sensorium in one patient, and congestive heart failure in one patient. The safety of the combination was established and toxic side effects of this therapy were tolerable. The 89% complete remission rate for previously treated patients with ALL suggests that the combination of doxorubicin and ifosfamide may be particularly effective in ALL.

  10. Computational Studies of Snake Venom Toxins.

    Science.gov (United States)

    Ojeda, Paola G; Ramírez, David; Alzate-Morales, Jans; Caballero, Julio; Kaas, Quentin; González, Wendy

    2017-12-22

    Most snake venom toxins are proteins, and participate in envenomation through a diverse array of bioactivities, such as bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most snake venom toxins are still uncharacterized. Modern bioinformatics tools have recently been developed to mine snake venoms, helping focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets and the binding mode to these targets. This review gives an overview of current knowledge on the ~2200 sequences and more than 400 three-dimensional structures of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics have been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin.

  11. Computational Studies of Snake Venom Toxins

    Directory of Open Access Journals (Sweden)

    Paola G. Ojeda

    2017-12-01

    Full Text Available Most snake venom toxins are proteins, and participate in envenomation through a diverse array of bioactivities, such as bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most snake venom toxins are still uncharacterized. Modern bioinformatics tools have recently been developed to mine snake venoms, helping focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets and the binding mode to these targets. This review gives an overview of current knowledge on the ~2200 sequences and more than 400 three-dimensional structures of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics have been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin.

  12. Study of space-charge effect by computer

    International Nuclear Information System (INIS)

    Sasaki, T.

    1982-01-01

    The space-charge effect in high-density electron beams (beam current approx. 2 μA) focused by a uniform magnetic field is studied computationally. Under an approximation of an averaged space-charge force, a theory of the trajectory displacements of beam electrons is developed. The theory shows that the effect of the averaged space-charge force appears as a stretch of the focal length. The theory is confirmed not only qualitatively but also quantitatively by simulations. Empirical formulas for the trajectory displacement and the energy spread are presented. A comparison between the empirical formulas and some theoretical formulas is made, leading to a severe criticism of the theories of energy spreads.
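
    The focal-length stretch described above can be illustrated with a paraxial beam-envelope sketch: a linear magnetic focusing term plus a defocusing averaged space-charge term. The equation form (a KV-type envelope model) and all parameters are illustrative assumptions, not the paper's actual beam model.

```python
# Paraxial envelope sketch: R'' = -k0^2 * R + K / R, where K/R is a
# (defocusing) averaged space-charge term and k0 sets the magnetic
# focusing strength. With K > 0 the envelope stays larger everywhere
# before the focus, i.e. the effective focal length is stretched.
def envelope_radius(z_end, K, k0=1.0, dz=1e-4):
    """Explicit-Euler integration of the envelope radius up to z_end."""
    R, Rp, z = 1.0, 0.0, 0.0
    while z < z_end:
        Rpp = -k0 * k0 * R + (K / R if K else 0.0)
        R += dz * Rp
        Rp += dz * Rpp
        z += dz
    return R

no_charge = envelope_radius(1.0, K=0.0)    # ~cos(1), pure focusing
with_charge = envelope_radius(1.0, K=0.1)  # larger: space charge defocuses
print(no_charge, with_charge)
```

    Since the K/R term is strictly positive, the envelope with space charge is bounded below by the charge-free solution over the first focusing region, which is the qualitative focal-length stretch the theory predicts.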

  13. Case Studies of Liberal Arts Computer Science Programs

    Science.gov (United States)

    Baldwin, D.; Brady, A.; Danyluk, A.; Adams, J.; Lawrence, A.

    2010-01-01

    Many undergraduate liberal arts institutions offer computer science majors. This article illustrates how quality computer science programs can be realized in a wide variety of liberal arts settings by describing and contrasting the actual programs at five liberal arts colleges: Williams College, Kalamazoo College, the State University of New York…

  14. ATLANTIC DIP: simplifying the follow-up of women with previous gestational diabetes.

    LENUS (Irish Health Repository)

    Noctor, E

    2013-11-01

    Previous gestational diabetes (GDM) is associated with a significant lifetime risk of type 2 diabetes. In this study, we assessed the performance of HbA1c and fasting plasma glucose (FPG) measurements against that of 75 g oral glucose tolerance testing (OGTT) for the follow-up screening of women with previous GDM.

  15. REACHING THE COMPUTING HELP DESK

    CERN Multimedia

    Miguel MARQUINA; Roger WOOLNOUGH; IT/User Support

    1999-01-01

    The way to contact the Computing Help Desk (also known as 'UCO' and hosted by IT Division as an entry point for general computing issues) has been streamlined in order to facilitate access to it. A new telephone line and email address have been set up: phone number 78888, email Helpdesk@cern.ch -- hopefully easier to remember. Both entries have been operational since last December. The previous number and email address remain valid and have been turned into aliases of the above; however, we encourage you to use the new ones from now on. For additional information please see the article published in CERN Computing Newsletter 233 at http://consult.cern.ch/cnl/233/art_uco.html. Do not hesitate to contact us (by email to User.Relations@cern.ch) for additional information or feedback regarding this matter. Nicole Cremel, Miguel Marquina, Roger Woolnough, IT/User Support

  16. Misleading Performance Claims in Parallel Computations

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.

    2009-05-29

    In a previous humorous note entitled 'Twelve Ways to Fool the Masses,' I outlined twelve common ways in which performance figures for technical computer systems can be distorted. In this paper and accompanying conference talk, I give a reprise of these twelve 'methods' and give some actual examples that have appeared in peer-reviewed literature in years past. I then propose guidelines for reporting performance, the adoption of which would raise the level of professionalism and reduce the level of confusion, not only in the world of device simulation but also in the larger arena of technical computing.

  17. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  18. Morphometric Evaluation of Korean Femurs by Geometric Computation: Comparisons of the Sex and the Population

    Directory of Open Access Journals (Sweden)

    Ho-Jung Cho

    2015-01-01

    Full Text Available We measured 28 parameters of 202 femurs from Koreans with an automated geometric computation program, using 3D models generated from computed tomography images. The measurement parameters were selected with reference to physical and forensic anthropology studies as well as orthopedic implant design studies. All measurements were calculated from the 3D reconstructions on a computer using a scientific computation language. We also analyzed sex and population differences by comparison with data from previous studies. Most parameters were larger in males than in females. The height, head diameter, head center offset, and chord length of the diaphysis, most parameters in the distal femur, and the isthmic width of the medullary canal were smaller in Koreans than in other populations. However, the neck-shaft angle, subtense, and width of the intercondylar notch in the distal femur were larger than those in other populations. The results of this study will be useful as a reference for physical and forensic anthropology as well as the design of medical devices suitable for Koreans.
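
    One of the parameters listed, head diameter, is commonly obtained by least-squares sphere fitting to surface points of the femoral head. The self-contained sketch below uses synthetic points and hypothetical dimensions; it is not the study's actual measurement program.

```python
import math

def fit_sphere(points):
    """Least-squares sphere fit: returns (center, radius).
    Uses the linearized model 2*c.p + (r^2 - |c|^2) = |p|^2."""
    # Build normal equations AtA * x = Atb for x = (cx, cy, cz, k).
    AtA = [[0.0] * 4 for _ in range(4)]
    Atb = [0.0] * 4
    for (x, y, z) in points:
        row = (2 * x, 2 * y, 2 * z, 1.0)
        rhs = x * x + y * y + z * z
        for i in range(4):
            Atb[i] += row[i] * rhs
            for j in range(4):
                AtA[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting.
    M = [AtA[i] + [Atb[i]] for i in range(4)]
    for col in range(4):
        piv = max(range(col, 4), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 4):
            f = M[r][col] / M[col][col]
            for c in range(col, 5):
                M[r][c] -= f * M[col][c]
    sol = [0.0] * 4
    for i in range(3, -1, -1):
        s = M[i][4] - sum(M[i][j] * sol[j] for j in range(i + 1, 4))
        sol[i] = s / M[i][i]
    cx, cy, cz, k = sol
    radius = math.sqrt(k + cx * cx + cy * cy + cz * cz)
    return (cx, cy, cz), radius

# Synthetic femoral-head-like surface patch: points on a partial
# spherical cap of radius 23 mm centred at (10, 20, 30), which are
# hypothetical numbers for illustration only.
pts = []
for i in range(200):
    theta = 0.9 * i / 200  # partial cap, as on a real femoral head
    phi = 2.4 * i          # spread points around the cap
    pts.append((10 + 23 * math.sin(theta) * math.cos(phi),
                20 + 23 * math.sin(theta) * math.sin(phi),
                30 + 23 * math.cos(theta)))
center, r = fit_sphere(pts)
print(center, 2 * r)  # head diameter ≈ 46 mm
```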

  19. Low cost spacecraft computers: Oxymoron or future trend?

    Science.gov (United States)

    Manning, Robert M.

    1993-01-01

    Over the last few decades, application of current terrestrial computer technology in embedded spacecraft control systems has been expensive and fraught with many technical challenges. These challenges have centered on overcoming the extreme environmental constraints (protons, neutrons, gamma radiation, cosmic rays, temperature, vibration, etc.) that often preclude direct use of commercial off-the-shelf computer technology. Reliability, fault tolerance and power have also greatly constrained the selection of spacecraft control system computers. More recently, new constraints are being felt, cost and mass in particular, that have again narrowed the degrees of freedom spacecraft designers once enjoyed. This paper discusses these challenges, how they were previously overcome, how future trends in commercial computer technology will simplify (or hinder) selection of computer technology for spacecraft control applications, and what spacecraft electronic system designers can do now to circumvent them.

  20. Decreasing Transition Times in Elementary School Classrooms: Using Computer-Assisted Instruction to Automate Intervention Components

    Science.gov (United States)

    Hine, Jeffrey F.; Ardoin, Scott P.; Foster, Tori E.

    2015-01-01

    Research suggests that students spend a substantial amount of time transitioning between classroom activities, which may reduce time spent academically engaged. This study used an ABAB design to evaluate the effects of a computer-assisted intervention that automated intervention components previously shown to decrease transition times. We examined…

  1. A comparison of morbidity associated with placenta previa with and without previous caesarean sections

    International Nuclear Information System (INIS)

    Baqai, S.; Siraj, A.; Noor, N.

    2018-01-01

    To compare the morbidity associated with placenta previa with and without previous caesarean sections. Study Design: Retrospective comparative study. Place and Duration of Study: From March 2014 till March 2016 in the department of Obstetrics and Gynaecology at PNS Shifa hospital Karachi. Material and Methods: After approval from the hospital ethical committee, antenatal patients with a singleton pregnancy of gestational age >32 weeks, in the age group of 20-40 years, diagnosed to have placenta previa were included in the study. Patients with a twin pregnancy, or aged less than 20 years or more than 40 years, were excluded. The records of all patients fulfilling the inclusion criteria were reviewed. Data were collected on demographic and maternal variables, placenta previa, history of previous lower segment caesarean section (LSCS), complications associated with placenta previa, and techniques used to control blood loss. Results: During the study period, 6879 patients were delivered in PNS Shifa; of these, 2060 (29.9%) had a caesarean section, and of these, 47.3% had a previous history of LSCS. Thirty-three (1.6%) patients were diagnosed to have placenta previa, and the frequency of placenta previa was significantly higher in patients with a previous history of LSCS than in those with previous normal deliveries, i.e. 22 vs. 11 (p=0.023). It was observed that the frequencies of morbidly adherent placenta (MAP) and intensive care unit (ICU) stay were significantly higher in patients with a previous history of LSCS than in those with a previous history of normal delivery. Conclusion: The frequency of placenta previa was significantly higher in patients with a history of LSCS. Placenta previa also remains a major risk factor for various maternal complications. (author)

  2. Computed tomography study of Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Arai, H; Kobayashi, K; Ikeda, Y; Nagao, Y; Ogihara, R; Kosaka, K

    1983-01-01

    Computed tomography (CT) was used to study cerebral atrophy in 18 patients with clinically diagnosed Alzheimer's disease of presenile type and in 14 healthy age-matched subjects as controls. Using a computerized planimetric method, the Subarachnoid Space Volume Index and the Ventricle Volume Index were calculated as measures of cortical atrophy and ventricular dilatation, respectively. From the results the following conclusions were drawn: 1. The cerebral atrophy in Alzheimer patients could be attributable to the disease processes rather than to physiological aging of the brain. 2. The degree of atrophy increases in parallel with the progress of the clinical stage, and the cortical atrophy is already apparent at an early stage, whereas the ventricular dilatation becomes pronounced at later stages. 3. CT could be one of the most useful clinical tests available for the diagnosis of Alzheimer's disease.

  3. Experiments and computation of onshore breaking solitary waves

    DEFF Research Database (Denmark)

    Jensen, A.; Mayer, Stefan; Pedersen, G.K.

    2005-01-01

    This is a combined experimental and computational study of solitary waves that break on-shore. Velocities and accelerations are measured by a two-camera PIV technique and compared to theoretical values from an Euler model with a VOF method for the free surface. In particular, the dynamics of a so-called collapsing breaker is scrutinized and the closure between the breaker and the beach is found to be akin to slamming. To the knowledge of the authors, no velocity measurements for this kind of breaker have been previously reported.

  4. Massenmedium Computer: Ein Handbuch für Theorie und Praxis des Deutschunterrichts.

    OpenAIRE

    Kepser, Matthis

    2000-01-01

Part I gives a critical overview of previous research projects and teaching ideas concerning the computer in German lessons (as of 1999): electronic word processing, databases, programming, classical drill-and-practice software and hypertext learning environments, telecomputing, the computer as a subject and motif in literature, and the computer as an aid for the teacher. Part II expands these approaches with a perspective on the computer as a mass medium. For that there will be discussed differe...

  5. Sunburn and sun-protective behaviors among adults with and without previous nonmelanoma skin cancer: a population-based study

    Science.gov (United States)

    Fischer, Alexander H.; Wang, Timothy S.; Yenokyan, Gayane; Kang, Sewon; Chien, Anna L.

    2016-01-01

    Background Individuals with previous nonmelanoma skin cancer (NMSC) are at increased risk for subsequent skin cancer, and should therefore limit UV exposure. Objective To determine whether individuals with previous NMSC engage in better sun protection than those with no skin cancer history. Methods We pooled self-reported data (2005 and 2010 National Health Interview Surveys) from US non-Hispanic white adults (758 with and 34,161 without previous NMSC). We calculated adjusted prevalence odds ratios (aPOR) and 95% confidence intervals (95% CI), taking into account the complex survey design. Results Individuals with previous NMSC versus no history of NMSC had higher rates of frequent use of shade (44.3% versus 27.0%; aPOR=1.41; 1.16–1.71), long sleeves (20.5% versus 7.7%; aPOR=1.55; 1.21–1.98), a wide-brimmed hat (26.1% versus 10.5%; aPOR=1.52; 1.24–1.87), and sunscreen (53.7% versus 33.1%; aPOR=2.11; 95% CI=1.73–2.59), but did not have significantly lower odds of recent sunburn (29.7% versus 40.7%; aPOR=0.95; 0.77–1.17). Among subjects with previous NMSC, recent sunburn was inversely associated with age, sun avoidance, and shade but not sunscreen. Limitations Self-reported cross-sectional data and unavailable information quantifying regular sun exposure. Conclusion Physicians should emphasize sunburn prevention when counseling patients with previous NMSC, especially younger adults, focusing on shade and sun avoidance over sunscreen. PMID:27198078
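The adjusted prevalence odds ratios (aPOR) above come from survey-weighted logistic models, but the underlying quantity — an odds ratio with a Woolf (log-normal) 95% confidence interval from a 2×2 table — can be sketched in a few lines. The counts below are invented for illustration, not taken from the NHIS data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% CI for a 2x2 table.

    a: exposed cases,    b: exposed non-cases
    c: unexposed cases,  d: unexposed non-cases
    Uses the Woolf (log-normal) approximation for the interval.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: sunscreen users vs non-users, by NMSC history
print(odds_ratio_ci(407, 351, 11310, 22851))
```

A survey-weighted analysis like the one in the paper would additionally account for the complex sampling design; this sketch shows only the crude estimator.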

  6. Degeneration in dysplastic hips. A computer tomography study

    DEFF Research Database (Denmark)

    Jacobsen, Steffen; Rømer, Lone; Søballe, Kjeld

    2005-01-01

BACKGROUND: Hip dysplasia is considered pre-osteoarthritic, causing degeneration in young individuals. OBJECTIVE: To determine the pattern of degenerative change in moderate to severely dysplastic hips in young patients. DESIGN AND PATIENTS: One hundred and ninety-three consecutively referred younger patients with hip pain believed to be caused by hip dysplasia constituted the study cohort. The average age was 35.5 years (range, 15-61 years). They were examined by close-cut transverse pelvic and knee computed tomography (CT) and antero-posterior radiographs. We identified 197 hips...

  7. Previous experiences and emotional baggage as barriers to lifestyle change - a qualitative study of Norwegian Healthy Life Centre participants.

    Science.gov (United States)

    Følling, Ingrid S; Solbjør, Marit; Helvik, Anne-S

    2015-06-23

    Changing lifestyle is challenging and difficult. The Norwegian Directorate of Health recommends that all municipalities establish Healthy Life Centres targeted to people with lifestyle issues. Little is known about the background, experiences and reflections of participants. More information is needed about participants to shape effective lifestyle interventions with lasting effect. This study explores how participants in a lifestyle intervention programme describe previous life experiences in relation to changing lifestyle. Semi-structured qualitative in-depth interviews were performed with 23 participants (16 women and 7 men) aged 18 - 70 years. The data were analysed using systematic text condensation searching for issues describing participants' responses, and looking for the essence, aiming to share the basis of life-world experiences as valid knowledge. Participants identified two main themes: being stuck in old habits, and being burdened with emotional baggage from their previous negative experiences. Participants expressed a wish to change their lifestyles, but were unable to act in accordance with the health knowledge they possessed. Previous experiences with lifestyle change kept them from initiating attempts without professional assistance. Participants also described being burdened by an emotional baggage with problems from childhood and/or with family, work and social life issues. Respondents said that they felt that emotional baggage was an important explanation for why they were stuck in old habits and that conversely, being stuck in old habits added load to their already emotional baggage and made it heavier. Behavioural change can be hard to perform as psychological distress from life baggage can influence the ability to change. 
The study participants' experience of being stuck in old habits and having substantial emotional baggage raises questions as to whether or not Healthy Life Centres are able to help participants who need to make a lifestyle

  8. Previous Fractures at Multiple Sites Increase the Risk for Subsequent Fractures: The Global Longitudinal Study of Osteoporosis in Women

    Science.gov (United States)

    Gehlbach, Stephen; Saag, Kenneth G.; Adachi, Jonathan D.; Hooven, Fred H.; Flahive, Julie; Boonen, Steven; Chapurlat, Roland D.; Compston, Juliet E.; Cooper, Cyrus; Díez-Perez, Adolfo; Greenspan, Susan L.; LaCroix, Andrea Z.; Netelenbos, J. Coen; Pfeilschifter, Johannes; Rossini, Maurizio; Roux, Christian; Sambrook, Philip N.; Silverman, Stuart; Siris, Ethel S.; Watts, Nelson B.; Lindsay, Robert

    2016-01-01

Previous fractures of the hip, spine, or wrist are well-recognized predictors of future fracture, but the role of other fracture sites is less clear. We sought to assess the relationship between prior fracture at 10 skeletal locations and incident fracture. The Global Longitudinal Study of Osteoporosis in Women (GLOW) is an observational cohort study being conducted in 17 physician practices in 10 countries. Women ≥55 years answered questionnaires at baseline and at 1 and/or 2 years (fractures in the previous year). Of 60,393 women enrolled, follow-up data were available for 51,762. Of these, 17.6%, 4.0%, and 1.6% had suffered 1, 2, or ≥3 fractures since age 45. During the first 2 years of follow-up, 3149 women suffered 3683 incident fractures. Compared with women with no prior fractures, women with 1, 2, or ≥3 prior fractures were 1.8-, 3.0-, and 4.8-fold more likely to have any incident fracture; those with ≥3 prior fractures were 9.1-fold more likely to sustain a new vertebral fracture. Nine of 10 prior fracture locations were associated with an incident fracture. The strongest predictors of incident spine and hip fractures were prior spine fracture (hazard ratio 7.3) and prior hip fracture (hazard ratio 3.5), respectively. Prior rib fractures were associated with a 2.3-fold risk of subsequent vertebral fracture, previous upper leg fracture predicted a 2.2-fold increased risk of hip fracture, and women with a history of ankle fracture were at 1.8-fold risk of future fracture of a weight-bearing bone. Our findings suggest that a broad range of prior fracture sites are associated with an increased risk of incident fractures, with important implications for clinical assessments and risk model development. PMID:22113888

  9. A study on measurement of scattery ray of computed tomography

    International Nuclear Information System (INIS)

    Cho, Pyong Kon; Lee, Joon Hyup; Kim, Yoon Sik; Lee, Chang Yeop

    2003-01-01

Computed tomography (CT) equipment is essential for radiological diagnosis, and with continuing technical development the number of examinations performed with it is expected to increase. The authors therefore measured the rate of scattered-ray generation in front of the lead glass of the control room of CT rooms and outside the entrance door used by patients, and attempted to find ways of minimizing scattered-ray exposure. From November 2001, twenty-five CT units already installed and in operation at 13 general and university hospitals in Seoul were included in this study. The exposure conditions recommended by the manufacturers were used when measuring scattered rays. As objects, a DALI CT Radiation Dose Test Phantom for the head (φ 16 cm Plexiglas) and a phantom for the stomach (φ 32 cm Plexiglas) were used. For measurement of scattered rays, a reader (Radiation Monitor Controller Model 2026) and a G-M survey meter (Radcal Corporation model 20 x 5-1800, Electrometer/Ion Chamber, S/N 21740) were used. The spots for measuring scattered rays included the front of the lead glass for patients within the control room, where most of the radiographers' work is carried out; the outside of the entrance door used by patients and their guardians; and a spot 100 cm from the isocenter at the time of scanning the object. The working environment of the CT rooms differed considerably with the circumstances of each hospital, and the status of the scattered rays was as follows. 1) From the isocenter of the CT unit to the lead glass for patients within the control room the average distance was 377 cm. At that distance the scattered rays showed diverse

  10. A case study on support for students' thinking through computer-mediated communication.

    Science.gov (United States)

    Sannomiya, M; Kawaguchi, A

    2000-08-01

This is a case study of support for thinking through computer-mediated communication. Two graduate students were supervised in their research using computer-mediated communication, which was asynchronous and written; the supervisor was not present. The students' reports pointed out that there was more planning and editing and lower interactivity in this approach relative to face-to-face communication. These attributes were confirmed by their supervisor's report. The students also suggested that face-to-face communication was effective in supporting the production stage of thinking in research, while computer-mediated communication was effective in supporting the examination of thinking. For distance education to be successful, an appropriate combination of communication media must take students' thinking stages into account. Finally, transient and permanent effects should be distinguished in computer-mediated communication.

  11. Computational study of duct and pipe flows using the method of pseudocompressibility

    Science.gov (United States)

    Williams, Robert W.

    1991-01-01

    A viscous, three-dimensional, incompressible, Navier-Stokes Computational Fluid Dynamics code employing pseudocompressibility is used for the prediction of laminar primary and secondary flows in two 90-degree bends of constant cross section. Under study are a square cross section duct bend with 2.3 radius ratio and a round cross section pipe bend with 2.8 radius ratio. Sensitivity of predicted primary and secondary flow to inlet boundary conditions, grid resolution, and code convergence is investigated. Contour and velocity versus spanwise coordinate plots comparing prediction to experimental data flow components are shown at several streamwise stations before, within, and after the duct and pipe bends. Discussion includes secondary flow physics, computational method, computational requirements, grid dependence, and convergence rates.
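The pseudocompressibility method named above couples pressure to the velocity divergence through a pseudo-time term, so the incompressible equations can be marched with compressible-flow techniques. Schematically (this is the standard formulation, not transcribed from the code in the record):

```latex
\frac{\partial p}{\partial \tau} + \beta\,\nabla\cdot\mathbf{u} = 0,
\qquad
\frac{\partial \mathbf{u}}{\partial \tau}
  + \left(\mathbf{u}\cdot\nabla\right)\mathbf{u}
  = -\nabla p + \nu\,\nabla^{2}\mathbf{u}
```

Here β is the artificial compressibility parameter; iterating in pseudo-time τ until ∇·u vanishes recovers a steady solution of the incompressible Navier-Stokes equations.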

  12. Two-Cloud-Servers-Assisted Secure Outsourcing Multiparty Computation

    Science.gov (United States)

    Wen, Qiaoyan; Zhang, Hua; Jin, Zhengping; Li, Wenmin

    2014-01-01

We focus on how to securely outsource computation tasks to the cloud and propose a secure outsourcing multiparty computation protocol on lattice-based encrypted data in a two-cloud-server scenario. Our main idea is to transform the outsourced data, encrypted under different users' public keys, into data encrypted under the same two private keys of the two assisting servers, so that it is feasible to operate on the transformed ciphertexts and compute an encrypted result following the function to be computed. To keep the result private, the two servers cooperatively produce a custom-made result for each user authorized to obtain it, so that all authorized users can recover the desired result while unauthorized parties, including the two servers, cannot. Compared with previous research, our protocol is completely noninteractive between any users, and both the computation and the communication complexity of each user in our solution are independent of the computed function. PMID:24982949

  13. Two-cloud-servers-assisted secure outsourcing multiparty computation.

    Science.gov (United States)

    Sun, Yi; Wen, Qiaoyan; Zhang, Yudong; Zhang, Hua; Jin, Zhengping; Li, Wenmin

    2014-01-01

We focus on how to securely outsource computation tasks to the cloud and propose a secure outsourcing multiparty computation protocol on lattice-based encrypted data in a two-cloud-server scenario. Our main idea is to transform the outsourced data, encrypted under different users' public keys, into data encrypted under the same two private keys of the two assisting servers, so that it is feasible to operate on the transformed ciphertexts and compute an encrypted result following the function to be computed. To keep the result private, the two servers cooperatively produce a custom-made result for each user authorized to obtain it, so that all authorized users can recover the desired result while unauthorized parties, including the two servers, cannot. Compared with previous research, our protocol is completely noninteractive between any users, and both the computation and the communication complexity of each user in our solution are independent of the computed function.
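The lattice-based transformation in this protocol is involved, but the underlying two-server idea — each user splits its input so that neither server alone learns anything, yet the servers can jointly compute on the pieces — can be illustrated with a much simpler additive secret-sharing toy. This is a stand-in technique for illustration, not the paper's construction:

```python
import random

P = 2**61 - 1  # a public prime modulus

def share(x):
    """Split x into two additive shares mod P; each share alone is uniformly random."""
    r = random.randrange(P)
    return r, (x - r) % P          # share for server 1, share for server 2

def reconstruct(s1, s2):
    return (s1 + s2) % P

# Two users outsource inputs; the servers add shares component-wise,
# so the sum is computed without either server seeing x or y.
x, y = 1234, 5678
x1, x2 = share(x)
y1, y2 = share(y)
server1_sum = (x1 + y1) % P        # computed by server 1 alone
server2_sum = (x2 + y2) % P        # computed by server 2 alone
print(reconstruct(server1_sum, server2_sum))   # prints 6912
```

Unlike this toy, the paper's scheme supports general functions on encrypted data and enforces that only authorized users can recover the result.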

  14. Modeling an Excitable Biosynthetic Tissue with Inherent Variability for Paired Computational-Experimental Studies.

    Directory of Open Access Journals (Sweden)

    Tanmay A Gokhale

    2017-01-01

Full Text Available To understand how excitable tissues give rise to arrhythmias, it is crucial to understand the electrical dynamics of cells in the context of their environment. Multicellular monolayer cultures have proven useful for investigating arrhythmias and other conduction anomalies, and because of their relatively simple structure, these constructs lend themselves to paired computational studies that often help elucidate mechanisms of the observed behavior. However, tissue cultures of cardiomyocyte monolayers currently require the use of neonatal cells with ionic properties that change rapidly during development and have thus been poorly characterized and modeled to date. Recently, Kirkton and Bursac demonstrated the ability to create biosynthetic excitable tissues from genetically engineered and immortalized HEK293 cells with well-characterized electrical properties and the ability to propagate action potentials. In this study, we developed and validated a computational model of these excitable HEK293 cells (called "Ex293" cells) using existing electrophysiological data and a genetic search algorithm. In order to reproduce not only the mean but also the variability of experimental observations, we examined what sources of variation were required in the computational model. Random cell-to-cell and inter-monolayer variation in both ionic conductances and tissue conductivity was necessary to explain the experimentally observed variability in action potential shape and macroscopic conduction, and the spatial organization of cell-to-cell conductance variation was found not to impact macroscopic behavior; the resulting model accurately reproduces both normal and drug-modified conduction behavior. The development of a computational Ex293 cell and tissue model provides a novel framework to perform paired computational-experimental studies to study normal and abnormal conduction in multidimensional excitable tissue, and the methodology of modeling
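The genetic search mentioned above tunes model parameters until simulated output matches recordings. A toy sketch of the idea — fitting a single "conductance" so a fake model output matches a target measurement — is below; the model, numbers, and fitness function are invented for illustration, whereas the paper fits many ionic parameters against real electrophysiology data:

```python
import random

def toy_apd(g):
    """Fake action-potential duration (ms) as a function of a conductance."""
    return 300.0 / (1.0 + g)

def genetic_search(target, generations=60, pop_size=30, seed=1):
    """Minimal genetic algorithm: truncation selection + Gaussian mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: abs(toy_apd(g) - target))      # rank by error
        parents = pop[:pop_size // 2]                         # keep best half
        children = [max(0.0, rng.choice(parents) + rng.gauss(0.0, 0.2))
                    for _ in range(pop_size - len(parents))]  # mutate parents
        pop = parents + children
    pop.sort(key=lambda g: abs(toy_apd(g) - target))
    return pop[0]

g_best = genetic_search(target=150.0)
print(g_best)   # converges near 1.0, where toy_apd(g) = 150 ms
```

Because the best individuals are always retained (elitism), the fit improves monotonically across generations.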

  15. Effects of acupuncture and computer-assisted cognitive training for post-stroke attention deficits: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Huang, Jia; McCaskey, Michael A; Yang, Shanli; Ye, Haicheng; Tao, Jing; Jiang, Cai; Schuster-Amft, Corina; Balzer, Christian; Ettlin, Thierry; Schupp, Wilfried; Kulke, Hartwig; Chen, Lidian

    2015-12-02

    A majority of stroke survivors present with cognitive impairments. Attention disturbance, which leads to impaired concentration and overall reduced cognitive functions, is strongly associated with stroke. The clinical efficacy of acupuncture with Baihui (GV20) and Shenting (GV24) as well as computer-assisted cognitive training in stroke and post-stroke cognitive impairment have both been demonstrated in previous studies. To date, no systematic comparison of these exists and the potential beneficial effects of a combined application are yet to be examined. The main objective of this pilot study is to evaluate the effects of computer-assisted cognitive training compared to acupuncture on the outcomes of attention assessments. The second objective is to test the effects of a combined cognitive intervention that incorporates computer-assisted cognitive training and acupuncture (ACoTrain). An international multicentre, single-blinded, randomised controlled pilot trial will be conducted. In a 1:1:1 ratio, 60 inpatients with post-stroke cognitive dysfunction will be randomly allocated into either the acupuncture group, the computer-assisted cognitive training group, or the ACoTrain group in addition to their individual rehabilitation programme. The intervention period of this pilot trial will last 4 weeks (30 minutes per day, 5 days per week, Monday to Friday). The primary outcome is the test battery for attentional performance. The secondary outcomes include the Trail Making Test, Test des Deux Barrages, National Institute of Health Stroke Scale, and Modified Barthel Index for assessment of daily life competence, and the EuroQol Questionnaire for health-related quality of life. This trial mainly focuses on evaluating the effects of computer-assisted cognitive training compared to acupuncture on the outcomes of attention assessments. The results of this pilot trial are expected to provide new insights on how Eastern and Western medicine can complement one another and

  16. Central venous stenosis among hemodialysis patients is often not associated with previous central venous catheters.

    Science.gov (United States)

    Kotoda, Atsushi; Akimoto, Tetsu; Kato, Maki; Kanazawa, Hidenori; Nakata, Manabu; Sugase, Taro; Ogura, Manabu; Ito, Chiharu; Sugimoto, Hideharu; Muto, Shigeaki; Kusano, Eiji

    2011-01-01

It is widely assumed that central venous stenosis (CVS) is most commonly associated with previous central venous catheterization among chronic hemodialysis (HD) patients. We evaluated the validity of this assumption in this retrospective study. The clinical records of 2,856 consecutive HD patients with vascular access failure during a 5-year period were reviewed, and a total of 26 patients with symptomatic CVS were identified. Combined with radiological findings, their clinical characteristics were examined. Only seven patients had a history of internal jugular dialysis catheterization. Diagnostic multidetector row computed tomography angiography showed that 7 of the 19 patients with no history of catheterization had left innominate vein stenosis due to extrinsic compression between the sternum and arch vessels. These patients had a shorter period from the creation of the vascular access to the initial referral (9.2 ± 7.6 months) than the rest of the patients (35.5 ± 18.6 months, p = 0.0017). Our findings suggest that cases without a history of central venous catheterization may not be rare among HD patients with symptomatic CVS. However, this still needs to be confirmed by larger prospective studies of chronic HD patients with symptomatic CVS.

  17. APPLICATIONS OF CLOUD COMPUTING SERVICES IN EDUCATION – CASE STUDY

    Directory of Open Access Journals (Sweden)

    Tomasz Cieplak

    2014-11-01

Full Text Available Applications of Cloud Computing in enterprises are very wide-ranging. By contrast, educational applications of Cloud Computing in Poland are somewhat limited. On the other hand, young people use Cloud Computing services frequently; utilization of Facebook, Google and other services by young people in Poland is almost the same as in Western Europe or the USA. Taking these considerations into account, a few years ago the authors began popularizing and using educational Cloud Computing services in their professional work. This article briefly summarizes the authors' experience with selected, most popular Cloud Computing services.

  18. From Three-Photon Greenberger-Horne-Zeilinger States to Ballistic Universal Quantum Computation.

    Science.gov (United States)

    Gimeno-Segovia, Mercedes; Shadbolt, Pete; Browne, Dan E; Rudolph, Terry

    2015-07-10

Single photons, manipulated using integrated linear optics, constitute a promising platform for universal quantum computation. A series of increasingly efficient proposals have shown linear-optical quantum computing to be formally scalable. However, existing schemes typically require extensive adaptive switching, which is experimentally challenging and noisy, thousands of photon sources per renormalized qubit, and/or large quantum memories for repeat-until-success strategies. Our work overcomes all these problems. We present a scheme to construct a cluster state universal for quantum computation, which uses no adaptive switching, no large memories, and which is at least an order of magnitude more resource-efficient than previous passive schemes. Unlike previous proposals, it is constructed entirely from loss-detecting gates and offers robustness to photon loss. Even without the use of an active loss-tolerant encoding, our scheme naturally tolerates a total loss rate of ∼1.6% in the photons detected in the gates. The scheme uses only 3-photon Greenberger-Horne-Zeilinger states as a resource, together with a passive linear-optical network. We fully describe and model the iterative process of cluster generation, including photon loss and gate failure. This demonstrates that building a linear-optical quantum computer may be less challenging than previously thought.
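The resource state consumed by the scheme is the 3-photon GHZ state (|000⟩+|111⟩)/√2. As a platform-agnostic illustration (a generic gate circuit, not the paper's linear-optical implementation), the state can be built from one Hadamard and two CNOTs in a tiny statevector simulator:

```python
import numpy as np

def apply(gate, state, targets, n):
    """Apply a (2^k x 2^k) gate to the given qubit targets of an n-qubit state."""
    state = state.reshape([2] * n)
    k = len(targets)
    perm = targets + [q for q in range(n) if q not in targets]
    state = np.transpose(state, perm).reshape(2**k, -1)   # targets to the front
    state = gate @ state                                  # act on target qubits
    inv = np.argsort(perm)                                # undo the permutation
    return np.transpose(state.reshape([2] * n), inv).reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])

state = np.zeros(8); state[0] = 1.0          # |000>
state = apply(H, state, [0], 3)              # H on qubit 0
state = apply(CNOT, state, [0, 1], 3)        # CNOT 0 -> 1
state = apply(CNOT, state, [1, 2], 3)        # CNOT 1 -> 2
print(state)                                 # 1/sqrt(2) on |000> and |111>
```

In the paper these states are instead produced (probabilistically) by dedicated linear-optical GHZ generators and then fused into a cluster state.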

  19. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  1. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Directory of Open Access Journals (Sweden)

    Carolina Milesi

    2017-04-01

Full Text Available While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.
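The logistic-regression analysis referred to in the abstract models persistence as a function of engagement. A minimal from-scratch version on synthetic data (invented here for illustration; the study used real survey and transcript data) looks like:

```python
import math, random

# Synthetic data: probability of persisting in CS as a logistic
# function of an engagement score in [0, 1].
random.seed(0)
data = []
for _ in range(400):
    x = random.random()                                   # engagement score
    p_true = 1.0 / (1.0 + math.exp(-(4.0 * x - 2.0)))     # generating model
    data.append((x, 1 if random.random() < p_true else 0))

# Fit w, b by gradient descent on the average log-loss.
w, b, lr = 0.0, 0.0, 2.0
for _ in range(5000):
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        gw += (p - y) * x                                 # d(log-loss)/dw
        gb += (p - y)                                     # d(log-loss)/db
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

print(w, b)   # estimates of the generating coefficients (4, -2)
```

A positive fitted w corresponds to the paper's finding: higher engagement is associated with higher odds of taking further CS courses.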

  2. Body dynamics and hydrodynamics of swimming larvae: a computational study

    NARCIS (Netherlands)

    Li, G.; Müller, U.K.; Leeuwen, van J.L.; Liu, H.

    2012-01-01

    To understand the mechanics of fish swimming, we need to know the forces exerted by the fluid and how these forces affect the motion of the fish. To this end, we developed a 3-D computational approach that integrates hydrodynamics and body dynamics. This study quantifies the flow around a swimming

  3. A comparative study of attenuation correction algorithms in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Murase, Kenya; Itoh, Hisao; Mogami, Hiroshi; Ishine, Masashiro; Kawamura, Masashi; Iio, Atsushi; Hamamoto, Ken

    1987-01-01

A computer-based simulation method was developed to assess the relative effectiveness and availability of various attenuation compensation algorithms in single photon emission computed tomography (SPECT). The effects of nonuniformity of the attenuation coefficient distribution in the body, of errors in determining the body contour, and of statistical noise on reconstruction accuracy, as well as the computation time of the algorithms, were studied. The algorithms were classified into three groups: precorrection, postcorrection and iterative correction methods. Furthermore, a hybrid method was devised by combining several methods. This study will be useful for understanding the characteristics, limitations and strengths of the algorithms and for finding a practical correction method for photon attenuation in SPECT. (orig.)
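Among the postcorrection algorithms typically compared in such studies, Chang's first-order method is the classic example: each reconstructed pixel is multiplied by the inverse of its attenuation factor averaged over all projection angles. A sketch for a uniform circular attenuator follows; the geometry and the value of mu are illustrative, not taken from the record:

```python
import math

def path_to_boundary(x, y, theta, R):
    """Distance from interior point (x, y) to the circle of radius R
    along direction theta (the ray toward the detector)."""
    c, s = math.cos(theta), math.sin(theta)
    proj = x * c + y * s
    return -proj + math.sqrt(proj * proj + R * R - x * x - y * y)

def chang_factor(x, y, mu, R, n_angles=360):
    """First-order Chang correction: inverse of the attenuation factor
    exp(-mu * path) averaged over the projection angles."""
    mean_att = sum(
        math.exp(-mu * path_to_boundary(x, y, 2.0 * math.pi * k / n_angles, R))
        for k in range(n_angles)) / n_angles
    return 1.0 / mean_att

# At the center every path has length R, so the factor is exactly exp(mu*R):
print(chang_factor(0.0, 0.0, mu=0.15, R=10.0))   # exp(1.5) ≈ 4.4817
```

For nonuniform attenuators the path integral of mu replaces mu times the path length, which is where the simulated nonuniformity effects studied above come in.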

  4. Self-study manual for introduction to computational fluid dynamics

    OpenAIRE

    Nabatov, Andrey

    2017-01-01

Computational Fluid Dynamics (CFD) is the branch of Fluid Mechanics and Computational Physics that plays a substantial role in the modern Mechanical Engineering design process, owing to such advantages as the relatively low cost of simulation compared with conducting a real experiment, the opportunity to easily correct the design of a prototype prior to manufacturing the final product, and a wide range of applications: mixing, acoustics, cooling and aerodynamics. This makes CFD particularly and Computation...

  5. Calorimetric and computational studies for three nitroimidazole isomers

    International Nuclear Information System (INIS)

    Carvalho, Tânia M.T.; Amaral, Luísa M.P.F.; Morais, Victor M.F.; Ribeiro da Silva, Maria D.M.C.

    2017-01-01

Highlights: • The energy of combustion of 4-nitroimidazole was measured by static bomb calorimetry. • The enthalpy of sublimation of 4-nitroimidazole was determined by Calvet microcalorimetry. • The gas-phase enthalpy of formation of 4-nitroimidazole was derived from the experimental measurements. • Gas-phase enthalpies of formation of the nitroimidazole isomers were estimated from G3 calculations. - Abstract: In the present work, a combined experimental and computational thermochemical study of nitroimidazole isomers was carried out. The standard (p° = 0.1 MPa) molar enthalpy of combustion, in the crystalline phase, of 4-nitroimidazole was determined, at the temperature of 298.15 K, using a static bomb combustion calorimeter. Calvet microcalorimetry experiments were performed to measure its standard molar enthalpy of sublimation. The standard molar enthalpy of formation of 4-nitroimidazole in the gaseous phase, at T = 298.15 K, (116.9 ± 2.9) kJ·mol⁻¹, has been derived from the corresponding standard molar enthalpy of formation in the crystalline phase and the standard molar enthalpy of sublimation. Computational studies of 4-nitroimidazole were performed to complement the experimental work; these were also extended to the 2- and 5-nitroimidazole isomers. The gas-phase enthalpies of formation were estimated from high-level ab initio molecular orbital calculations at the G3 level. The tautomeric equilibrium of 4(5)-nitroimidazole in the gaseous phase was also investigated, and it was concluded that the two tautomers are equally stable.
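The derivation described in the abstract is a one-line thermodynamic cycle: the gas-phase standard molar enthalpy of formation is the crystalline-phase value plus the sublimation enthalpy, with the quoted uncertainty typically obtained by combining the two experimental uncertainties in quadrature:

```latex
\Delta_{\mathrm{f}}H^{\circ}_{\mathrm{m}}(\mathrm{g})
  = \Delta_{\mathrm{f}}H^{\circ}_{\mathrm{m}}(\mathrm{cr})
  + \Delta_{\mathrm{cr}}^{\mathrm{g}}H^{\circ}_{\mathrm{m}},
\qquad
u_{\mathrm{g}} = \sqrt{u_{\mathrm{cr}}^{2} + u_{\mathrm{sub}}^{2}}
```

For 4-nitroimidazole this yields the reported (116.9 ± 2.9) kJ·mol⁻¹ at T = 298.15 K.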

  6. Previous dropout from diabetic care as a predictor of patients' willingness to use mobile applications for self-management: A cross-sectional study.

    Science.gov (United States)

    Yamaguchi, Satoko; Waki, Kayo; Tomizawa, Nobuko; Waki, Hironori; Nannya, Yasuhito; Nangaku, Masaomi; Kadowaki, Takashi; Ohe, Kazuhiko

    2017-07-01

    Preventing dropout is crucial in managing diabetes. Accordingly, we investigated whether patients who had dropped out of diabetic care are suitable candidates for the use of mobile technologies - such as smartphone applications - to support self-management (mHealth), which might help prevent dropout. We carried out a cross-sectional study in Tokyo, Japan. Patients aged 20 years or older who were clinically diagnosed as diabetic and who regularly visited the outpatient unit at the University of Tokyo Hospital were recruited between August 2014 and March 2015. Data were collected through face-to-face structured interviews, physical measurements and medical records. Participants were asked whether they were willing to use mHealth after being shown DialBetics - an mHealth application for diabetics - as an example, and about their history of dropout and previous mHealth experience. Data were analyzed by multivariate logistic regression models. Of 307 patients with type 1 and type 2 diabetes, 34 (11.1%) had previously dropped out from diabetic care. Multivariate analysis identified previous mHealth experience as a negative predictor of dropout (odds ratio 0.211, P = 0.023). Of those 34 patients, 27 (79.4%) expressed willingness to use mHealth, a significantly higher percentage than for those who had never dropped out (51.5%, P = 0.002). After adjusting for confounders, history of dropout remained a strong predictor of willingness (odds ratio 3.870, P = 0.004). Patients who previously dropped out of diabetic care are suitable candidates for mHealth. Future studies must evaluate whether mHealth is effective for preventing repeated dropout and improving glycemic control among this population. © 2016 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.
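
    The odds ratio of 3.870 reported above is adjusted via a multivariate logistic regression model, which cannot be reproduced from the abstract alone. A crude (unadjusted) odds ratio, however, can be sketched directly from the two reported willingness proportions; note that it differs from the adjusted value because it ignores confounders:

```python
def odds_ratio(p1, p2):
    """Crude odds ratio comparing two proportions."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Willingness to use mHealth: 79.4% among prior dropouts vs 51.5% among others.
print(round(odds_ratio(0.794, 0.515), 2))  # → 3.63 (crude; the paper's 3.870 is adjusted)
```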

  7. Cavity-assisted quantum computing in a silicon nanostructure

    International Nuclear Information System (INIS)

    Tang Bao; Qin Hao; Zhang Rong; Xue Peng; Liu Jin-Ming

    2014-01-01

    We present a scheme of quantum computing with charge qubits corresponding to one excess electron shared between dangling-bond pairs of surface silicon atoms that couple to a microwave stripline resonator on a chip. By choosing a certain evolution time, we propose the realization of a set of universal single- and two-qubit logical gates. Due to its intrinsic stability and scalability, the silicon dangling-bond charge qubit can be regarded as one of the most promising candidates for quantum computation. Compared to previous schemes for quantum computing with silicon bulk systems, our scheme offers advantages such as a long coherence time and direct control and readout. (general)

  8. Application of computational methods in genetic study of inflammatory bowel disease.

    Science.gov (United States)

    Li, Jin; Wei, Zhi; Hakonarson, Hakon

    2016-01-21

    Genetic factors play an important role in the etiology of inflammatory bowel disease (IBD). The launch of the genome-wide association study (GWAS) represents a landmark in the genetic study of human complex disease. Concurrently, computational methods have undergone rapid development over the past few years, leading to the identification of numerous disease susceptibility loci. IBD is one of the successful examples of GWAS and related analyses. A total of 163 genetic loci and multiple signaling pathways have been identified as associated with IBD. Pleiotropic effects were found for many of these loci, and risk prediction models were built based on a broad spectrum of genetic variants. Important gene-gene and gene-environment interactions and key contributions of the gut microbiome are being discovered. Here we review the different types of analyses that have been applied to the genetic study of IBD, discuss the computational methods for each type of analysis, and summarize the discoveries made in IBD research with the application of these methods.

  9. A research program in empirical computer science

    Science.gov (United States)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  10. Solvent-driven symmetry of self-assembled nanocrystal superlattices-A computational study

    KAUST Repository

    Kaushik, Ananth P.; Clancy, Paulette

    2012-01-01

    used solvents, toluene and hexane. System sizes in the 400,000-500,000-atom scale followed for nanoseconds are required for this computationally intensive study. The key questions addressed here concern the thermodynamic stability of the superlattice

  11. Thermochemistry of 6-propyl-2-thiouracil: An experimental and computational study

    Energy Technology Data Exchange (ETDEWEB)

    Szterner, Piotr; Galvão, Tiago L.P.; Amaral, Luísa M.P.F.; Ribeiro da Silva, Maria D.M.C., E-mail: mdsilva@fc.up.pt; Ribeiro da Silva, Manuel A.V.

    2014-07-01

    Highlights: • Thermochemistry of 6-propyl-2-thiouracil – experimental and computational study. • Vapor pressure study of 6-propyl-2-thiouracil by the Knudsen effusion technique. • Enthalpies of formation of 6-propyl-2-thiouracil by rotating bomb combustion calorimetry. • Accurate computational calculations (G3 and G4 composite methods) were performed. - Abstract: The standard (p° = 0.1 MPa) molar enthalpy of formation of 6-propyl-2-thiouracil was derived from its standard molar energy of combustion, in oxygen, to yield CO2 (g), N2 (g) and H2SO4·115H2O (l), at T = 298.15 K, measured by rotating bomb combustion calorimetry. The vapor pressures as a function of temperature were measured by the Knudsen effusion technique and the standard molar enthalpy of sublimation, Δ_cr^g H_m°, at T = 298.15 K, was derived via the Clausius–Clapeyron equation. These two thermodynamic parameters yielded the standard molar enthalpy of formation in the gaseous phase at T = 298.15 K: −(142.5 ± 1.9) kJ·mol−1. This value was compared with estimates obtained from very accurate computational calculations using the G3 and G4 composite methods.
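
    The Clausius–Clapeyron step described above amounts to a linear fit: ln p plotted against 1/T has slope −ΔsubH/R. A self-contained sketch using synthetic vapor-pressure data generated for a hypothetical sublimation enthalpy (not the compound's actual measurements):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def sublimation_enthalpy(T, p):
    """Clausius-Clapeyron: least-squares slope of ln p vs 1/T is -dH/R."""
    x = [1.0 / t for t in T]
    y = [math.log(pi) for pi in p]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
            sum((xi - xbar) ** 2 for xi in x)
    return -slope * R  # J/mol

# Synthetic vapor pressures for a hypothetical dH_sub of 120 kJ/mol:
dH = 120e3
T = [350.0, 360.0, 370.0, 380.0]
p = [math.exp(30.0 - dH / (R * t)) for t in T]
print(round(sublimation_enthalpy(T, p) / 1e3, 1))  # → 120.0
```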

  12. Predicting the Pullout Capacity of Small Ground Anchors Using Nonlinear Integrated Computing Techniques

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available This study investigates predicting the pullout capacity of small ground anchors using nonlinear computing techniques. Input-output prediction models based on the nonlinear Hammerstein-Wiener (NHW) approach and on delay inputs for the adaptive neurofuzzy inference system (DANFIS) are developed and utilized to predict the pullout capacity. The results of the developed models are compared with previous studies that used artificial neural network and least squares support vector machine techniques for the same case study. In situ data and statistical performance measures are used to evaluate the models' performance. Results show that the developed models improve the precision of predicting the pullout capacity compared with previous studies. The DANFIS model is also shown to outperform the other models in predicting the pullout capacity of ground anchors.
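
    Model comparisons of the kind described above rest on statistical performance measures such as the root-mean-square error. A minimal sketch with hypothetical measured and predicted pullout capacities (illustrative numbers only, not the study's data):

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error, a common statistical performance measure."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Hypothetical measured vs predicted pullout capacities (kN):
measured = [12.1, 15.3, 9.8, 11.0]
predicted = [11.8, 15.9, 10.1, 10.6]
print(round(rmse(measured, predicted), 3))  # → 0.418
```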

  13. Improving Undergraduates' Critique via Computer Mediated Communication

    Science.gov (United States)

    Mohamad, Maslawati; Musa, Faridah; Amin, Maryam Mohamed; Mufti, Norlaila; Latiff, Rozmel Abdul; Sallihuddin, Nani Rahayu

    2014-01-01

    Our current university students, labeled as "Generation Y" or Millennials, are different from previous generations due to wide exposure to media. Being technologically savvy, they are accustomed to Internet for information and social media for socializing. In line with this current trend, teaching through computer mediated communication…

  14. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    Science.gov (United States)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo algorithm, we explore the global latency for an optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from peaked to spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease of performance is found over Tc independently of the workload. The globally optimized computational resource allocation and network routing defines a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
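
    The Metropolis rule at the heart of the exploration above is compact: moves that lower the cost are always accepted, while uphill moves are accepted with probability exp(−Δ/T). The toy load-balancing problem below is only an illustration of the mechanism, not the paper's network model:

```python
import math, random

def metropolis(cost, propose, state, T, steps, rng):
    """Generic Metropolis acceptance: downhill moves always accepted,
    uphill moves accepted with probability exp(-dc / T)."""
    c = cost(state)
    for _ in range(steps):
        cand = propose(state, rng)
        dc = cost(cand) - c
        if dc <= 0 or rng.random() < math.exp(-dc / T):
            state, c = cand, c + dc
    return state, c

# Toy resource allocation: balance 20 unit tasks across 4 nodes.
def cost(loads):
    mean = sum(loads) / len(loads)
    return sum((l - mean) ** 2 for l in loads)  # load imbalance

def propose(loads, rng):
    loads = list(loads)
    i, j = rng.sample(range(len(loads)), 2)
    if loads[i] > 0:          # move one unit of work from node i to node j
        loads[i] -= 1
        loads[j] += 1
    return loads

rng = random.Random(0)
state, c = metropolis(cost, propose, [20, 0, 0, 0], T=0.1, steps=2000, rng=rng)
print(sorted(state), c)  # at low T the load should balance out
```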

  15. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  16. Dealing with media distractions: An observational study of computer-based multitasking among children and adults in the Netherlands

    NARCIS (Netherlands)

    Baumgartner, S.E.; Sumter, S.R.

    2017-01-01

    The aim of this observational study was to investigate differences in computer-based multitasking among children and adults. Moreover, the study investigated how attention problems are related to computer-based multitasking and how these individual differences interact with age. Computer-based

  17. Data science in R a case studies approach to computational reasoning and problem solving

    CERN Document Server

    Nolan, Deborah

    2015-01-01

    Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and ComputationData Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standar

  18. FISS: a computer program for reactor systems studies

    International Nuclear Information System (INIS)

    Tamm, H.; Sherman, G.R.; Wright, J.H.; Nieman, R.E.

    1979-08-01

    FISS is a computer code for use in investigating alternative fuel cycle strategies for Canadian and world nuclear programs. The code performs a system simulation accounting for dynamic effects of growing nuclear systems. Facilities in the model include storage for irradiated fuel, mines, plants for enrichment, fuel fabrication, fuel reprocessing and heavy water, and reactors. FISS is particularly useful for comparing various reactor strategies and studying sensitivities of resource consumption, capital investment and energy costs with changes in fuel cycle parameters, reactor parameters and financial variables. (author)

  19. Spatiotemporal Data Mining: A Computational Perspective

    Directory of Open Access Journals (Sweden)

    Shashi Shekhar

    2015-10-01

    Full Text Available Explosive growth in geospatial and temporal data as well as the emergence of new technologies emphasize the need for automated discovery of spatiotemporal knowledge. Spatiotemporal data mining studies the process of discovering interesting and previously unknown, but potentially useful, patterns from large spatiotemporal databases. It has broad application domains including ecology and environmental management, public safety, transportation, earth science, epidemiology, and climatology. The complexity of spatiotemporal data and their intrinsic relationships limits the usefulness of conventional data science techniques for extracting spatiotemporal patterns. In this survey, we review recent computational techniques and tools in spatiotemporal data mining, focusing on several major pattern families: spatiotemporal outlier, spatiotemporal coupling and tele-coupling, spatiotemporal prediction, spatiotemporal partitioning and summarization, spatiotemporal hotspots, and change detection. Compared with other surveys in the literature, this paper emphasizes the statistical foundations of spatiotemporal data mining and provides comprehensive coverage of computational approaches for various pattern families (ISPRS Int. J. Geo-Inf. 2015, 4, 2307). We also list popular software tools for spatiotemporal data analysis. The survey concludes with a look at future research needs.

  20. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depend on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood flowing through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous, in vitro, microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  1. BOINC service for volunteer cloud computing

    CERN Document Server

    Høimyr, N; Buncic, P; Giovannozzi, M; Gonzalez, A; Harutyunyan, A; Jones, P L; Karneyeu, A; Marquina, M A; Mcintosh, E; Segal, B; Skands, P; Grey, F; Lombraña González, D; Zacharov, I; CERN. Geneva. IT Department

    2012-01-01

    For the past couple of years, a team at CERN and partners from the Citizen Cyberscience Centre (CCC) have been working on a project that enables general physics simulation programs to run in a virtual machine on volunteer PCs around the world. The project uses the Berkeley Open Infrastructure for Network Computing (BOINC) framework. Based on CERNVM and the job management framework Co-Pilot, this project was made available for public beta-testing in August 2011 with Monte Carlo simulations of LHC physics under the name "LHC@home 2.0" and the BOINC project "Test4Theory". At the same time, CERN's efforts on volunteer computing for LHC machine studies have been intensified; this project has previously been known as LHC@home, and has been running the "Sixtrack" beam dynamics application for the LHC accelerator, using a classic BOINC framework without virtual machines. CERN-IT has set up a BOINC server cluster, and has provided and supported the BOINC infrastructure for both projects. CERN intends to evolve the setup i...

  2. Previous utilization of service does not improve timely booking in ...

    African Journals Online (AJOL)

    Previous utilization of service does not improve timely booking in antenatal care: Cross sectional study ... Journal Home > Vol 24, No 3 (2010) > ... Results: Past experience on antenatal care service utilization did not come out as a predictor for ...

  3. A Case Study of Educational Computer Game Design by Middle School Students

    Science.gov (United States)

    An, Yun-Jo

    2016-01-01

    Only a limited number of research studies have investigated how students design educational computer games and its impact on student learning. In addition, most studies on educational game design by students were conducted in the areas of mathematics and science. Using the qualitative case study approach, this study explored how seventh graders…

  4. Impact of previously disadvantaged land-users on sustainable ...

    African Journals Online (AJOL)

    Impact of previously disadvantaged land-users on sustainable agricultural ... about previously disadvantaged land users involved in communal farming systems ... of input, capital, marketing, information and land use planning, with effect on ...

  5. Diagnostic performance of combined noninvasive coronary angiography and myocardial perfusion imaging using 320 row detector computed tomography

    DEFF Research Database (Denmark)

    Vavere, Andrea L; Simon, Gregory G; George, Richard T

    2013-01-01

    Multidetector coronary computed tomography angiography (CTA) is a promising modality for widespread clinical application because of its noninvasive nature and high diagnostic accuracy as found in previous studies using 64 to 320 simultaneous detector rows. It is, however, limited in its ability to detect myocardial ischemia. In this article, we describe the design of the CORE320 study ("Combined coronary atherosclerosis and myocardial perfusion evaluation using 320 detector row computed tomography"). This prospective, multicenter, multinational study is unique in that it is designed to assess the diagnostic performance of combined 320-row CTA and myocardial CT perfusion imaging (CTP) in comparison with the combination of invasive coronary angiography and single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI). The trial is being performed at 16 medical centers located in 8

  6. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  7. Comparison of Computational Algorithms for the Classification of Liver Cancer using SELDI Mass Spectrometry: A Case Study

    Directory of Open Access Journals (Sweden)

    Robert J Hickey

    2007-01-01

    Full Text Available Introduction: As an alternative to DNA microarrays, mass spectrometry based analysis of proteomic patterns has shown great potential in cancer diagnosis. The ultimate application of this technique in clinical settings relies on the advancement of the technology itself and the maturity of the computational tools used to analyze the data. A number of computational algorithms constructed on different principles are available for the classification of disease status based on proteomic patterns. Nevertheless, few studies have addressed the differences in performance among these approaches. In this report, we describe a comparative case study on the classification accuracy of hepatocellular carcinoma based on the serum proteomic pattern generated from a Surface Enhanced Laser Desorption/Ionization (SELDI) mass spectrometer. Methods: Nine supervised classification algorithms were implemented in the R software and compared for classification accuracy. Results: We found that the support vector machine with a radial function is preferable as a tool for classification of hepatocellular carcinoma using features in SELDI mass spectra. Among the remaining methods, random forest and prediction analysis of microarrays performed better. A permutation-based technique reveals that the support vector machine with a radial function seems intrinsically superior in learning from the training data, since it has a lower prediction error than the others when there is essentially no differential signal. On the other hand, the performance of random forest and prediction analysis of microarrays relies on their capability of capturing signals with substantial differentiation between groups. Conclusions: Our finding is similar to a previous study, where classification methods based on Matrix Assisted Laser Desorption/Ionization (MALDI) mass spectrometry were compared for the prediction accuracy of ovarian cancer.
The support vector machine, random forest and prediction
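
    The "radial function" in the support vector machine result above is the RBF (Gaussian) kernel. The sketch below implements the kernel itself plus a toy kernel-similarity classifier on synthetic two-cluster data; a real analysis would use a full SVM implementation (the study used R), so this only illustrates the kernel idea:

```python
import math, random

def rbf(x, y, gamma=1.0):
    """Radial basis function kernel: exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_score(x, examples, gamma=1.0):
    """Mean RBF similarity of x to a set of training examples."""
    return sum(rbf(x, e, gamma) for e in examples) / len(examples)

def classify(x, pos, neg, gamma=1.0):
    """Toy classifier: assign x to the class it is more similar to."""
    return 1 if kernel_score(x, pos, gamma) > kernel_score(x, neg, gamma) else 0

# Synthetic two-cluster data (hypothetical, for illustration only):
rng = random.Random(1)
pos = [(rng.gauss(2, 0.3), rng.gauss(2, 0.3)) for _ in range(30)]
neg = [(rng.gauss(0, 0.3), rng.gauss(0, 0.3)) for _ in range(30)]
print(classify((1.9, 2.1), pos, neg), classify((0.1, -0.2), pos, neg))  # → 1 0
```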

  8. Response to deep TMS in depressive patients with previous electroconvulsive treatment.

    Science.gov (United States)

    Rosenberg, Oded; Zangen, Abraham; Stryjer, Rafael; Kotler, Moshe; Dannon, Pinhas N

    2010-10-01

    The efficacy of transcranial magnetic stimulation (TMS) in the treatment of major depression has already been shown. Novel TMS coils allowing stimulation of deeper brain regions have recently been developed and studied. Our study is aimed at exploring the possible efficacy of deep TMS in patients with resistant depression who previously underwent electroconvulsive therapy (ECT). Using Brainsway's deep TMS H1 coil, six patients who previously underwent ECT were treated at 120% of motor threshold power at a frequency of 20 Hz. Patients underwent five sessions per week, up to 4 weeks. Before the study, patients were evaluated using the Hamilton depression rating scale (HDRS, 24 items), the Hamilton anxiety scale, and the Beck depression inventory, and were evaluated again after 5, 10, 15, and 20 daily treatments. Response to treatment was defined as a reduction in the HDRS of at least 50%, and remission as a reduction of the HDRS-24 below 10 points. Two of six patients responded to the treatment with deep TMS, including one who achieved full remission. Our results suggest the possibility of a subpopulation of depressed patients who may benefit from deep TMS treatment, including patients who did not respond to ECT previously. However, the power of the study is small and larger samples are needed. Copyright © 2010 Elsevier Inc. All rights reserved.

  9. Influence of previous knowledge in Torrance tests of creative thinking

    OpenAIRE

    Aranguren, María; Consejo Nacional de Investigaciones Científicas y Técnicas CONICET

    2015-01-01

    The aim of this work is to analyze the influence of study field, expertise and participation in recreational activities on Torrance Tests of Creative Thinking (TTCT, 1974) performance. Several hypotheses were postulated to explore the possible effects of previous knowledge on the TTCT verbal and TTCT figural outcomes of university students. Participants in this study included 418 students from five study fields: Psychology; Philosophy and Literature; Music; Engineering; and Journalism and Advertisin...

  10. ‘Shift’ ‘n ‘control’: The computer as a third interactant in Spanish-language

    Science.gov (United States)

    Goble, Ryan; Vickers, Caroline H

    2015-01-01

    The purpose of this paper is to examine the role of the computer in medical consultations in which English–Spanish bilingual medical providers interact with Spanish-monolingual patients. Following previous studies that have revealed that the presence of the computer in consultations detracts from direct provider–patient communication, we pay specific attention to how the use of the computer in Spanish-language medical consultations can complement or adversely affect the co-construction of the patient's health narrative. The data for the present study consist of 36 Spanish-language medical consultations in Southern California. Applying a conversation analytical approach to the health narratives in the corpus, we argue that the computer is essentially a third interactant to which medical providers orient through lowered volume, minimal responses, bureaucratic side talk, and, most importantly, code-switching to English – all of which strip the patients of control over the co-construction of their health narrative with their medical provider. Because the patient does not have access to the computational task and the language, we posit that this exacerbates the already existing adverse effects that the computer has on provider–patient interaction.

  11. Deep Learning for Computer Vision: A Brief Review

    Science.gov (United States)

    Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619

  12. Deep Learning for Computer Vision: A Brief Review

    Directory of Open Access Journals (Sweden)

    Athanasios Voulodimos

    2018-01-01

    Full Text Available Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein.

  13. Deep Learning for Computer Vision: A Brief Review.

    Science.gov (United States)

    Voulodimos, Athanasios; Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein.

  14. Attitudes towards Computer and Computer Self-Efficacy as Predictors of Preservice Mathematics Teachers' Computer Anxiety

    Science.gov (United States)

    Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.

    2017-01-01

    The study investigated attitudes towards computers and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria, using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  15. Cloud Computing: A study of cloud architecture and its patterns

    OpenAIRE

    Mandeep Handa,; Shriya Sharma

    2015-01-01

    Cloud computing is a general term for anything that involves delivering hosted services over the Internet. Cloud computing is a paradigm shift following the shift from mainframe to client–server in the early 1980s. Cloud computing can be defined as accessing third party software and services on web and paying as per usage. It facilitates scalability and virtualized resources over Internet as a service providing cost effective and scalable solution to customers. Cloud computing has...

  16. Nanostructured interfaces for enhancing mechanical properties of composites: Computational micromechanical studies

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon

    2015-01-01

    Computational micromechanical studies of the effect of nanostructuring and nanoengineering of interfaces, phase and grain boundaries on the mechanical properties and strength of materials, and of the potential of interface nanostructuring to enhance those properties, are reviewed....

  17. Basicities of Strong Bases in Water: A Computational Study

    OpenAIRE

    Kaupmees, Karl; Trummal, Aleksander; Leito, Ivo

    2014-01-01

    Aqueous pKa values of strong organic bases – DBU, TBD, MTBD, various phosphazene bases, etc. – were computed with the CPCM, SMD and COSMO-RS approaches. Explicit solvent molecules were not used. Both direct computations and computations relative to reference pKa values were used. The latter were of two types: (1) a reliable experimental aqueous pKa value of a reference base with structure similar to the investigated base, or (2) a reliable experimental pKa value in acetonitrile of the investigated base itself. ...
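
    The reference-based scheme described in the abstract can be written as a standard isodesmic (proton-exchange) thermodynamic cycle; the following relation is a textbook sketch of that approach, not a formula quoted from the paper. For the exchange reaction BH⁺ + Ref ⇌ B + RefH⁺:

```latex
\mathrm{p}K_a(\mathrm{BH^+}) \;=\; \mathrm{p}K_a(\mathrm{RefH^+}) \;+\; \frac{\Delta G^{\circ}_{\mathrm{exch}}}{RT \ln 10}
```

    Here ΔG°_exch is the computed standard free energy of the exchange reaction. Systematic errors in the continuum-solvation model largely cancel when the reference base is structurally similar to the investigated base, which is why the relative scheme is typically more accurate than direct absolute computation.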

  18. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The need to study computer modeling arises because current trends toward strengthening the general-education and worldview functions of computer science call for additional research into the…

  19. The Critical Exponent is Computable for Automatic Sequences

    Directory of Open Access Journals (Sweden)

    Jeffrey Shallit

    2011-08-01

    The critical exponent of an infinite word is defined to be the supremum of the exponent of each of its factors. For k-automatic sequences, we show that this critical exponent is always either a rational number or infinite, and its value is computable. This generalizes or recovers previous results of Krieger and others. Our technique is applicable to other situations; e.g., the computation of the optimal recurrence constant for a linearly recurrent k-automatic sequence.
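
    To make the definition concrete: the exponent of a finite word is its length divided by its smallest period, and the critical exponent is the supremum of this quantity over all factors. The sketch below brute-forces this over the factors of a finite word; it merely illustrates the definition (giving a lower bound when applied to a prefix of an infinite word), and is not the automaton-based decision procedure of the paper.

```python
def smallest_period(w: str) -> int:
    """Smallest period of w, via the KMP failure function:
    period = len(w) - (length of longest proper border)."""
    n = len(w)
    fail = [0] * (n + 1)
    k = 0
    for i in range(1, n):
        while k and w[i] != w[k]:
            k = fail[k]
        if w[i] == w[k]:
            k += 1
        fail[i + 1] = k
    return n - fail[n]

def critical_exponent(w: str) -> float:
    """Max exponent |f| / period(f) over all nonempty factors f of w
    (brute force; fine for short words)."""
    best = 0.0
    for i in range(len(w)):
        for j in range(i + 1, len(w) + 1):
            f = w[i:j]
            best = max(best, len(f) / smallest_period(f))
    return best
```

    For example, `critical_exponent("abab")` is 2.0, since "abab" has smallest period 2 and is thus a square.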

  20. Determining root correspondence between previously and newly detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute- and topology-based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, and orientation. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
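
    The attribute-comparison step described above can be sketched as a greedy matching between a prior "constellation" of detections and the objects from the latest scan. All names, attributes, and thresholds below are hypothetical illustrations of the idea, not the patented system's actual design:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class DetectedObject:
    obj_id: str
    x: float        # location attributes
    y: float
    size: float

def match_objects(previous, new, max_dist=5.0, size_tol=0.25):
    """Greedily match new detections against a dict of previously
    detected objects (id -> DetectedObject) by location and size.
    Returns (matches, newly_appeared, disappeared)."""
    matches, appeared = [], []
    remaining = dict(previous)
    for obj in new:
        best_id, best_d = None, max_dist
        for pid, prev in remaining.items():
            d = hypot(obj.x - prev.x, obj.y - prev.y)
            if d <= best_d and abs(obj.size - prev.size) <= size_tol * prev.size:
                best_id, best_d = pid, d
        if best_id is None:
            appeared.append(obj)            # change: object not in constellation
        else:
            matches.append((best_id, obj))  # same object re-detected
            del remaining[best_id]
    return matches, appeared, list(remaining.values())
```

    Objects left unmatched in the constellation are reported as disappeared; unmatched new detections are reported as changes. A real system would also compare the network topology (relative geometry between neighbors), which this sketch omits.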