WorldWideScience

Sample records for computational studies reveal

  1. Integration of computational modeling with membrane transport studies reveals new insights into amino acid exchange transport mechanisms

    Science.gov (United States)

    Widdows, Kate L.; Panitchob, Nuttanont; Crocker, Ian P.; Please, Colin P.; Hanson, Mark A.; Sibley, Colin P.; Johnstone, Edward D.; Sengers, Bram G.; Lewis, Rohan M.; Glazier, Jocelyn D.

    2015-01-01

    Uptake of system L amino acid substrates into isolated placental plasma membrane vesicles in the absence of opposing side amino acid (zero-trans uptake) is incompatible with the concept of obligatory exchange, where influx of amino acid is coupled to efflux. We therefore hypothesized that system L amino acid exchange transporters are not fully obligatory and/or that amino acids are initially present inside the vesicles. To address this, we combined computational modeling with vesicle transport assays and transporter localization studies to investigate the mechanisms mediating [14C]l-serine (a system L substrate) transport into human placental microvillous plasma membrane (MVM) vesicles. The carrier model provided a quantitative framework to test the 2 hypotheses that l-serine transport occurs by either obligate exchange or nonobligate exchange coupled with facilitated transport (mixed transport model). The computational model could only account for experimental [14C]l-serine uptake data when the transporter was not exclusively in exchange mode, best described by the mixed transport model. MVM vesicle isolates contained endogenous amino acids allowing for potential contribution to zero-trans uptake. Both L-type amino acid transporter (LAT)1 and LAT2 subtypes of system L were distributed to MVM, with l-serine transport attributed to LAT2. These findings suggest that exchange transporters do not function exclusively as obligate exchangers.—Widdows, K. L., Panitchob, N., Crocker, I. P., Please, C. P., Hanson, M. A., Sibley, C. P., Johnstone, E. D., Sengers, B. G., Lewis, R. M., Glazier, J. D. Integration of computational modeling with membrane transport studies reveals new insights into amino acid exchange transport mechanisms. PMID:25761365
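
The competing hypotheses above lend themselves to a toy simulation. Below is a minimal sketch (not the authors' fitted carrier model) of zero-trans tracer uptake under a purely obligate-exchange carrier versus a mixed exchange-plus-facilitated carrier; all rate constants and pool sizes are invented for illustration, and the external pool is treated as effectively infinite.

```python
# Hedged sketch: zero-trans uptake under two hypothetical carrier models.
# Rate constants and pool sizes are illustrative, not fitted values.

def simulate_uptake(model, S_out=100.0, S_in=0.0, k_ex=0.01, k_fac=0.005,
                    dt=0.1, t_end=60.0):
    """Euler integration of tracer uptake into a vesicle.

    model: 'obligate' (influx requires internal substrate to exchange)
           or 'mixed' (exchange plus a facilitated uniport component).
    """
    t, uptake = 0.0, 0.0
    while t < t_end:
        exchange_flux = k_ex * S_out * S_in   # influx coupled 1:1 to efflux
        facilitated_flux = k_fac * S_out if model == "mixed" else 0.0
        uptake += (exchange_flux + facilitated_flux) * dt
        # exchange swaps molecules 1:1, so the internal pool is unchanged;
        # only the facilitated component grows it
        S_in += facilitated_flux * dt
        t += dt
    return uptake

print(simulate_uptake("obligate"))           # 0.0: nothing inside to exchange
print(simulate_uptake("mixed"))              # positive uptake
print(simulate_uptake("obligate", S_in=5.0)) # endogenous substrate restores uptake
```

The sketch mirrors the qualitative argument: zero-trans uptake is only explicable if the transporter is not exclusively in exchange mode, or if the vesicles already contain endogenous substrate.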

  2. Spontaneous Movements of a Computer Mouse Reveal Egoism and In-group Favoritism

    Science.gov (United States)

    Maliszewski, Norbert; Wojciechowski, Łukasz; Suszek, Hubert

    2017-01-01

    The purpose of the project was to assess whether the first spontaneous movements of a computer mouse, when making an assessment on a scale presented on the screen, may express a respondent’s implicit attitudes. In Study 1, the altruistic behaviors of 66 students were assessed. The students were led to believe that the task they were performing was also being performed by another person and they were asked to distribute earnings between themselves and the partner. The participants performed the tasks under conditions with and without distractors. With the distractors, in the first few seconds spontaneous mouse movements on the scale expressed a selfish distribution of money, while later the movements gravitated toward more altruism. In Study 2, 77 Polish students evaluated a painting by a Polish/Jewish painter on a scale. They evaluated it under conditions of full or distracted cognitive abilities. Spontaneous movements of the mouse on the scale were analyzed. In addition, implicit attitudes toward both Poles and Jews were measured with the Implicit Association Test (IAT). A significant association between implicit attitudes (IAT) and spontaneous evaluation of images using a computer mouse was observed in the group with the distractor. The participants with strong implicit in-group favoritism of Poles revealed stronger preference for the Polish painter’s work in the first few seconds of mouse movement. Taken together, these results suggest that spontaneous mouse movements may reveal egoism (in-group favoritism), i.e., processes that were not observed in the participants’ final decisions (clicking on the scale). PMID:28163689
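
Mouse-tracking analyses of this kind typically quantify how far the cursor strays from a direct path before settling on the final answer. The sketch below computes maximum perpendicular deviation, a standard mouse-tracking metric (not necessarily the authors' exact analysis), on an invented trajectory:

```python
import math

# A common mouse-tracking metric: maximum signed perpendicular deviation of
# the cursor path from the straight line between start and end points.
# Larger early deviations toward one pole of the scale can index a
# competing implicit response. Trajectories here are invented.

def max_deviation(path):
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    # signed perpendicular distance of each sample from the direct line
    devs = [((x - x0) * dy - (y - y0) * dx) / norm for x, y in path]
    return max(devs, key=abs)

straight = [(x, 0.0) for x in range(11)]
curved = [(x, 2.0 if 3 <= x <= 7 else 0.0) for x in range(11)]
print(max_deviation(straight))                # 0.0
print(abs(max_deviation(curved)))             # 2.0
```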

  4. A portable grid-enabled computing system for a nuclear material study

    International Nuclear Information System (INIS)

    Tsujita, Yuichi; Arima, Tatsumi; Takekawa, Takayuki; Suzuki, Yoshio

    2010-01-01

    We have built a portable grid-enabled computing system, specialized for our molecular dynamics (MD) simulation program, to make it easy to study Pu materials. Experimental approaches to revealing the properties of Pu materials are often hampered by difficulties such as the radiotoxicity of actinides. Since a computational approach can reveal new aspects to researchers without access to radioactive facilities, we focus on MD computation. Obtaining realistic results for properties such as the melting point or thermal conductivity requires large-scale parallel computation, so application users whose institutes have no supercomputer must use a remote one. For such users, we have developed a portable, secure grid-enabled computing system that utilizes the grid computing infrastructure provided by the Information Technology Based Laboratory (ITBL). The system gives seamless access to remote supercomputers in the ITBL system from a client PC through its graphical user interface (GUI), including seamless file access, and it can monitor standard output and standard error to show the progress of an executing program. Because the system provides functionality that is useful for parallel computing on a remote supercomputer, application users can concentrate on their research. (author)

  5. In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway

    Science.gov (United States)

    Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun

    2016-12-01

    HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.

  6. Computational Approaches for Revealing the Structure of Membrane Transporters: Case Study on Bilitranslocase

    Directory of Open Access Journals (Sweden)

    Katja Venko

    Full Text Available The structural and functional details of transmembrane proteins are vastly underexplored, mostly due to experimental difficulties regarding their solubility and stability. The majority of transmembrane protein structures are still unknown, and this presents a huge experimental and computational challenge. Thanks to X-ray crystallography and NMR spectroscopy, over 3000 structures of membrane proteins have now been solved, but only a few hundred of them are unique. Because of the vast biological and pharmaceutical interest in elucidating the structure and functional mechanisms of transmembrane proteins, several computational methods have been developed to overcome the experimental gap. Combined with experimental data, computational information enables rapid, low-cost, and successful prediction of the molecular structure of unsolved proteins. The reliability of the predictions depends on the availability and accuracy of the experimental data associated with structural information. In this review, the following methods are proposed for in silico structure elucidation: sequence-dependent prediction of transmembrane regions, prediction of transmembrane helix–helix interactions, helix arrangement in membrane models, and testing the stability of such arrangements with molecular dynamics simulations. We also demonstrate these computational methods by proposing a model for the molecular structure of the transmembrane protein bilitranslocase. Bilitranslocase is a bilirubin membrane transporter which shares similar tissue distribution and functional properties with some members of the Organic Anion Transporter family and is the only member classified in the Bilirubin Transporter Family. Given these unique properties, bilitranslocase is a potentially interesting drug target. Keywords: Membrane proteins, Bilitranslocase, 3D protein structure, Transmembrane region predictors, Helix–helix interactions

  7. Computational redesign reveals allosteric mutation hotspots of organophosphate hydrolase that enhance organophosphate hydrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Reed B. [Univ. of North Carolina, Chapel Hill, NC (United States); Ding, Feng [Clemson Univ., SC (United States); Ye, Dongmei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ackerman, Eric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dokholyan, Nikolay V. [Univ. of North Carolina, Chapel Hill, NC (United States)

    2015-04-01

    Organophosphates are widely used for peaceful (agriculture) and military purposes (chemical warfare agents). The extraordinary toxicity of organophosphates and the risk of their deployment make it critical to develop means for their rapid and efficient deactivation. Organophosphate hydrolase (OPH) already plays an important role in organophosphate remediation but is insufficient for therapeutic or prophylactic purposes, primarily due to low substrate affinity. Current efforts focus on directly modifying the active site to differentiate substrate specificity and increase catalytic activity. Here, we present a novel strategy for enhancing the general catalytic efficiency of OPH through computational redesign of residues that are allosterically coupled to the active site, and we validate our design by mutagenesis. Specifically, we identify five such hot-spot residues for allosteric regulation and assay these mutants for hydrolysis activity against paraoxon, a chemical-weapons simulant. A high percentage of the predicted mutants exhibit enhanced activity over wild-type (kcat = 16.63 s⁻¹), such as T199I/T54I (899.5 s⁻¹) and C227V/T199I/T54I (848 s⁻¹), while the Km remains relatively unchanged in our high-throughput cell-free expression system. Further computational studies of protein dynamics reveal four distinct distal regions coupled to the active site that display significant changes in conformational dynamics upon these identified mutations. These results validate a computational design method that is both efficient and easily adapted as a general procedure for enzymatic enhancement.
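
The reported gains can be read through the Michaelis-Menten rate law. The sketch below plugs the quoted kcat values into v = kcat[E][S]/(Km + [S]); the Km, enzyme, and substrate concentrations are placeholders, since the abstract reports Km only as "relatively unchanged".

```python
# Michaelis-Menten rates for wild-type vs. redesigned OPH, using the kcat
# values quoted above; Km, [E], and [S] are assumed placeholder values.

def mm_rate(kcat, Km, S, E=1e-6):
    """v = kcat * [E] * [S] / (Km + [S]), in M/s."""
    return kcat * E * S / (Km + S)

Km = 1e-4  # assumed, M
S = 5e-4   # assumed paraoxon concentration, M
for name, kcat in [("wild-type", 16.63), ("T199I/T54I", 899.5),
                   ("C227V/T199I/T54I", 848.0)]:
    print(f"{name}: v = {mm_rate(kcat, Km, S):.2e} M/s")
```

With Km held fixed, the rate enhancement is simply the kcat ratio (~54-fold for T199I/T54I), which is why the unchanged Km matters for the interpretation.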

  8. Study of basic computer competence among public health nurses in Taiwan.

    Science.gov (United States)

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning over the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and to explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire, delivered by mail, was used to collect data. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83; total score range, 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer and Internet access at the workplace, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.

  9. Asymmetric energy flow in liquid alkylbenzenes: A computational study

    International Nuclear Information System (INIS)

    Leitner, David M.; Pandey, Hari Datt

    2015-01-01

    Ultrafast IR-Raman experiments on substituted benzenes [B. C. Pein et al., J. Phys. Chem. B 117, 10898–10904 (2013)] reveal that energy can flow more efficiently in one direction along a molecule than in others. We carry out a computational study of energy flow in the three alkylbenzenes studied in these experiments (toluene, isopropylbenzene, and t-butylbenzene) and find an asymmetry in the flow of vibrational energy between the two chemical groups of the molecule due to quantum mechanical vibrational relaxation bottlenecks, which give rise to a preferred direction of energy flow. We compare energy flow computed for all modes of the three alkylbenzenes over the relaxation time into the liquid with energy flow through the subset of modes monitored in the time-resolved Raman experiments, and find qualitatively similar results when using the subset compared to all the modes.

  10. Data science in R a case studies approach to computational reasoning and problem solving

    CERN Document Server

    Nolan, Deborah

    2015-01-01

    Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and Computation. Data Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standar

  11. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: the theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  12. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula in classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children with the intention of engaging children and increasing interest, rather than formally teaching concepts and skills. What is the educational quality of existing Computer Science resources, and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  14. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

    2015-01-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317
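
Gene set enrichment can be sketched in its simplest over-representation form as a hypergeometric tail test; the study's actual enrichment machinery is more elaborate, and all counts below are invented for illustration.

```python
from math import comb

# Hedged sketch of over-representation analysis: the probability that k or
# more of the n parameter-associated genes fall inside a gene set of size K,
# out of N genes total, under random draws. (Ranked-list GSEA uses a
# running-sum statistic instead; numbers here are made up.)

def enrichment_p(N, K, n, k):
    """Upper-tail P(X >= k) for X ~ Hypergeometric(N, K, n)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Expected overlap is n*K/N = 2 genes; observing 10 would be striking.
print(enrichment_p(N=20000, K=400, n=100, k=10))  # small p => enrichment
print(enrichment_p(N=20000, K=400, n=100, k=2))   # near-chance overlap
```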

  15. Bridging the digital divide through the integration of computer and information technology in science education: An action research study

    Science.gov (United States)

    Brown, Gail Laverne

    The presence of a digital divide, computer and information technology integration effectiveness, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9--11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys, and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and McNemar test determined mean differences between student pre-study and post-study perceived skills levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating that there is a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access which was an indication of digital inclusion. Sophomores had more at-home computer access and Internet access than other levels indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest which is an indication of inclusion. The paired t-test and McNemar test revealed significant perceived student increases in all skills levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.

  16. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    Science.gov (United States)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before college, but it is at the college level that the "brain drain" is most evident numerically, especially in the first class taken by most computer science majors, "Computer Science 1" (CS-I). The result, for both academia and industry, is a pronounced technological gender disparity in academic and industrial computer science. The study revealed several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong; these processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered that strengthens theories related to prior experience and to the perception that computer science has a culture hostile to females. Two unanticipated themes emerged, related to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males that adversely affects their ability to succeed in CS-I.

  17. Computed tomography of post-traumatic orbito-palpebral emphysema

    International Nuclear Information System (INIS)

    Nose, Harumi; Kohno, Keiko

    1981-01-01

    Two cases of orbito-palpebral emphysema are described. In both patients, who had a history of recent facial trauma, emphysema occurred after blowing the nose. They were studied by computed tomography and plain x-ray film, including tomograms of the orbit. The emphysema was revealed by both computed tomography and x-ray film, but more clearly by the former. The fracture lines of the orbit were revealed by x-ray film in only one case, but by computed tomography in both. The authors stress that computed tomography is the best technique for the study of orbital emphysema. (author)

  18. Malignant hemangiopericytoma. A correlative study of angiography, computed tomography, and pathology

    Energy Technology Data Exchange (ETDEWEB)

    Higa, Toshiaki; Kuroda, Yasumasa; Kobashi, Yoichiro; Ichijima, Kunio; Odori, Teruo; Torizuka, Kanji [Kyoto Univ. (Japan). Hospital

    1983-01-01

    Four cases of primary and secondary malignant hemangiopericytoma were studied correlatively using selective angiography, computed tomography, and pathologic specimens. One case was found in each of the peritoneal space (metastasis), the left middle cranial fossa, the left thigh, and the left retroperitoneal space. The basic angiographic features of the tumor were a few feeding arteries entering the tumor with a radially arranged fine network throughout, early visualization of veins due to A-V shunting, and a long-standing tumor stain with avascular area(s). Correlative evaluation of the angiograms, computed tomograms, and pathologic specimens revealed hemorrhage, central necrosis, and cystic degeneration underlying the avascular area(s) of the tumor stains on the angiograms. The more prominent these secondary changes, the less characteristic the angiographic features.

  19. Wrist Hypothermia Related to Continuous Work with a Computer Mouse: A Digital Infrared Imaging Pilot Study

    Directory of Open Access Journals (Sweden)

    Jelena Reste

    2015-08-01

    Full Text Available Computer work is characterized by a sedentary static workload with low-intensity energy metabolism. The aim of our study was to evaluate the dynamics of skin surface temperature in the hand during prolonged computer mouse work under different ergonomic setups. Digital infrared imaging of the right forearm and wrist was performed during three hours of continuous computer work (measured at the start and every 15 minutes thereafter) in a laboratory with controlled ambient conditions. Four people participated in the study. Three different ergonomic computer mouse setups were tested on three different days (horizontal computer mouse without mouse pad; horizontal computer mouse with mouse pad and padded wrist support; vertical computer mouse without mouse pad). The study revealed a strong, statistically significant negative correlation between the temperature of the dorsal surface of the wrist and time spent working with a computer mouse. Hand skin temperature decreased markedly after one hour of continuous computer mouse work. Vertical computer mouse work preserved more stable and higher wrist temperatures (>30 °C), while continuous use of a horizontal mouse for more than two hours caused extremely low temperatures (<28 °C) in distal parts of the hand. These preliminary observational findings indicate a significant effect of the duration and ergonomics of computer mouse work on the development of hand hypothermia.
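
The reported time-temperature relationship rests on a correlation statistic. Below is a minimal sketch of the Pearson coefficient, computed on invented cooling data, not the study's measurements:

```python
import math

# Pearson correlation between time-on-task and wrist skin temperature,
# the statistic behind a reported negative correlation. Data invented.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

minutes = [0, 15, 30, 45, 60, 75, 90]
temp_c  = [31.0, 30.6, 30.1, 29.5, 29.2, 28.7, 28.1]  # steadily cooling hand
print(round(pearson_r(minutes, temp_c), 3))  # strongly negative, near -1
```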

  20. Computational study of noise in a large signal transduction network

    Directory of Open Access Journals (Sweden)

    Ruohonen Keijo

    2011-06-01

    Full Text Available

    Background: Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor.

    Results: We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased.

    Conclusions: We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies.
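
The Gillespie stochastic simulation algorithm referenced above can be illustrated on a one-species birth-death process. Even in this minimal sketch (whose rates are invented and unrelated to the paper's network), the volume scaling of relative noise appears: fluctuations shrink as the system volume grows.

```python
import random, statistics

# Minimal Gillespie SSA for a birth-death process (0 -> X at rate k_prod*V,
# X -> 0 at rate k_deg per molecule), illustrating how relative noise
# shrinks as system volume V grows. Rates are illustrative only.

def gillespie(V, k_prod=10.0, k_deg=1.0, t_end=50.0, seed=1):
    rng = random.Random(seed)
    x, t, samples = 0, 0.0, []
    while t < t_end:
        a1, a2 = k_prod * V, k_deg * x   # birth and death propensities
        a0 = a1 + a2
        t += rng.expovariate(a0)         # exponential waiting time
        if rng.random() < a1 / a0:
            x += 1
        else:
            x -= 1
        if t > t_end / 2:                # discard the initial transient
            samples.append(x)
    return samples

for V in (1, 10, 100):
    s = gillespie(V)
    cv = statistics.stdev(s) / statistics.mean(s)  # coefficient of variation
    print(f"V={V:3d}: mean={statistics.mean(s):8.1f}  CV={cv:.3f}")
```

The stationary mean is k_prod*V/k_deg, so the coefficient of variation scales roughly as 1/sqrt(V), matching the abstract's observation that noise power falls as system volume increases.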

  1. A computational study of the topology of vortex breakdown

    Science.gov (United States)

    Spall, Robert E.; Gatski, Thomas B.

    1991-01-01

A fully three-dimensional numerical simulation of vortex breakdown using the unsteady, incompressible Navier-Stokes equations has been performed. Solutions to four distinct types of breakdown are identified and compared with experimental results. The computed solutions include weak helical, double helix, spiral, and bubble-type breakdowns. The topological structure of the various breakdowns, as well as their interrelationship, is studied. The data reveal that the asymmetric modes of breakdown may be subject to additional breakdowns as the vortex core evolves in the streamwise direction. The solutions also show that the freestream axial velocity distribution has a significant effect on the position and type of vortex breakdown.

  2. Case Study: Creation of a Degree Program in Computer Security. White Paper.

    Science.gov (United States)

    Belon, Barbara; Wright, Marie

    This paper reports on research into the field of computer security, and undergraduate degrees offered in that field. Research described in the paper reveals only one computer security program at the associate's degree level in the entire country. That program, at Texas State Technical College in Waco, is a 71-credit-hour program leading to an…

  3. Computational study of NMDA conductance and cortical oscillations in schizophrenia

    Directory of Open Access Journals (Sweden)

    Kubra eKomek Kirli

    2014-10-01

Full Text Available N-methyl-D-aspartate (NMDA) receptor hypofunction has been implicated in the pathophysiology of schizophrenia. The illness is also characterized by gamma oscillatory disturbances, which can be evaluated with precise frequency specificity by employing auditory cortical entrainment paradigms. This computational study investigates how synaptic NMDA hypofunction may give rise to network-level oscillatory deficits as indexed by entrainment paradigms. We developed a computational model of a local cortical circuit with pyramidal cells and fast-spiking interneurons (FSIs), incorporating NMDA, α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA), and γ-aminobutyric acid (GABA) synaptic kinetics. We evaluated the effects of varying NMDA conductance onto FSIs and pyramidal cells, as well as the AMPA to NMDA ratio. We also examined the differential effects across a broad range of entrainment frequencies as a function of NMDA conductance. Varying NMDA conductance onto FSIs revealed an inverted-U relation with network gamma, whereas NMDA conductance onto the pyramidal cells had a more monotonic relationship. Varying NMDA vs. AMPA conductance onto FSIs demonstrated the necessity of AMPA in the generation of gamma, while NMDA receptors had a modulatory role. Finally, reducing NMDA conductance onto FSIs and varying the stimulus input frequency reproduced the specific reductions in the gamma range (~40 Hz) observed in schizophrenia studies. Our computational study showed that reductions in NMDA conductance onto FSIs can reproduce similar disturbances in entrainment to periodic stimuli within the gamma range as reported in schizophrenia studies. These findings provide a mechanistic account of how specific cellular-level disturbances can give rise to circuitry-level pathophysiologic disturbance in schizophrenia.
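
Entrainment paradigms index oscillatory power at the driving frequency. As a minimal illustration (not the authors' circuit model), the sketch below extracts 40 Hz power from two simulated responses using a single-frequency discrete Fourier transform; the signals, and the idea of representing reduced FSI NMDA conductance simply as a weaker 40 Hz component, are purely illustrative assumptions.

```python
import math
import cmath

def power_at_frequency(signal, fs, freq):
    """Single-frequency DFT: normalized power of `signal`
    (sampled at fs Hz) at `freq` Hz."""
    n = len(signal)
    acc = sum(x * cmath.exp(-2j * math.pi * freq * k / fs)
              for k, x in enumerate(signal))
    return abs(acc / n) ** 2

fs = 1000.0                          # sampling rate, Hz
t = [k / fs for k in range(1000)]    # 1 s of simulated data
# Intact response: strong 40 Hz entrained component plus a slower 10 Hz rhythm
intact = [1.0 * math.sin(2 * math.pi * 40 * ti)
          + 0.5 * math.sin(2 * math.pi * 10 * ti) for ti in t]
# "Reduced NMDA conductance onto FSIs" modeled crudely as a weaker 40 Hz component
reduced = [0.4 * math.sin(2 * math.pi * 40 * ti)
           + 0.5 * math.sin(2 * math.pi * 10 * ti) for ti in t]

print(power_at_frequency(intact, fs, 40.0))   # larger (≈0.25)
print(power_at_frequency(reduced, fs, 40.0))  # smaller (≈0.04): weakened gamma entrainment
```

Because the window contains an integer number of 40 Hz cycles, the 10 Hz component is orthogonal to the probe frequency and the measured power reflects only the gamma-band amplitude, which is the quantity entrainment studies compare between patients and controls.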

  4. A study of computer-related upper limb discomfort and computer vision syndrome.

    Science.gov (United States)

    Sen, A; Richardson, Stanley

    2007-12-01

    Personal computers are one of the commonest office tools in Malaysia today. Their usage, even for three hours per day, leads to a health risk of developing Occupational Overuse Syndrome (OOS), Computer Vision Syndrome (CVS), low back pain, tension headaches and psychosocial stress. The study was conducted to investigate how a multiethnic society in Malaysia is coping with these problems that are increasing at a phenomenal rate in the west. This study investigated computer usage, awareness of ergonomic modifications of computer furniture and peripherals, symptoms of CVS and risk of developing OOS. A cross-sectional questionnaire study of 136 computer users was conducted on a sample population of university students and office staff. A 'Modified Rapid Upper Limb Assessment (RULA) for office work' technique was used for evaluation of OOS. The prevalence of CVS was surveyed incorporating a 10-point scoring system for each of its various symptoms. It was found that many were using standard keyboard and mouse without any ergonomic modifications. Around 50% of those with some low back pain did not have an adjustable backrest. Many users had higher RULA scores of the wrist and neck suggesting increased risk of developing OOS, which needed further intervention. Many (64%) were using refractive corrections and still had high scores of CVS commonly including eye fatigue, headache and burning sensation. The increase of CVS scores (suggesting more subjective symptoms) correlated with increase in computer usage spells. It was concluded that further onsite studies are needed, to follow up this survey to decrease the risks of developing CVS and OOS amongst young computer users.

  5. Faculty Attitudes towards Computer Assisted Instruction at the University of Gaziantep

    Directory of Open Access Journals (Sweden)

    Filiz Yalçın TILFARLIOĞLU

    2006-04-01

Full Text Available This study aims at revealing faculty attitudes towards computer assisted instruction at the University of Gaziantep, Turkey, in a multifaceted way. Additionally, it tries to determine the underlying factors that shape these attitudes. After a pilot study, the questionnaire was applied to a sample population of 145 faculty members who were chosen randomly. The results revealed that faculty attitudes towards computer assisted instruction are positive. Age, sex, teaching experience, level of proficiency in English and computer usage skills have no or little effect on these attitudes. According to the results of the study, faculty who have prior knowledge of computers show rather positive attitudes towards computers in education. Another important outcome of the study is the existence of a gender gap in terms of computer assisted instruction. Although there seems to be no difference between male and female faculty concerning their background education regarding computers, male faculty feel confident about the matter, whereas female faculty feel uncomfortable about using computers in their lessons.

  6. Pd-Catalyzed N-Arylation of Secondary Acyclic Amides: Catalyst Development, Scope, and Computational Study

    Science.gov (United States)

    Hicks, Jacqueline D.; Hyde, Alan M.; Cuezva, Alberto Martinez; Buchwald, Stephen L.

    2009-01-01

    We report the efficient N-arylation of acyclic secondary amides and related nucleophiles with aryl nonaflates, triflates, and chlorides. This method allows for easy variation of the aromatic component in tertiary aryl amides. A new biaryl phosphine with P-bound 3,5-(bis)trifluoromethylphenyl groups was found to be uniquely effective for this amidation. The critical aspects of the ligand were explored through synthetic, mechanistic, and computational studies. Systematic variation of the ligand revealed the importance of (1) a methoxy group on the aromatic carbon of the “top ring” ortho to the phosphorus and (2) two highly electron-withdrawing P-bound 3,5-(bis)trifluoromethylphenyl groups. Computational studies suggest the electron-deficient nature of the ligand is important in facilitating amide binding to the LPd(II)(Ph)(X) intermediate. PMID:19886610

  7. Positron computed tomography studies: potential use in neuro-psychiatric disorders

    International Nuclear Information System (INIS)

    Yamasaki, T.; Tateno, Y.; Shishido, F.

    1982-01-01

Since November 1979, positron computed tomography (PCT) has been performed to study subjects in a variety of states and varied disorders, using ¹³NH₃, ¹¹CO and ¹⁸F-2-fluorodeoxyglucose (¹⁸FDG), at the National Institute of Radiological Sciences, Japan. In neuro-psychiatric studies, normal volunteers and patients with schizophrenia, affective disorders, Alzheimer's disease and Huntington's chorea were studied. Tomographic images were analyzed by visual observation and by activity counting in selected regions. In the degenerative disorder group, ¹⁸FDG revealed decreased accumulation in target areas, whereas in the functional psychosis group, both in medicated and in non-medicated patients, positron images were basically similar to those of normal controls. In particular, in a patient with Huntington's chorea, ¹⁸FDG accumulation in the striatal region was markedly decreased without significant change in the same region on X-ray CT and ¹³NH₃ PCT

  8. Revealing −1 Programmed Ribosomal Frameshifting Mechanisms by Single-Molecule Techniques and Computational Methods

    Directory of Open Access Journals (Sweden)

    Kai-Chun Chang

    2012-01-01

Full Text Available Programmed ribosomal frameshifting (PRF) serves as an intrinsic translational regulation mechanism employed by some viruses to control the ratio between structural and enzymatic proteins. Most viral mRNAs that use PRF adopt an H-type pseudoknot to stimulate −1 PRF. The relationship between the thermodynamic stability and the frameshifting efficiency of pseudoknots is not yet fully understood. Recently, single-molecule force spectroscopy has revealed that the frequency of −1 PRF correlates with the unwinding forces required for disrupting pseudoknots, and that some of the unwinding work dissipates irreversibly due to the torsional restraint of pseudoknots. Complementary to single-molecule techniques, computational modeling provides insights into global motions of the ribosome, whose structural transitions during frameshifting have not yet been elucidated in atomic detail. Taken together, recent advances in biophysical tools may help to develop antiviral therapies that target the ubiquitous −1 PRF mechanism among viruses.

  9. Mesenteric panniculitis: computed tomography aspects

    International Nuclear Information System (INIS)

    Moreira, Luiza Beatriz Melo; Alves, Jose Ricardo Duarte; Marchiori, Edson; Pinheiro, Ricardo Andrade; Melo, Alessandro Severo Alves de; Noro, Fabio

    2001-01-01

Mesenteric panniculitis is an inflammatory process that represents the second stage of a rare progressive disease involving the adipose tissue of the mesentery. Imaging methods used in the diagnosis of mesenteric panniculitis include barium studies, ultrasonography, computed tomography and magnetic resonance imaging. Computed tomography is important both for diagnosis and for evaluating the extent of the disease and monitoring treatment. Computed tomography findings may vary according to the stage of the disease and the amount of inflammatory material or fibrosis. There is also good correlation between the computed tomography and anatomical pathology findings. The authors studied 10 patients with mesenteric panniculitis submitted to computed tomography. Magnetic resonance imaging was also performed in one patient. In all patients, computed tomography revealed a heterogeneous mass in the mesentery with fat density, interspersed with areas of soft-tissue density and dilated vessels. (author)

  10. Prevalence of computer vision syndrome in Erbil

    OpenAIRE

    Dler Jalal Ahmed; Eman Hussein Alwan

    2018-01-01

Background and objective: Nearly all colleges, universities and homes today regularly use video display terminals, such as computers, iPads, mobile phones, and TVs. Very little research has been carried out on Kurdish users to reveal the effect of video display terminals on the eye and vision. This study aimed to evaluate the prevalence of computer vision syndrome among computer users. Methods: A hospital-based cross-sectional study was conducted in the Ophthalmology Department of Rizgary...

  11. Revealed Preference Methods for Studying Bicycle Route Choice—A Systematic Review

    Directory of Open Access Journals (Sweden)

    Ray Pritchard

    2018-03-01

    Full Text Available One fundamental aspect of promoting utilitarian bicycle use involves making modifications to the built environment to improve the safety, efficiency and enjoyability of cycling. Revealed preference data on bicycle route choice can assist greatly in understanding the actual behaviour of a highly heterogeneous group of users, which in turn assists the prioritisation of infrastructure or other built environment initiatives. This systematic review seeks to compare the relative strengths and weaknesses of the empirical approaches for evaluating whole journey route choices of bicyclists. Two electronic databases were systematically searched for a selection of keywords pertaining to bicycle and route choice. In total seven families of methods are identified: GPS devices, smartphone applications, crowdsourcing, participant-recalled routes, accompanied journeys, egocentric cameras and virtual reality. The study illustrates a trade-off in the quality of data obtainable and the average number of participants. Future additional methods could include dockless bikeshare, multiple camera solutions using computer vision and immersive bicycle simulator environments.

  12. Educational NASA Computational and Scientific Studies (enCOMPASS)

    Science.gov (United States)

    Memarsadeghi, Nargess

    2013-01-01

Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using the developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goal of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in the areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches, often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and

  13. Evaluation of valvular heart diseases with computed tomography

    International Nuclear Information System (INIS)

    Tomoda, Haruo; Hoshiai, Mitsumoto; Matsuyama, Seiya

    1982-01-01

Forty-two patients with valvular heart diseases were studied with a third-generation computed tomographic system. The cardiac chambers (the atria and ventricles) were evaluated semiquantitatively, and valvular calcification was easily detected with computed tomography. Computed tomography was most valuable in revealing left atrial thrombi, which in some cases were not identified by other diagnostic procedures. (author)

  14. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

The present era is one of Information and Communication Technology (ICT), and a number of research efforts are underway on Cloud Computing and Mobile Cloud Computing, covering such topics as security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  15. A computed tomographic study on epilepsy

    International Nuclear Information System (INIS)

    Bae, Hoon Sik

    1980-01-01

140 patients with epileptic seizure were studied by computed tomography during the period from Feb. 1979 to Aug. 1979 in the Department of Radiology, College of Medicine, Hanyang University. The CT findings and clinical records, including EEG findings, were reviewed. The results were as follows: 1. The age distribution of the 140 patients was broad, ranging from 1 month to 63 years; 73.5% of the patients were below the age of 30. The patient population comprised 93 males and 47 females, a male to female ratio of 2 : 1. 2. The types of epileptic seizure were classified according to the International League Against Epilepsy: 42.9% of patients had primary generalized seizures, 47.1% partial seizures, and 10% non-classifiable seizures. 3. Among additional symptoms and signs besides seizure, headache was most common, followed by nausea and vomiting. Uncommonly, there were also insomnia, personality change, and memory disturbance. 4. 37.1% of patients had less than 1 month of seizure history, and 19.3% between 1 year and 5 years. 5. EEG findings were available in 41 patients and were normal in 15 cases; 26 patients revealed abnormal findings. Among the abnormal findings, focal slowing appeared in 19.5% and generalized slowing in 17.1%. 6. 52% of patients showed abnormal findings on CT. The most common abnormal finding was focal low density (30%), followed by diffuse hydrocephalus (7.1%). After contrast infusion, contrast enhancement occurred in cases with focal low density and focal high or isodense mass density. In patients with focal low density, ring or nodular enhancement was common, with diffuse or serpentine enhancement in focal high or isodense mass density. 7. Structural abnormalities on CT were more common in patients below the age of 10 and over 30 than in other age groups. Epilepsy starting below 10 and over 30 years of age showed structural abnormalities in 63.6-100%. 8. The patients who had less than 6 months of

  16. A computed tomographic study on epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Hoon Sik [Hanyang University College of Medicine, Seoul (Korea, Republic of)

    1980-06-15

140 patients with epileptic seizure were studied by computed tomography during the period from Feb. 1979 to Aug. 1979 in the Department of Radiology, College of Medicine, Hanyang University. The CT findings and clinical records, including EEG findings, were reviewed. The results were as follows: 1. The age distribution of the 140 patients was broad, ranging from 1 month to 63 years; 73.5% of the patients were below the age of 30. The patient population comprised 93 males and 47 females, a male to female ratio of 2 : 1. 2. The types of epileptic seizure were classified according to the International League Against Epilepsy: 42.9% of patients had primary generalized seizures, 47.1% partial seizures, and 10% non-classifiable seizures. 3. Among additional symptoms and signs besides seizure, headache was most common, followed by nausea and vomiting. Uncommonly, there were also insomnia, personality change, and memory disturbance. 4. 37.1% of patients had less than 1 month of seizure history, and 19.3% between 1 year and 5 years. 5. EEG findings were available in 41 patients and were normal in 15 cases; 26 patients revealed abnormal findings. Among the abnormal findings, focal slowing appeared in 19.5% and generalized slowing in 17.1%. 6. 52% of patients showed abnormal findings on CT. The most common abnormal finding was focal low density (30%), followed by diffuse hydrocephalus (7.1%). After contrast infusion, contrast enhancement occurred in cases with focal low density and focal high or isodense mass density. In patients with focal low density, ring or nodular enhancement was common, with diffuse or serpentine enhancement in focal high or isodense mass density. 7. Structural abnormalities on CT were more common in patients below the age of 10 and over 30 than in other age groups. Epilepsy starting below 10 and over 30 years of age showed structural abnormalities in 63.6-100%. 8. The patients who had less than 6 months of

  17. Clinical study on eating disorders. Brain atrophy revealed by cranial computed tomography scans

    Energy Technology Data Exchange (ETDEWEB)

    Nishiwaki, Shinichi

    1988-06-01

Cranial computed tomography (CT) scans were reviewed in 34 patients with anorexia nervosa (Group I) and 22 with bulimia (Group II) to elucidate the cause and pathological significance of morphological brain alterations. The findings were compared with those from 47 normal women. The incidence of brain atrophy was significantly higher in Group I (17/34, 50%) and Group II (11/22, 50%) than in the control group (3/47, 6%). In Group I, there was a significant increase in the left septum-caudate distance, the maximum width of the interhemispheric fissure, the widths of both Sylvian fissures adjacent to the skull, and the maximum width of the third ventricle. A significant increase in the maximum width of the interhemispheric fissure and the width of the left Sylvian fissure adjacent to the skull was noted in Group II as well. Ventricular brain ratios were significantly higher in Groups I and II than in the control group (6.76 and 7.29 vs 4.55). Brain atrophy did not correlate with age, body weight, malnutrition, eating behavior, depression, thyroid function, EEG findings, or intelligence scale scores. In Group I, serum cortisol levels after the administration of dexamethasone were correlated with the ventricular brain ratio. (Namekawa, K) 51 refs.

  18. Quantitative Study on Computer Self-Efficacy and Computer Anxiety Differences in Academic Major and Residential Status

    Science.gov (United States)

    Binkley, Zachary Wayne McClellan

    2017-01-01

    This study investigates computer self-efficacy and computer anxiety within 61 students across two academic majors, Aviation and Sports and Exercise Science, while investigating the impact residential status, age, and gender has on those two psychological constructs. The purpose of the study is to find if computer self-efficacy and computer anxiety…

  19. The study of radiographic technique with low exposure using computed panoramic tomography

    International Nuclear Information System (INIS)

    Saito, Yasuhiro

    1987-01-01

A new imaging system for the dental field that combines recent advances in both electronics and computer technologies was developed. This new imaging system is a computed panoramic tomography process based on a newly developed laser-scan system. In this study, a quantitative image evaluation was performed comparing anatomical landmarks in computed panoramic tomography at a low exposure (LPT) and in conventional panoramic tomography at a routine exposure (CPT), and the following results were obtained: 1. The diagnostic value of the CPT decreased with decreasing exposure, particularly with regard to the normal anatomical landmarks of such microstructural parts as the periodontal space, lamina dura and the enamel-dentin border. 2. The LPT was of high diagnostic value for all normal anatomical landmarks, averaging about twice the diagnostic value of the CPT. 3. The visual diagnostic value of the periodontal space, lamina dura, enamel-dentin border and the anatomical morphology of the teeth on the LPT was slightly dependent on the spatial frequency enhancement rank. 4. The LPT formed images with almost the same range of density as the CPT. 5. Computed panoramic tomographs taken at a low exposure revealed more information about the trabecular bone pattern than conventional panoramic tomographs taken under routine conditions in the visual spatial frequency range (0.1 - 5.0 cycles/mm). (author) 67 refs

  20. Study guide to accompany computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

Study Guide to Accompany Computer and Data Processing provides information pertinent to the fundamental aspects of computers and computer technology. This book presents the key benefits of using computers. Organized into five parts encompassing 19 chapters, this book begins with an overview of the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. This text then introduces computer hardware and describes the processor. Other chapters describe how microprocessors are made and describe the physical operation of computers. This book discusses as w

  1. Determining the Differences in Gender Usage of Computers in Nigeria

    African Journals Online (AJOL)

The study also revealed that female students preferred educational and programming computer-related jobs to the highly technical and field computer jobs that male students preferred most. However, both male and female students have a high preference for managerial computer-related jobs and show less interest in low position ...

  2. Synchrotron-radiation-based X-ray micro-computed tomography reveals dental bur debris under dental composite restorations.

    Science.gov (United States)

    Hedayat, Assem; Nagy, Nicole; Packota, Garnet; Monteith, Judy; Allen, Darcy; Wysokinski, Tomasz; Zhu, Ning

    2016-05-01

Dental burs are used extensively in dentistry to mechanically prepare tooth structures for restorations (fillings), yet little has been reported on the bur debris left behind in the teeth, and whether it poses potential health risks to patients. Here, the aim is to image dental bur debris under dental fillings, and to allude to the potential health hazards that can be caused by this debris when left in direct contact with the biological surroundings, specifically when the debris is made of a non-biocompatible material. Non-destructive micro-computed tomography using the BioMedical Imaging & Therapy facility 05ID-2 beamline at the Canadian Light Source was pursued at 50 keV and at a pixel size of 4 µm to image dental bur fragments under a composite resin dental filling. The bur's cutting edges that produced the fragments were also chemically analyzed. The technique revealed dental bur fragments of different sizes in different locations on the floor of the prepared surface of the teeth and under the filling, which places them in direct contact with the dentinal tubules and the dentinal fluid circulating within them. Energy-dispersive X-ray spectroscopy elemental analysis of the dental bur edges revealed that the fragments are made of tungsten carbide-cobalt, which is bio-incompatible.

  3. Students' Computing Use and Study: When More is Less

    Directory of Open Access Journals (Sweden)

    Christine A McLachlan

    2016-02-01

Full Text Available Since the turn of the century there has been a steady decline in enrolments in senior secondary computing classes in Australia. A flow-on effect has seen reduced enrolments in tertiary computing courses and subsequent predictions of shortages of skilled computing professionals. This paper investigates the relationship between students' computing literacy levels, their use of and access to computing tools, and students' interest in and attitudes to formal computing study. Through the use of secondary data obtained from Australian and international reports, a reverse effect was discovered, indicating that the more students used computing tools, the less interested they became in computing studies.

  4. Computed tomography angiography reveals stenosis and aneurysmal dilation of an aberrant right subclavian artery causing systemic blood pressure misreading in an old Pekinese dog.

    Science.gov (United States)

    Kim, Jaehwan; Eom, Kidong; Yoon, Hakyoung

    2017-06-16

    A 14-year-old dog weighing 4 kg presented with hypotension only in the right forelimb. Thoracic radiography revealed a round soft tissue opacity near the aortic arch and below the second thoracic vertebra on a lateral view. Three-dimensional computed tomography angiography clearly revealed stenosis and aneurysmal dilation of an aberrant right subclavian artery. Stenosis and aneurysm of an aberrant subclavian artery should be included as a differential diagnosis in dogs showing a round soft tissue opacity near the aortic arch and below the thoracic vertebra on the lateral thoracic radiograph.

  5. Chest computed tomography of a patient revealing severe hypoxia due to amniotic fluid embolism: a case report

    Directory of Open Access Journals (Sweden)

    Inui Daisuke

    2010-02-01

Full Text Available Introduction: Amniotic fluid embolism is one of the most severe complications in the peripartum period. Because its onset is abrupt and fulminant, it is unlikely that there will be time to examine the condition using thoracic computed tomography (CT). We report a case of life-threatening amniotic fluid embolism in which chest CT in the acute phase was obtained. Case presentation: A 22-year-old Asian Japanese primiparous woman was suspected of having an amniotic fluid embolism. After a Cesarean section for cephalopelvic disproportion, her respiratory condition deteriorated, and her chest CT images were examined. The CT findings revealed a diffuse homogeneous ground-glass shadow in her bilateral peripheral lung fields. She was therefore transferred to our hospital. On admission to our hospital's intensive care unit, she was found to have severe hypoxemia, with an SpO2 of 50% on a reservoir mask at 15 L/min oxygen. She was intubated with the support of noninvasive positive pressure ventilation. She was successfully extubated on the sixth day and discharged from the hospital on the twentieth day. Conclusion: This is the first case report describing amniotic fluid embolism in which CT revealed an acute respiratory distress syndrome-like shadow.

  6. Synthesis, biological activity and computational studies of novel azo-compounds

    International Nuclear Information System (INIS)

    Ashraf, J.; Murtaza, S.; Mughal, E.U.; Sadiq, A.

    2017-01-01

In the present protocol, we report the synthesis and characterization of some novel azo-compounds starting from 4-methoxyaniline and 4-aminophenazone, which were diazotized at low temperature. 4-Nitrophenol, 2-aminobenzoic acid, benzamide, 4-aminobenzoic acid, resorcinol, o-bromonitrobenzene and 2-nitroaniline were used as active aromatic coupling compounds for the second step. The synthesized compounds were investigated for their potential antibacterial activities using the disc diffusion method against Escherichia coli, Shigella sonnei, Streptococcus pyogenes, Staphylococcus aureus and Neisseria gonorrhoeae strains. They were also subjected to antioxidant assays using the DPPH method. The results revealed that the compounds derived from 4-methoxyaniline and 4-aminophenazone showed good antibacterial activity against all strains, whereas some azo-compounds had moderate to good antioxidant activities. Furthermore, these compounds were studied by computational analysis. (author)

  7. Neural correlates reveal sub-lexical orthography and phonology during reading aloud: A review

    Directory of Open Access Journals (Sweden)

    Kalinka Timmer

    2014-08-01

    The sub-lexical conversion of graphemes to phonemes (GPC) during reading has been investigated extensively with behavioral measures, as well as with event-related potentials (ERPs). Most research utilizes silent reading (e.g., the lexical decision task), for which phonological activation is not a necessity. However, recent research has employed reading aloud to capture sub-lexical GPC. The masked priming paradigm avoids strategic processing and is therefore well suited for capturing sub-lexical rather than lexical effects. By employing ERPs, the on-line time course of sub-lexical GPC can be observed before the overt response. ERPs have revealed that besides the phonological activation shown by behavioral studies, there is also early orthographic activation. This review describes studies in one's native language, in one's second language, and in cross-language situations. We discuss the implications of the ERP results for different (computational) models. First, the ERP results show that computational models should assume an early locus of GPC. Second, cross-language studies reveal that the phonological representations of both languages of a bilingual become activated automatically and that the phonology belonging to the context is selected rapidly. Therefore, it is important to extend the scope of computational models of reading (aloud) to multiple lexicons.

  8. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

    A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing Computed Tomography (CT) data files for the geometry and radiographic density information. The geometry, together with the mechanical properties and the boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program; the output is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone, owing to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool towards improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones, such as bones with implants or with osteoporosis. As a second stage, the processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading, by utilizing a CT data file of the specific bone as input to the processor and the FE program.
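    The CT-to-material-property mapping described above is commonly implemented in two steps: Hounsfield units are converted to an apparent density, and density to a Young's modulus through a power law. The sketch below is illustrative only; the linear calibration and the power-law coefficients are assumptions in the spirit of the bone-mechanics literature, not values from this study.

```python
import numpy as np

def hu_to_density(hu, rho_water=1.0):
    # Linear calibration from Hounsfield units to apparent density (g/cm^3).
    # Slope and intercept are illustrative; real scanners require phantom calibration.
    return rho_water * (1.0 + hu / 1000.0)

def density_to_youngs_modulus(rho, a=6850.0, b=1.49):
    # Power law E = a * rho**b (MPa); coefficients are assumed, not measured here.
    return a * np.power(rho, b)

hu = np.array([200.0, 800.0, 1400.0])   # example voxel values along a bone section
rho = hu_to_density(hu)                  # g/cm^3 per voxel
E = density_to_youngs_modulus(rho)       # element-wise Young's modulus (MPa)
```

Each voxel's modulus would then be assigned to the corresponding finite element, giving the locally varying material properties the abstract refers to.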

  9. Mathematics revealed

    CERN Document Server

    Berman, Elizabeth

    1979-01-01

    Mathematics Revealed focuses on the principles, processes, operations, and exercises in mathematics. The book first offers information on whole numbers, fractions, and decimals and percents. Discussions focus on measuring length, percent, decimals, numbers as products, addition and subtraction of fractions, mixed numbers and ratios, division of fractions, addition, subtraction, multiplication, and division. The text then examines positive and negative numbers and powers and computation. Topics include division and averages, multiplication, ratios, and measurements, scientific notation and estim

  10. A Comparative Study of Paper-based and Computer-based Contextualization in Vocabulary Learning of EFL Students

    Directory of Open Access Journals (Sweden)

    Mousa Ahmadian

    2015-04-01

    Vocabulary acquisition is one of the largest and most important tasks in language classes, and new technologies such as computers have greatly facilitated it. The importance of the issue led the researchers to conduct the present study, which compares contextualized vocabulary learning on paper with learning through Computer Assisted Language Learning (CALL). To this end, 52 pre-university EFL learners were randomly assigned to two groups: a paper-based (PB) group and a computer-based (CB) group, each with 26 learners. The PB group received paper-based contextualization of the vocabulary items, while the CB group received computer-based contextualization of the same items through PowerPoint (PP) software. A pretest was given to the learners, followed by an immediate and a delayed posttest. Paired-samples t-tests on the pretest and posttest scores and independent-samples t-tests on the immediate and delayed posttest scores were run in SPSS. The results revealed that computer-based contextualization had a greater effect on the vocabulary learning of Iranian EFL learners than paper-based contextualization of the words. Keywords: Computer-based contextualization, Paper-based contextualization, Vocabulary learning, CALL
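    The SPSS analyses mentioned above rest on two standard t statistics, which can be sketched directly from their formulas. The scores below are hypothetical, not the study's data:

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    # Paired-samples t: t = mean(d) / (sd(d) / sqrt(n)), d = per-subject differences.
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

def independent_t(x, y):
    # Independent-samples t with pooled variance (equal-variance assumption).
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Hypothetical pretest/posttest scores for one group of learners.
pre  = [12, 14, 11, 13, 15, 10]
post = [16, 17, 15, 18, 19, 13]
print(round(paired_t(post, pre), 2))
```

The resulting t values would be compared against the t distribution with the appropriate degrees of freedom, exactly as SPSS does internally.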

  11. Binding of indomethacin methyl ester to cyclooxygenase-2. A computational study.

    Science.gov (United States)

    Sárosi, Menyhárt-Botond

    2018-06-05

    Inhibitors selective towards the second isoform of prostaglandin synthase (cyclooxygenase, COX-2) are promising nonsteroidal anti-inflammatory drugs and antitumor medications. Methylation of the carboxylate group of the relatively nonselective COX inhibitor indomethacin confers significant COX-2 selectivity, and several other modifications converting indomethacin into a COX-2 selective inhibitor have been reported. Earlier experimental and computational studies on neutral indomethacin derivatives suggest that the methyl ester derivative likely binds to COX-2 in a binding mode similar to that observed for the parent indomethacin. However, docking studies followed by molecular dynamics simulations revealed two possible binding modes in COX-2 for indomethacin methyl ester, both of which differ from the experimental binding mode found for indomethacin. Either alternative binding mode might explain the observed COX-2 selectivity of indomethacin methyl ester. Graphical abstract: Binding of indomethacin methyl ester to cyclooxygenase-2.

  12. Mg co-ordination with potential carcinogenic molecule acrylamide: Spectroscopic, computational and cytotoxicity studies

    Science.gov (United States)

    Singh, Ranjana; Mishra, Vijay K.; Singh, Hemant K.; Sharma, Gunjan; Koch, Biplob; Singh, Bachcha; Singh, Ranjan K.

    2018-03-01

    Acrylamide (acr) is a potentially toxic molecule produced in thermally processed foodstuffs. The acr-Mg complex was synthesized chemically and characterized by spectroscopic techniques, and the binding sites of acr with Mg were identified by experimental and computational methods. Both experimental and theoretical results suggest that Mg coordinates with the oxygen atom of the C=O group of acr. In-vitro cytotoxicity studies revealed a significant decrease in toxicity for the acr-Mg complex as compared to pure acr. This decrease in toxicity on complexation with Mg may be a useful step for future research aimed at reducing the toxicity of acr.

  13. Follow up study of Alzheimer's type dementia with computed tomography

    International Nuclear Information System (INIS)

    Hirata, Nobuhide

    1987-01-01

    In 54 patients diagnosed with Alzheimer's type dementia based on the Diagnostic and Statistical Manual of Mental Disorders, III, cranial computed tomography (CT) scans were obtained before and after a follow-up period ranging from 6 to 24 months (mean 15.4 ± 4.7 months). The cerebrospinal fluid percentage and CT density in various regions of interest were examined. Six patients died during the study. Comparison of the deceased (Group I) with the survivors (Group II) revealed that: (1) there was no difference in average age or in degree of mental disorder at first presentation; (2) Group I had decreased activities of daily living; and (3) CT density was significantly decreased bilaterally in the lateral and frontal lobes in Group I. Within Group II, decreased CT numbers during the follow-up period were noticeable in the frontal lobe, parietal lobe, and caudate nucleus of the subgroup evaluated as aggravated, as compared with the subgroup evaluated as unchanged. (Namekawa, K.)

  14. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To utilize fully digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. Current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as integral parts of problem-solving systems to process large amounts of data, to implement control laws and even to produce decisions; many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy databases and large computers are used; why economic process computers will be allocated to nuclear plants in the future; and why the supercomputer should be demonstrated at once. (Mori, K.)

  15. A Codesign Case Study in Computer Graphics

    DEFF Research Database (Denmark)

    Brage, Jens P.; Madsen, Jan

    1994-01-01

    The paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  16. Computational studies of tokamak plasmas

    International Nuclear Information System (INIS)

    Takizuka, Tomonori; Tsunematsu, Toshihide; Tokuda, Shinji

    1981-02-01

    Computational studies of tokamak plasmas have advanced extensively. Many computational codes have been developed using several kinds of models: the finite element formulation of the MHD equations, time-dependent multidimensional fluid models, and particle models with the Monte Carlo method. These codes are applied to the analysis of the equilibrium of an axisymmetric toroidal plasma (SELENE), the time evolution of high-beta tokamak plasmas (APOLLO), low-n MHD stability (ERATO-J) and high-n ballooning mode stability (BOREAS) in the INTOR tokamak, nonlinear MHD stability, such as the positional instability (AEOLUS-P) and the resistive internal mode (AEOLUS-I), and divertor functions. (author)

  17. Exact Dispersion Study of an Asymmetric Thin Planar Slab Dielectric Waveguide without Computing d^2β/dk^2 Numerically

    Science.gov (United States)

    Raghuwanshi, Sanjeev Kumar; Palodiya, Vikram

    2017-08-01

    Waveguide dispersion can be tailored, but material dispersion cannot; hence, the total dispersion can be shifted to any desired band by adjusting the waveguide dispersion. Waveguide dispersion is proportional to d^2β/dk^2, which normally has to be computed numerically, and the need to resolve it to roughly 10^{-5} accuracy sometimes generates errors in the calculation of waveguide dispersion. In this paper, we derive an accurate analytical expression for d^2β/dk^2, avoiding this purely numerical step. To formulate the problem we use a graphical method. Our study reveals that the waveguide dispersion can be computed accurately enough for various modes from knowledge of β alone.
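    As an illustration of the numerical route the paper seeks to avoid, the sketch below solves the TE0 eigenvalue equation of a symmetric slab waveguide (the paper treats the asymmetric case) and then differentiates β twice by central finite differences. All parameter values are illustrative assumptions, not taken from the paper.

```python
import math

def te0_beta(k, n1=1.48, n2=1.46, d=4e-6):
    """Propagation constant beta of the TE0 mode of a symmetric slab.

    Solves u*tan(u) = sqrt(V^2 - u^2) for u = kappa*d/2 by bisection,
    where V = (k*d/2)*sqrt(n1^2 - n2^2) is the normalized frequency.
    """
    V = 0.5 * k * d * math.sqrt(n1**2 - n2**2)
    f = lambda u: u * math.tan(u) - math.sqrt(max(V**2 - u**2, 0.0))
    lo, hi = 1e-12, min(math.pi / 2 - 1e-12, V - 1e-12)
    for _ in range(200):                 # plain bisection on the monotone branch
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    kappa = 2.0 * 0.5 * (lo + hi) / d    # transverse wavenumber in the core
    return math.sqrt((n1 * k)**2 - kappa**2)

def d2beta_dk2(k, h=1e-3 * 2 * math.pi / 1.55e-6):
    # Central finite difference -- exactly the step the paper's analytical
    # expression is meant to replace.
    return (te0_beta(k + h) - 2 * te0_beta(k) + te0_beta(k - h)) / h**2

k = 2 * math.pi / 1.55e-6                # free-space wavenumber at 1.55 um
print(te0_beta(k) / k)                   # effective index, between n2 and n1
```

The cancellation inherent in the second difference is what makes the numerical route sensitive to the accuracy of β, motivating the analytical expression derived in the paper.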

  18. Non-Determinism: An Abstract Concept in Computer Science Studies

    Science.gov (United States)

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  19. Experimental and computational studies of nanofluids

    Science.gov (United States)

    Vajjha, Ravikanth S.

    The goals of this dissertation were (i) to experimentally investigate the fluid dynamic and heat transfer performance of nanofluids in a circular tube, (ii) to study the influence of temperature and particle volumetric concentration of nanofluids on thermophysical properties, heat transfer and pumping power, (iii) to measure the rheological properties of various nanofluids and (iv) to investigate, using a computational fluid dynamics (CFD) technique, the performance of nanofluids in the flat tube of a radiator. Nanofluids are a new class of fluids prepared by dispersing nanoparticles with average sizes of less than 100 nm in traditional heat transfer fluids such as water, oil, ethylene glycol and propylene glycol. In cold regions of the world, the base fluid of choice for heat transfer applications is ethylene glycol or propylene glycol mixed with water in different proportions. In the present research, a mixture of 60% ethylene glycol (EG) or propylene glycol (PG) and 40% water (W) by mass (60:40 EG/W or 60:40 PG/W) was used as the base fluid, which provides freeze protection down to very low temperatures. Experiments were conducted to measure the convective heat transfer coefficient and pressure loss of nanofluids flowing in a circular tube in the fully developed turbulent regime. The measurements were carried out for aluminum oxide (Al2O3), copper oxide (CuO) and silicon dioxide (SiO2) nanoparticles dispersed in the 60:40 EG/W base fluid. Experiments revealed that the heat transfer coefficient of nanofluids increases with particle volumetric concentration, as does the pressure loss. New correlations for the Nusselt number and the friction factor were developed. The effects of temperature and particle volumetric concentration on different thermophysical properties (e.g. viscosity, thermal conductivity, specific heat and density) and subsequently on the Prandtl number

  20. Mind the Gap: An attempt to bridge computational and neuroscientific approaches to study creativity

    Directory of Open Access Journals (Sweden)

    Geraint Wiggins

    2014-07-01

    Creativity is the hallmark of human cognition, yet scientific understanding of creative processes is limited. There is, however, increasing interest in revealing the neural correlates of human creativity. Though many of these pioneering studies help to demystify creativity, the field is still dominated by popular beliefs associating creativity with right-brain thinking, divergent thinking, altered states and so on (Dietrich and Kanso, 2010). In this article, we discuss a computational framework for creativity based on Baars' global workspace theory (Baars, 1988), enhanced with mechanisms based on information theory. Next we propose a neurocognitive architecture of creativity with a strong focus on various facets (i.e., unconscious thought theory, mind wandering, spontaneous brain states) of un/pre-conscious brain responses. Our principal argument is that pre-conscious creativity happens prior to conscious creativity, and the proposed computational model may provide a mechanism by which this transition is managed. This integrative approach, albeit unconventional, will hopefully stimulate future neuroscientific studies of the inscrutable phenomenon of creativity.

  1. Methods and experimental techniques in computer engineering

    CERN Document Server

    Schiaffonati, Viola

    2014-01-01

    Computing and science reveal a synergistic relationship. On the one hand, it is widely evident that computing plays an important role in the scientific endeavor. On the other hand, the role of the scientific method in computing is becoming increasingly important, especially in providing ways to experimentally evaluate the properties of complex computing systems. This book critically presents these issues from a unitary conceptual and methodological perspective by addressing specific case studies at the intersection between computing and science. The book originates from, and collects the experience of, a course for PhD students in Information Engineering held at the Politecnico di Milano. Following the structure of the course, the book features contributions from researchers working at the intersection between computing and science.

  2. Ten Years toward Equity: Preliminary Results from a Follow-Up Case Study of Academic Computing Culture

    Directory of Open Access Journals (Sweden)

    Tanya L. Crenshaw

    2017-05-01

    Just over 10 years ago, we conducted a culture study of the Computer Science Department at the flagship University of Illinois at Urbana-Champaign, one of the top five computing departments in the country. The study found that while the department placed an emphasis on research, it did so in a way that, in conjunction with a lack of communication and transparency, devalued teaching and mentoring and negatively impacted the professional development, education, and sense of belonging of the students. As one part of a multi-phase case study spanning over a decade, this manuscript presents preliminary findings from our latest work at the university. We detail early comparisons between data gathered at the Department of Computer Science at the University of Illinois at Urbana-Champaign in 2005 and our most recent pilot case study, a follow-up research project completed in 2016. Though we have not yet completed the full data collection, we find it worthwhile to reflect on the pilot case study data we have collected thus far. These data reveal improvements in the perceptions of undergraduate teaching quality and undergraduate peer mentoring networks. However, we also found evidence of continuing feelings of isolation, incidents of bias, policy opacity, and uneven policy implementation that are areas of concern, particularly with respect to historically underrepresented groups. We discuss these preliminary follow-up findings, offer research and methodological reflections, and share next steps for applied research that aims to create positive cultural change in computing.

  3. Product placement of computer games in cyberspace.

    Science.gov (United States)

    Yang, Heng-Li; Wang, Cheng-Shu

    2008-08-01

    Computer games are considered an emerging medium and are even regarded as an advertising channel. Through a three-phase experiment, this study investigated the advertising effectiveness of computer games for different product placement forms, product types, and their combinations. The statistical results revealed that computer games are appropriate for placement advertising, that different product types and placement forms produced different advertising effectiveness, and that optimum combinations of product types and placement forms exist. An advertisement design model is proposed for use in game design environments, and some suggestions are given for advertisers and game companies respectively.

  4. Writing Apprehension, Computer Anxiety and Telecomputing: A Pilot Study.

    Science.gov (United States)

    Harris, Judith; Grandgenett, Neal

    1992-01-01

    A study measured graduate students' writing apprehension and computer anxiety levels before and after using electronic mail, computer conferencing, and remote database searching facilities during an educational technology course. Results indicated that postcourse computer anxiety levels were significantly related to usage statistics. Precourse writing…

  5. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  6. Computation studies into architecture and energy transfer properties of photosynthetic units from filamentous anoxygenic phototrophs

    Energy Technology Data Exchange (ETDEWEB)

    Linnanto, Juha Matti [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu (Estonia); Freiberg, Arvi [Institute of Physics, University of Tartu, Riia 142, 51014 Tartu, Estonia and Institute of Molecular and Cell Biology, University of Tartu, Riia 23, 51010 Tartu (Estonia)

    2014-10-06

    We have used different computational methods to study the structural architecture and the light-harvesting and energy transfer properties of the photosynthetic unit of filamentous anoxygenic phototrophs. Due to the huge number of atoms in the photosynthetic unit, a combination of atomistic and coarse-grained methods was used for the electronic structure calculations. The calculations reveal that the light energy absorbed by the peripheral chlorosome antenna complex transfers efficiently via the baseplate and the core B808–866 antenna complexes to the reaction center complex, in general agreement with the present understanding of this complex system.

  7. Letting the ‘cat’ out of the bag: pouch young development of the extinct Tasmanian tiger revealed by X-ray computed tomography

    Science.gov (United States)

    Spoutil, Frantisek; Prochazka, Jan; Black, Jay R.; Medlock, Kathryn; Paddle, Robert N.; Knitlova, Marketa; Hipsley, Christy A.

    2018-01-01

    The Tasmanian tiger or thylacine (Thylacinus cynocephalus) was an iconic Australian marsupial predator that was hunted to extinction in the early 1900s. Despite sharing striking similarities with canids, they failed to evolve many of the specialized anatomical features that characterize carnivorous placental mammals. These evolutionary limitations are thought to arise from functional constraints associated with the marsupial mode of reproduction, in which otherwise highly altricial young use their well-developed forelimbs to climb to the pouch and mouth to suckle. Here we present the first three-dimensional digital developmental series of the thylacine throughout its pouch life using X-ray computed tomography on all known ethanol-preserved specimens. Based on detailed skeletal measurements, we refine the species growth curve to improve age estimates for the individuals. Comparison of allometric growth trends in the appendicular skeleton (fore- and hindlimbs) with that of other placental and marsupial mammals revealed that despite their unique adult morphologies, thylacines retained a generalized early marsupial ontogeny. Our approach also revealed mislabelled specimens that possessed large epipubic bones (vestigial in thylacine) and differing vertebral numbers. All of our generated CT models are publicly available, preserving their developmental morphology and providing a novel digital resource for future studies of this unique marsupial. PMID:29515893

  8. A Novel Interaction Between the TLR7 and a Colchicine Derivative Revealed Through a Computational and Experimental Study

    Directory of Open Access Journals (Sweden)

    Francesco Gentile

    2018-02-01

    The Toll-Like Receptor 7 (TLR7) is an endosomal membrane receptor involved in the innate immune system response. Its best-known small-molecule activators are imidazoquinoline derivatives such as imiquimod (R-837) and resiquimod (R-848). Recently, an interaction between R-837 and the colchicine binding site of tubulin was reported. To investigate the possibility of an interaction between structural analogues of colchicine and the TLR7, a recent computational model of the dimeric form of the TLR7 receptor was used to examine a possible interaction with a colchicine derivative called CR42-24, which is active as a tubulin polymerization inhibitor. The estimated binding energies of this molecule with respect to the TLR7 receptor were comparable to those of known binders reported in a previous study. Binding to the TLR7 was further assessed by introducing genetic transformations in the TLR7 gene in cancer cell lines and exposing them to the compound; a negative shift of the IC50 value for cell growth was observed in cell lines carrying the mutated TLR7 gene. The reported study suggests a possible interaction between TLR7 and a colchicine derivative, which can be explored for the rational design of new drugs acting on this receptor using a colchicine scaffold for further modification.

  9. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model.

    Science.gov (United States)

    Aberg, Kristoffer C; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to the actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward delivery and reward anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h.
In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
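    The trial-by-trial quantities the abstract relates to memory encoding (expected values and reward prediction errors) are typically generated by a delta-rule learner. A minimal sketch, with an illustrative learning rate rather than the authors' fitted parameters:

```python
def rescorla_wagner(rewards, alpha=0.3, v0=0.0):
    """Return per-trial expected values and reward prediction errors.

    alpha is the learning rate; v0 the initial expected value. Both are
    illustrative assumptions, not estimates from the study.
    """
    v = v0
    values, deltas = [], []
    for r in rewards:
        values.append(v)      # expected value held before the outcome
        delta = r - v         # reward prediction error at outcome
        deltas.append(delta)
        v += alpha * delta    # delta-rule update toward the outcome
    return values, deltas

# Hypothetical binary reward sequence for one cue.
values, deltas = rescorla_wagner([1, 1, 0, 1, 0, 0, 1])
```

In the study's framework, each trial's `delta` magnitude would be regressed against memory for the image shown on that trial, and `values` against anticipation effects.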

  10. Integrating user studies into computer graphics-related courses.

    Science.gov (United States)

    Santos, B S; Dias, P; Silva, S; Ferreira, C; Madeira, J

    2011-01-01

    This paper discusses the integration of user studies into computer graphics-related courses. Computer graphics and visualization are essentially about producing images for a target audience, be it the millions watching a new CG-animated movie or the small group of researchers trying to gain insight into the large amount of numerical data resulting from a scientific experiment. To ascertain the final images' effectiveness for their intended audience, or the designed visualizations' accuracy and expressiveness, formal user studies are often essential. In human-computer interaction (HCI), such user studies play a similarly fundamental role in evaluating the usability and applicability of interaction methods and metaphors for the various devices and software systems we use.

  11. Real-life applications with membrane computing

    CERN Document Server

    Zhang, Gexiang; Gheorghe, Marian

    2017-01-01

    This book thoroughly investigates the underlying theoretical basis of membrane computing models and reveals their latest applications. To date there have been no illustrative case studies or complex real-life applications that capitalize on the full potential of the sophisticated computational apparatus of membrane systems; this book remedies that gap. By studying various complex applications – including engineering optimization, power systems fault diagnosis, mobile robot controller design, and complex biological systems involving data modeling and process interactions – the book also extends the capabilities of membrane system models with features such as formal verification techniques, evolutionary approaches, and fuzzy reasoning methods. As such, the book offers a comprehensive and up-to-date guide for all researchers, PhD and undergraduate students in the fields of computer science, engineering and the bio-sciences who are interested in the applications of natural computing models.

  12. Computational predictions of zinc oxide hollow structures

    Science.gov (United States)

    Tuoc, Vu Ngoc; Huan, Tran Doan; Thao, Nguyen Thi

    2018-03-01

    Nanoporous materials are emerging as potential candidates for a wide range of technological applications in the environment, electronics, and optoelectronics, to name just a few. Within this active research area, experimental works predominate, while the theoretical/computational prediction and study of these materials face some intrinsic challenges, one of which is how to predict porous structures. We propose a computationally and technically feasible approach for predicting zinc oxide structures with hollows at the nanoscale. The designed zinc oxide hollow structures are studied computationally using density functional tight binding and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties that can potentially find realistic future applications.

  13. Reveal genes functionally associated with ACADS by a network study.

    Science.gov (United States)

    Chen, Yulong; Su, Zhiguang

    2015-09-15

    Establishing a systematic network is aimed at finding essential human gene-gene/gene-disease pathways by means of network interconnection patterns and functional annotation analysis. In the present study, we have analyzed functional gene interactions of the short-chain acyl-coenzyme A dehydrogenase gene (ACADS). ACADS plays a vital role in free fatty acid β-oxidation and regulates energy homeostasis. Modules of highly interconnected genes in the disease-specific ACADS network were derived by integrating gene function and protein interaction data. Among the 8 genes in the ACADS network retrieved from both STRING and GeneMANIA, ACADS is effectively conjoined with 4 genes: HADHA, HADHB, ECHS1 and ACAT1. The functional analysis was done via ontological briefing and candidate disease identification. We observed that the most highly interlinked genes connected with ACADS are HADHA, HADHB, ECHS1 and ACAT1. Interestingly, the ontological analysis of the genes in the ACADS network reveals that ACADS, HADHA and HADHB play equally vital roles in fatty acid metabolism, while ACAT1 together with ACADS participates in ketone metabolism. Our computational gene web analysis also predicts potential candidate diseases, indicating the involvement of ACADS, HADHA, HADHB, ECHS1 and ACAT1 not only in lipid metabolism but also in infant death syndrome, skeletal myopathy, acute hepatic encephalopathy, Reye-like syndrome, episodic ketosis, and metabolic acidosis. The current study presents a comprehensible layout of the ACADS network, its functional strategies and the candidate disease approach associated with it. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Applications of X-ray Computed Tomography and Emission Computed Tomography

    International Nuclear Information System (INIS)

    Seletchi, Emilia Dana; Sutac, Victor

    2005-01-01

    Computed Tomography is a non-destructive imaging method that allows visualization of internal features within non-transparent objects such as sedimentary rocks. Filtering techniques have been applied to circumvent artifacts and achieve high-quality images for quantitative analysis. High-resolution X-ray computed tomography (HRXCT) can be used to identify the position of the growth axis in speleothems by detecting subtle changes in calcite density between growth bands. HRXCT imagery reveals the three-dimensional variability of coral banding, providing information on coral growth and climate over the past several centuries. The Nuclear Medicine imaging technique uses a radioactive tracer, several radiation detectors, and sophisticated computer technologies to understand the biochemical basis of normal and abnormal functions within the brain. The goal of Emission Computed Tomography (ECT) is to accurately determine the three-dimensional radioactivity distribution resulting from radiopharmaceutical uptake inside the patient, instead of the attenuation coefficient distribution from different tissues as obtained from X-ray Computed Tomography. ECT is a very useful tool for investigating cognitive functions. Because of the low radiation doses associated with Positron Emission Tomography (PET), this technique has been applied in clinical research, allowing the direct study of human neurological diseases. (authors)

  15. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    Full Text Available A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  16. Case Studies in Library Computer Systems.

    Science.gov (United States)

    Palmer, Richard Phillips

    Twenty descriptive case studies of computer applications in a variety of libraries are presented in this book. Computerized circulation, serial and acquisition systems in public, high school, college, university and business libraries are included. Each of the studies discusses: 1) the environment in which the system operates, 2) the objectives of…

  17. Computer analysis of lighting style in fine art: steps towards inter-artist studies

    Science.gov (United States)

    Stork, David G.

    2011-03-01

    Stylometry in visual art - the mathematical description of artists' styles - has been based on a number of properties of works, such as color, brush stroke shape, visual texture, and measures of contours' curvatures. We introduce the concept of quantitative measures of lighting, such as statistical descriptions of spatial coherence, diffuseness, and so forth, as properties of artistic style. Some artists of the high Renaissance, such as Leonardo, worked from nature and strove to render illumination "faithfully"; photorealists, such as Richard Estes, worked from photographs and duplicated the "physics based" lighting accurately. As such, each had different motivations, methodologies, stagings, and "accuracies" in rendering lighting cues. Perceptual studies show that observers are poor judges of properties of lighting in photographs, such as consistency (and thus, by extension, in paintings as well); computer methods such as rigorous cast-shadow analysis, occluding-contour analysis and spherical-harmonic-based estimation of light fields can be quite accurate. For these reasons, computer lighting analysis can provide new tools for art historical studies. We review lighting analysis in paintings such as Vermeer's Girl with a pearl earring, de la Tour's Christ in the carpenter's studio, Caravaggio's Magdalen with the smoking flame and Calling of St. Matthew, and extend our corpus to works where lighting coherence is of interest to art historians, such as Caravaggio's Adoration of the Shepherds or Nativity (1609) in the Capuchin church of Santa Maria degli Angeli. Our measure of lighting coherence may help reveal the working methods of some artists and aid diachronic studies of individual artists. We speculate on artists and art historical questions that may ultimately profit from future refinements of these new computational tools.
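Cast-shadow analysis, one of the computer methods mentioned above, rests on a simple geometric constraint: the image line through a scene point and its cast shadow passes through the image of the light source, so two such lines locate it. A minimal 2D sketch with invented point coordinates (not taken from any of the paintings discussed):

```python
def line_through(p, q):
    """Homogeneous line coefficients (a, b, c) with a*x + b*y + c = 0."""
    (x1, y1), (x2, y2) = p, q
    return (y1 - y2, x2 - x1, x1 * y2 - x2 * y1)

def intersect(l1, l2):
    """Intersection point of two lines given in (a, b, c) form."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    d = a1 * b2 - a2 * b1            # zero => parallel lines
    return ((b1 * c2 - b2 * c1) / d, (a2 * c1 - a1 * c2) / d)

# Each (object point, shadow point) pair defines a line through the
# light source's image position.  Invented points; light at (0, 10).
l1 = line_through((1.0, 5.0), (2.0, 0.0))
l2 = line_through((3.0, 4.0), (5.0, 0.0))
light = intersect(l1, l2)  # recovers (0.0, 10.0)
```

In a painting the point/shadow correspondences come from image annotation, and consistency is judged by how nearly all such lines meet in one point.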

  18. Hispanic women overcoming deterrents to computer science: A phenomenological study

    Science.gov (United States)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population they represent. Overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, examining what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determines whether being subjected to multiple marginalizations (female and Hispanic) played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but their persistence as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the Twenty

  19. Computational analysis of difenoconazole interaction with soil chitinases

    International Nuclear Information System (INIS)

    Vlǎdoiu, D L; Filimon, M N; Ostafe, V; Isvoran, A

    2015-01-01

    This study focuses on investigating the potential binding of the fungicide difenoconazole to soil chitinases using a computational approach. Computational characterization of the substrate binding sites of Serratia marcescens and Bacillus cereus chitinases using the Fpocket tool reflects the role of hydrophobic residues in substrate binding and the high local hydrophobic density of both sites. A molecular docking study reveals that difenoconazole is able to bind to the Serratia marcescens and Bacillus cereus chitinase active sites, the binding energies being comparable

  20. Computer processing of dynamic scintigraphic studies

    International Nuclear Information System (INIS)

    Ullmann, V.

    1985-01-01

    The methods of computer processing of dynamic scintigraphic studies which were developed, studied or implemented by the authors within research task no. 30-02-03 in nuclear medicine during the five-year plan 1981-85 are discussed. These were mainly methods for computer processing of radionuclide angiography, phase radioventriculography, regional lung ventilation studies, dynamic sequential scintigraphy of the kidneys, and radionuclide uroflowmetry. The problems of the automatic definition of fields of interest and the methodology of determining absolute heart chamber volumes in radionuclide cardiology are discussed, and the design and uses of the multipurpose dynamic phantom of heart activity for radionuclide angiocardiography and ventriculography developed within the said research task are described. All methods are documented with many figures showing typical clinical (normal and pathological) and phantom measurements. (V.U.)

  1. Computer processed 99mTc-DTPA studies of renal allotransplants

    International Nuclear Information System (INIS)

    Pavel, D.G.; Westerman, B.R.; Bergan, J.J.; Kahan, B.D.

    1976-01-01

    In order to refine the diagnostic possibilities of the radionuclide renal study in transplanted patients and to compensate for the nonspecificity of the 131I-hippuran study in some situations, 99mTc-DTPA was used simultaneously for imaging and time-activity curves. For these curves to be significant, appropriate background subtraction had to be made with a simple computer-processing method. The results obtained have shown that it is possible to distinguish marked acute tubular necrosis from milder degrees, thus affording a prognostic index in the immediate postoperative period, when the hippuran data are often nonspecific. Further, the diagnosis and follow-up of acute rejection episodes can be improved by the processed DTPA curves. Although these curves, when examined individually, do not show a specific pattern for rejection, they may reveal striking evolutionary changes when compared to previous studies, even when the hippuran curves are unchanged. The physiologic basis for the differences between the two time-activity curves may be related to the differential handling of the two radiopharmaceuticals by the kidney.

  2. [A computer-aided image diagnosis and study system].

    Science.gov (United States)

    Li, Zhangyong; Xie, Zhengxiang

    2004-08-01

    The revolution in information processing, particularly the digitizing of medicine, has changed medical study, work and management. This paper reports a method to design a system for computer-aided image diagnosis and study. Combining ideas from graph-text systems and the picture archiving and communication system (PACS), the system was realized and used for "prescription through computer", "managing images" and "reading images under computer and helping the diagnosis". Typical examples were also stored in a database and used to teach beginners. The system was developed with visual developing tools based on object-oriented programming (OOP) and runs on the Windows 9X platform. The system possesses a friendly man-machine interface.

  3. Studi Perbandingan Layanan Cloud Computing [A Comparative Study of Cloud Computing Services]

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    Full Text Available In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution to increase reliability, reduce computing costs, and create opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. This article compares and contrasts the different levels of delivery services and the development models, identifies issues, and outlines future directions in cloud computing. End-users' comprehension of the cloud computing delivery service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  4. Outcomes from a pilot study using computer-based rehabilitative tools in a military population.

    Science.gov (United States)

    Sullivan, Katherine W; Quinn, Julia E; Pramuka, Michael; Sharkey, Laura A; French, Louis M

    2012-01-01

    Novel therapeutic approaches and outcome data are needed for cognitive rehabilitation of patients with traumatic brain injury; computer-based programs may play a critical role in filling existing knowledge gaps. Brain-fitness computer programs can complement existing therapies, maximize neuroplasticity, provide treatment beyond the clinic, and deliver objective efficacy data. However, these approaches have not been extensively studied in the military and traumatic brain injury population. Walter Reed National Military Medical Center established its Brain Fitness Center (BFC) in 2008 as an adjunct to traditional cognitive therapies for wounded warriors. The BFC offers commercially available "brain-training" products for military Service Members to use in a supportive, structured environment. Over 250 Service Members have utilized this therapeutic intervention. Each patient receives subjective assessments before and after BFC participation, including the Mayo-Portland Adaptability Inventory-4 (MPAI-4), the Neurobehavioral Symptom Inventory (NBSI), and the Satisfaction with Life Scale (SWLS). A review of the first 29 BFC participants who completed initial and repeat measures was conducted to determine the effectiveness of the BFC program. Two of the three questionnaires of self-reported symptom change completed before and after participation in the BFC revealed a statistically significant reduction in symptom severity based on MPAI and NBSI total scores (p < .05). There were no significant differences in the SWLS score. Despite the typical limitations of a retrospective chart review, such as variation in treatment procedures, preliminary results reveal a trend towards improved self-reported cognitive and functional symptoms.

  5. Computational integration of homolog and pathway gene module expression reveals general stemness signatures.

    Directory of Open Access Journals (Sweden)

    Martina Koeva

    Full Text Available The stemness hypothesis states that all stem cells use common mechanisms to regulate self-renewal and multi-lineage potential. However, gene expression meta-analyses at the single gene level have failed to identify a significant number of genes selectively expressed by a broad range of stem cell types. We hypothesized that stemness may be regulated by modules of homologs. While the expression of any single gene within a module may vary from one stem cell type to the next, it is possible that the expression of the module as a whole is required so that the expression of different, yet functionally-synonymous, homologs is needed in different stem cells. Thus, we developed a computational method to test for stem cell-specific gene expression patterns from a comprehensive collection of 49 murine datasets covering 12 different stem cell types. We identified 40 individual genes and 224 stemness modules with reproducible and specific up-regulation across multiple stem cell types. The stemness modules included families regulating chromatin remodeling, DNA repair, and Wnt signaling. Strikingly, the majority of modules represent evolutionarily related homologs. Moreover, a score based on the discovered modules could accurately distinguish stem cell-like populations from other cell types in both normal and cancer tissues. This scoring system revealed that both mouse and human metastatic populations exhibit higher stemness indices than non-metastatic populations, providing further evidence for a stem cell-driven component underlying the transformation to metastatic disease.
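The module-level scoring idea can be sketched in a few lines. The paper's exact scoring formula is not given in this abstract, so aggregating per-gene z-scores into a mean module score is an assumption made for illustration, as are the expression data:

```python
import statistics

def zscores(values):
    """Standardize one gene's expression across samples."""
    mu = statistics.mean(values)
    sd = statistics.pstdev(values) or 1.0   # guard against constant genes
    return [(v - mu) / sd for v in values]

def module_score(expr, module):
    """Per-sample module score: mean z-score over module members.

    expr: gene -> list of expression values (one per sample).
    module: list of (possibly functionally-synonymous) homologs.
    """
    cols = [zscores(expr[g]) for g in module if g in expr]
    n = len(expr[next(iter(expr))])
    return [statistics.mean(c[i] for c in cols) for i in range(n)]

# Hypothetical data: two homologs, each high in a *different* stem-cell
# sample -- the module is "on" in both, though no single gene is.
expr = {"HomologA": [9.0, 1.0, 1.0], "HomologB": [1.0, 9.0, 1.0]}
scores = module_score(expr, ["HomologA", "HomologB"])
```

The toy data show the abstract's point: the module scores the first two samples equally high even though the individual homolog expressed in each differs.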

  6. Case studies in intelligent computing achievements and trends

    CERN Document Server

    Issac, Biju

    2014-01-01

    Although the field of intelligent systems has grown rapidly in recent years, there has been a need for a book that supplies a timely and accessible understanding of this important technology. Filling this need, Case Studies in Intelligent Computing: Achievements and Trends provides an up-to-date introduction to intelligent systems.This edited book captures the state of the art in intelligent computing research through case studies that examine recent developments, developmental tools, programming, and approaches related to artificial intelligence (AI). The case studies illustrate successful ma

  7. Computer Assisted Instruction in Special Education Three Case Studies

    Directory of Open Access Journals (Sweden)

    İbrahim DOĞAN

    2015-09-01

    Full Text Available The purpose of this study is to investigate the computer use of three students attending a special education center, who have mental retardation, a hearing problem, and a physical handicap, respectively. Maximum variation sampling was used to select the types of handicap, while convenience sampling was used to select the participants; three widely encountered handicap types in special education were chosen. A multiple holistic case study design was used. Results of the study indicate that teachers in special education prefer to use educational games and drill-and-practice types of computer programs. It was also found that overuse of animation, text and symbols causes cognitive overload in the student with mental retardation. Additionally, the student with a hearing problem learned words better when computers were used in education as compared to the traditional method. Furthermore, the student with a physical handicap improved his fine muscle control abilities beyond the planned course objectives when computers were used in special education.

  8. US QCD computational performance studies with PERI

    International Nuclear Information System (INIS)

    Zhang, Y; Fowler, R; Huck, K; Malony, A; Porterfield, A; Reed, D; Shende, S; Taylor, V; Wu, X

    2007-01-01

    We report on some of the interactions between two SciDAC projects: the National Computational Infrastructure for Lattice Gauge Theory (USQCD) and the Performance Engineering Research Institute (PERI). Many modern scientific programs consistently report the need for faster computational resources to maintain global competitiveness. However, as the size and complexity of emerging high-end computing (HEC) systems continue to rise, achieving good performance on such systems is becoming ever more challenging. In order to take full advantage of the resources, it is crucial to understand the characteristics of relevant scientific applications and the systems these applications are running on. Using tools developed under PERI and by other performance measurement researchers, we studied the performance of two applications, MILC and Chroma, on several high-performance computing systems at DOE laboratories. In the case of Chroma, we discuss how the use of C++ and modern software engineering and programming methods are driving the evolution of performance tools.

  9. Hypertensive disease and renal hypertensions: renal structural and functional studies by using dynamic computed tomography

    International Nuclear Information System (INIS)

    Arabidze, G.G.; Pogrebnaya, G.N.; Todua, F.I.; Sokolova, R.I.; Kozdoba, O.A.

    1989-01-01

    Dynamic computed tomography was conducted by an original method; the findings were analyzed by taking into account time-density curves, which made it possible to gain insight into the status of blood flow and filtration in each individual kidney. Computed tomography and dynamic computed tomography revealed that hypertensive disease was characterized by normal volume and thickness of the renal cortical layer and symmetric time-density curves, whereas the hypertensive type of chronic glomerulonephritis featured lower renal cortical layer thickness, reduced renal volume, and symmetrically decreased amplitudes of the first and second peaks of the time-density curve, and chronic pyelonephritis showed asymmetric time-density diagrams due to lower-density areas in the affected kidney.
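The peak amplitudes and left/right asymmetry that such an analysis relies on can be extracted from a sampled time-density curve with elementary code. A sketch with invented density samples (the clinical processing would be considerably more involved):

```python
def peaks(curve):
    """Indices and values of local maxima in a sampled time-density curve."""
    return [(i, curve[i]) for i in range(1, len(curve) - 1)
            if curve[i - 1] < curve[i] >= curve[i + 1]]

def asymmetry(left, right):
    """Relative first-peak asymmetry between the two kidneys' curves."""
    lp, rp = peaks(left)[0][1], peaks(right)[0][1]
    return abs(lp - rp) / max(lp, rp)

# Invented density samples for left and right kidney regions of interest.
left  = [10, 40, 80, 60, 70, 50, 30]
right = [10, 30, 56, 45, 50, 35, 20]

first_and_second_peaks = peaks(left)      # [(2, 80), (4, 70)]
index = asymmetry(left, right)            # 0.3 -> markedly asymmetric
```

Symmetric curves (hypertensive disease in the abstract) give an index near zero; the pyelonephritis pattern shows up as a large index.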

  10. Investigation of the computer experiences and attitudes of pre-service mathematics teachers: new evidence from Turkey.

    Science.gov (United States)

    Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat

    2010-10-01

    This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally intermediate and upper level. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.

  11. Aberration studies and computer algebra

    International Nuclear Information System (INIS)

    Hawkes, P.W.

    1981-01-01

    The labour of calculating expressions for aberration coefficients is considerably lightened if a computer algebra language is used to perform the various substitutions and expansions involved. After a brief discussion of matrix representations of aberration coefficients, a particular language, which has shown itself to be well adapted to particle optics, is described and applied to the study of high frequency cavity lenses. (orig.)

  12. An Examination of Computer Engineering Students' Perceptions about Asynchronous Discussion Forums

    Science.gov (United States)

    Ozyurt, Ozcan; Ozyurt, Hacer

    2013-01-01

    This study was conducted in order to reveal the usage profiles and perceptions of Asynchronous Discussion Forums (ADFs) of 126 computer engineering students from the Computer Engineering Department in a university in Turkey. By using a mixed methods research design both quantitative and qualitative data were collected and analyzed. Research…

  13. Children as Educational Computer Game Designers: An Exploratory Study

    Science.gov (United States)

    Baytak, Ahmet; Land, Susan M.; Smith, Brian K.

    2011-01-01

    This study investigated how children designed computer games as artifacts that reflected their understanding of nutrition. Ten 5th grade students were asked to design computer games with the software "Game Maker" for the purpose of teaching 1st graders about nutrition. The results from the case study show that students were able to…

  14. FORMING SCHOOLCHILD’S PERSONALITY IN COMPUTER STUDY LESSONS AT PRIMARY SCHOOL

    Directory of Open Access Journals (Sweden)

    Natalia Salan

    2017-04-01

    Full Text Available The influence of the computer on the formation of primary schoolchildren's personality and its implementation in learning activity are considered in the article. Based on state standards and the Law of Ukraine on Higher Education, the concepts "computer" and "information culture" are defined, and the modern understanding of the concept "basics of computer literacy" is identified. The main task of the school propaedeutic course in Computer Studies is defined. Interactive methods of activity are singled out: didactic games, designing, research, collaboration in pairs, group interaction, etc. The essential characteristics of didactic game technologies are distinguished, the peculiarities of their use in Computer Study lessons at primary school are analyzed, and positive and negative aspects of using these technologies are defined. The expediency of using game technologies when organizing students' educational and cognitive activity in Computer Studies is substantiated. The idea of creating a school course "Computer Studies at primary school" is motivated by the wide introduction of computer technology into the educational system. Today's schoolchildren have to be able to use a computer as freely and easily as they can use a pen, a pencil or a ruler. That is why it is advisable to start studying the basics of Computer Studies at primary school age. This course is intended for pupils of the 2nd-4th forms. Firstly, it provides mastery of practical computer skills and, secondly, it fosters the development of children's logical and algorithmic thinking. In these lessons students acquire practical skills for working with information on the computer. Having mastered computer skills at primary school, children will be able to use them successfully in their work. In senior classes they will be able to apply the acquired knowledge of methods of working with information and ways of problem solving

  15. Factors influencing exemplary science teachers' levels of computer use

    Science.gov (United States)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study comprised middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy related to
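The teacher/student association reported above is a plain correlation; a Pearson r sketch with invented use-frequency data (the study's actual survey responses are not reproduced here):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented weekly-use frequencies: teacher vs. that teacher's students.
teacher_use = [1, 2, 3, 4, 5]
student_use = [2, 4, 6, 8, 10]
r = pearson_r(teacher_use, student_use)  # perfectly linear, so r is ~1.0
```

Multiple regression, as used in the study, extends this idea to several predictors (self-efficacy, experience, and so on) at once.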

  16. Acting without seeing: Eye movements reveal visual processing without awareness

    OpenAIRE

    Spering, Miriam; Carrasco, Marisa

    2015-01-01

    Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. We review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movements. Such dissociations reveal situations in which eye movements are sensitive to part...

  17. A Visualization Review of Cloud Computing Algorithms in the Last Decade

    Directory of Open Access Journals (Sweden)

    Junhu Ruan

    2016-10-01

    Full Text Available Cloud computing has competitive advantages—such as on-demand self-service, rapid computing, cost reduction, and almost unlimited storage—that have attracted extensive attention from both academia and industry in recent years. Some review works have been reported to summarize extant studies related to cloud computing, but few analyze these studies based on the citations. Co-citation analysis can provide scholars a strong support to identify the intellectual bases and leading edges of a specific field. In addition, advanced algorithms, which can directly affect the availability, efficiency, and security of cloud computing, are the key to conducting computing across various clouds. Motivated by these observations, we conduct a specific visualization review of the studies related to cloud computing algorithms using one mainstream co-citation analysis tool—CiteSpace. The visualization results detect the most influential studies, journals, countries, institutions, and authors on cloud computing algorithms and reveal the intellectual bases and focuses of cloud computing algorithms in the literature, providing guidance for interested researchers to make further studies on cloud computing algorithms.
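The co-citation relation that CiteSpace builds its maps from is simple to state: two documents are co-cited whenever a third paper's reference list contains both. A minimal counting sketch (the reference names are invented):

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(reference_lists):
    """Count how often each pair of documents is cited together.

    Two documents are co-cited whenever one citing paper's reference
    list contains both -- the raw relation behind co-citation maps.
    """
    pairs = Counter()
    for refs in reference_lists:
        for a, b in combinations(sorted(set(refs)), 2):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical citing papers and their reference lists.
corpus = [
    ["MapReduce", "Bigtable", "Dynamo"],
    ["MapReduce", "Bigtable"],
    ["Dynamo", "Bigtable"],
]
counts = cocitation_counts(corpus)  # e.g. (Bigtable, MapReduce) -> 2
```

Tools like CiteSpace then cluster and visualize this pair-count matrix over time; the counting itself is no more than the loop above.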

  18. UTV Expansion Pack: Special-Purpose Rank-Revealing Algorithms

    DEFF Research Database (Denmark)

    Fierro, Ricardo D.; Hansen, Per Christian

    2005-01-01

    This collection of Matlab 7.0 software supplements and complements the package UTV Tools from 1999, and includes implementations of special-purpose rank-revealing algorithms developed since the publication of the original package. We provide algorithms for computing and modifying symmetric rank-revealing VSV decompositions, we expand the algorithms for the ULLV decomposition of a matrix pair to handle interference-type problems with a rank-deficient covariance matrix, and we provide a robust and reliable Lanczos algorithm which - despite its simplicity - is able to capture all the dominant singular values of a sparse or structured matrix. These new algorithms have applications in signal processing, optimization and LSI information retrieval.
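The core idea of a rank-revealing decomposition - exposing the numerical rank of a matrix - can be illustrated with a toy column-pivoted Gram-Schmidt procedure. This is only a sketch of the principle, not the package's VSV/ULLV or Lanczos algorithms:

```python
import math

def numerical_rank(A, tol=1e-10):
    """Estimate numerical rank by column-pivoted Gram-Schmidt.

    At each step, pick the remaining column with the largest residual
    norm and deflate the rest; stop when that norm falls below `tol`.
    The number of steps taken is the numerical rank -- the quantity
    that rank-revealing decompositions expose rigorously.
    """
    m = len(A)
    cols = [[A[i][j] for i in range(m)] for j in range(len(A[0]))]
    rank = 0
    while cols:
        norms = [math.sqrt(sum(x * x for x in c)) for c in cols]
        p = max(range(len(cols)), key=norms.__getitem__)
        if norms[p] < tol:
            break
        q = [x / norms[p] for x in cols.pop(p)]
        rank += 1
        deflated = []
        for c in cols:  # remove the q-component from every remaining column
            dot = sum(a * b for a, b in zip(c, q))
            deflated.append([ci - dot * qi for ci, qi in zip(c, q)])
        cols = deflated
    return rank

print(numerical_rank([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0],
                      [0.0, 0.0, 0.0]]))  # 2 (third column = col1 + col2)
```

Production rank-revealing codes replace this with pivoted QR/UTV factorizations that are numerically far more robust, but the stopping criterion plays the same role.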

  19. The effect of Vaccinium uliginosum extract on tablet computer-induced asthenopia: randomized placebo-controlled study.

    Science.gov (United States)

    Park, Choul Yong; Gu, Namyi; Lim, Chi-Yeon; Oh, Jong-Hyun; Chang, Minwook; Kim, Martha; Rhee, Moo-Yong

    2016-08-18

    To investigate the alleviating effect of Vaccinium uliginosum extract (DA9301) on tablet computer-induced asthenopia. This was a randomized, placebo-controlled, double-blind and parallel study (trial registration number: 2013-95). A total of 60 volunteers were randomized into DA9301 (n = 30) and control (n = 30) groups. The DA9301 group received DA9301 oral pills (1000 mg/day) for 4 weeks and the control group received placebo. Asthenopia was evaluated by administering a questionnaire containing 10 questions (responses scored on a scale of 0-6; total score: 60) regarding ocular symptoms before (baseline) and 4 weeks after receiving the pills (DA9301 or placebo). The participants completed the questionnaire before and after tablet computer (iPad Air, Apple Inc.) watching at each visit. The change in total asthenopia score (TAS) was calculated and compared between the groups. TAS increased significantly after tablet computer watching at baseline in the DA9301 group (from 20.35 to 23.88; p = 0.031). However, after receiving DA9301 for 4 weeks, TAS remained stable after tablet computer watching. In the control group, TAS changes induced by tablet computer watching were not significant either at baseline or at 4 weeks after receiving placebo. Further analysis revealed that the scores for "tired eyes" (p = 0.001), "sore/aching eyes" (p = 0.038), "irritated eyes" (p = 0.010), "watery eyes" (p = 0.005), "dry eyes" (p = 0.003), "eye strain" (p = 0.006), "blurred vision" (p = 0.034), and "visual discomfort" (p = 0.018) improved significantly in the DA9301 group. We found that oral intake of DA9301 (1000 mg/day for 4 weeks) was effective in alleviating asthenopia symptoms induced by tablet computer watching. The study is registered at www.clinicaltrials.gov (registration number: NCT02641470; date of registration: December 30, 2015).
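The TAS computation itself is a simple sum. A sketch with hypothetical questionnaire responses (each of the 10 items is scored 0-6, so the total ranges from 0 to 60, as in the study):

```python
def total_asthenopia_score(item_scores):
    """Sum of 10 questionnaire items, each scored 0-6 (maximum 60)."""
    assert len(item_scores) == 10
    assert all(0 <= s <= 6 for s in item_scores)
    return sum(item_scores)

# Hypothetical participant: responses before vs. after tablet watching.
pre  = [2, 2, 3, 1, 2, 2, 3, 2, 2, 1]   # TAS = 20
post = [3, 2, 3, 2, 3, 2, 3, 2, 3, 1]   # TAS = 24
change = total_asthenopia_score(post) - total_asthenopia_score(pre)
```

The study's endpoint is exactly this per-visit change in TAS, compared between the DA9301 and placebo groups.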

  20. Understanding organometallic reaction mechanisms and catalysis: experimental and computational tools

    CERN Document Server

    Ananikov, Valentin P

    2014-01-01

    The goal of this book is to explore and highlight the new horizons in the study of reaction mechanisms that are opened by the joint application of experimental work and theoretical calculations. The latest insights and developments in the mechanistic studies of organometallic reactions and catalytic processes are presented and reviewed. The book adopts a unique approach, exemplifying how to use experiments, spectroscopy measurements, and computational methods to reveal reaction pathways and molecular structures of catalysts, rather than concentrating solely on one discipline. The result is a deeper ...

  1. Prevalence of computer vision syndrome in Erbil

    Directory of Open Access Journals (Sweden)

    Dler Jalal Ahmed

    2018-04-01

    Full Text Available Background and objective: Nearly all colleges, universities and homes today regularly use video display terminals such as computers, iPads, mobile phones, and TVs. Very little research has been carried out on Kurdish users to reveal the effect of video display terminals on the eye and vision. This study aimed to evaluate the prevalence of computer vision syndrome among computer users. Methods: A hospital-based cross-sectional study was conducted in the Ophthalmology Department of Rizgary and Erbil teaching hospitals in Erbil city. Those who had used computers in the months preceding the date of this study were included. Results: Among 173 participants aged between 8 and 48 years (mean age 23.28±6.6 years), the prevalence of computer vision syndrome was found to be 89.65%. The most disturbing symptom was eye irritation (79.8%), followed by blurred vision (75.7%). Participants who were using visual display terminals for more than six hours per day were at higher risk of developing nearly all symptoms of computer vision syndrome. A significant correlation was found between time spent on the computer and symptoms such as headache (P <0.001), redness (P <0.001), eye irritation (P <0.001), blurred vision (P <0.001) and neck pain (P <0.001). Conclusion: The present study demonstrates that more than three-fourths of the participants had at least one of the symptoms of computer vision syndrome while working on visual display terminals. Keywords: Computer vision syndrome; Headache; Neck pain; Blurred vision.

  2. Computed tomography in the evaluation of acquired stenosis in the neonate

    International Nuclear Information System (INIS)

    Faw, K.; Muntz, H.; Siegel, M.; Spector, G.

    1982-01-01

    We studied the feasibility of computed tomographic evaluation of the neonatal airway. Three neonatal larynges, removed at necropsy, were examined by computed tomography. Good resolution of soft tissue, cartilage and airway lumen was obtained in these small specimens. On the basis of these findings, two neonates with acquired subglottic stenosis were examined by endoscopy, soft-tissue airway radiographs, and computed tomography. Measurements of radiation dose revealed that a computed tomographic study delivered 36% of the mean tissue dose of standard image-intensifier fluoroscopy. Computed tomography and fluoroscopy both demonstrated the degree and length of the stenosis accurately. An advantage of CT over conventional imaging procedures was better definition of the cross-sectional area of the airway.

  3. A novel quantum scheme for secure two-party distance computation

    Science.gov (United States)

    Peng, Zhen-wan; Shi, Run-hua; Zhong, Hong; Cui, Jie; Zhang, Shun

    2017-12-01

    Secure multiparty computational geometry is an essential field of secure multiparty computation, which solves computational geometry problems without revealing any private information of any party. Secure two-party distance computation is a primitive of secure multiparty computational geometry: it computes the distance between two points without revealing either point's location (i.e., its coordinates). Secure two-party distance computation has potential applications with high security requirements in military, business, and engineering settings. In this paper, we present a quantum solution to secure two-party distance computation by subtly using quantum private query. Compared to related classical protocols, our quantum protocol can ensure higher security and better privacy protection because of the physical principles of quantum mechanics.

  4. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    Science.gov (United States)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    Quantitative costs for various Spacelab flight hardware configurations and software development options are provided. A cost analysis of Spacelab computer hardware and software is presented, based on utilization of a central experiment computer with optional auxiliary equipment. The ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  5. Algebraic computing program for studying the gauge theory

    International Nuclear Information System (INIS)

    Zet, G.

    2005-01-01

    An algebraic computing program running on the Maple V platform is presented. The program is devoted to the study of gauge theory with an internal Lie group as local symmetry. The physical quantities (gauge potentials, strength tensors, dual tensors, etc.) are introduced either as equations in terms of previously defined quantities (tensors), or by manual entry of the component values. The components of the strength tensor and of its dual are obtained with respect to a given space-time metric used for describing the gauge theory. We choose a Minkowski space-time endowed with spherical symmetry and give some examples of algebraic computations that are adequate for studying electroweak or gravitational interactions. The field equations are also obtained and their solutions are determined using the DEtools facilities of the Maple V computing program. (author)

  6. A Comparative Study of Cloud Computing Services

    OpenAIRE

    Afdhal, Afdhal

    2013-01-01

    In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing cost, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud d...

  7. Preparation, characterization, drug release and computational modelling studies of antibiotics loaded amorphous chitin nanoparticles.

    Science.gov (United States)

    Gayathri, N K; Aparna, V; Maya, S; Biswas, Raja; Jayakumar, R; Mohan, C Gopi

    2017-12-01

    We present a computational investigation of the binding affinity of different types of drugs with chitin nanocarriers. Understanding the chitin polymer-drug interaction is important to design and optimize chitin-based drug delivery systems. The binding affinities of three different anti-bacterial drugs, Ethionamide (ETA), Methacycline (MET) and Rifampicin (RIF), with amorphous chitin nanoparticles (AC-NPs) were studied by integrating computational and experimental techniques. The binding energies (BE) of hydrophobic ETA, hydrophilic MET and hydrophobic RIF with respect to AC-NPs were -7.3 kcal/mol, -5.1 kcal/mol and -8.1 kcal/mol, respectively, from molecular docking studies. This theoretical result was in good correlation with the experimental drug loading and drug entrapment efficiencies on AC-NPs of MET (3.5±0.1 and 25±2%), ETA (5.6±0.02 and 45±4%) and RIF (8.9±0.20 and 53±5%), respectively. Stability studies of the drug-encapsulated nanoparticles showed stable values of size, zeta potential and polydispersity index at 6°C. The correlation between computational BE and the experimental drug entrapment efficiencies of RIF, ETA and MET with four AC-NP strands was 0.999, while that for the drug loading efficiencies was 0.854. Further, the molecular docking results predict the atomic-level details derived from the electrostatic, hydrogen bonding and hydrophobic interactions of the drug and nanoparticle for its encapsulation and loading in the chitin-based host-guest nanosystems. The present results thus reveal insights into drug loading and drug delivery and have the potential to reduce the time and cost of optimization, development and discovery of new antibiotic drug delivery nanosystems. Copyright © 2017 Elsevier Ltd. All rights reserved.
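As a rough sanity check, the near-perfect correlation quoted above can be recomputed from the three binding energies and entrapment efficiencies given in this abstract (a sketch only; the paper correlates against four AC-NP strands, which this simplification ignores):

```python
# Pearson correlation between docking binding energies (kcal/mol)
# and experimental drug entrapment efficiencies (%), using the
# values quoted in the abstract above.
be = {"ETA": -7.3, "MET": -5.1, "RIF": -8.1}          # binding energies
entrapment = {"ETA": 45.0, "MET": 25.0, "RIF": 53.0}  # entrapment (%)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

drugs = ["ETA", "MET", "RIF"]
r = pearson([be[d] for d in drugs], [entrapment[d] for d in drugs])
# More negative binding energy tracks higher entrapment, so r is
# strongly negative, with |r| close to the 0.999 reported above.
```

Note the sign: stronger (more negative) binding goes with higher entrapment, so the raw coefficient is negative; the abstract reports its magnitude.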

  8. Does computer use affect the incidence of distal arm pain? A one-year prospective study using objective measures of computer use

    DEFF Research Database (Denmark)

    Mikkelsen, S.; Lassen, C. F.; Vilstrup, Imogen

    2012-01-01

    PURPOSE: To study how objectively recorded mouse and keyboard activity affects distal arm pain among computer workers. METHODS: Computer activities were recorded among 2,146 computer workers. For 52 weeks, mouse and keyboard time, sustained activity, speed and micropauses were recorded with a software program installed on the participants' computers. Participants reported weekly pain scores via the software program for elbow, forearm and wrist/hand, as well as in a questionnaire at baseline and 1-year follow-up. Associations between pain development and computer work were examined for three pain ... were not risk factors for acute pain, nor did they modify the effects of mouse or keyboard time. Computer usage parameters were not associated with prolonged or chronic pain. A major limitation of the study was low keyboard times. CONCLUSION: Computer work was not related to the development ...

  9. Enhanced delegated computing using coherence

    Science.gov (United States)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.
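The client's only operation in the scheme above is XOR; the classical intuition for why that suffices to hide data is the one-time pad. A toy sketch of just the hiding step (not the quantum gate evaluation performed in the experiment):

```python
import secrets

# One-time-pad hiding with XOR only: the server sees uniformly random
# bits, yet the client (who keeps the pad) can always recover the data.
# Illustrative of the classical XOR intuition, not the optical protocol.
def mask(bits, pad):
    return [b ^ p for b, p in zip(bits, pad)]

data = [1, 0, 1, 1]                        # client's private bits
pad = [secrets.randbits(1) for _ in data]  # random pad, kept by client
sent = mask(data, pad)                     # what the server would see
recovered = mask(sent, pad)                # client unmasks with same pad
```

Applying the same pad twice cancels it (x ^ p ^ p == x), so `recovered` always equals `data` while `sent` carries no information without the pad.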

  10. Computational study of a High Pressure Turbine Nozzle/Blade Interaction

    Science.gov (United States)

    Kopriva, James; Laskowski, Gregory; Sheikhi, Reza

    2015-11-01

    A downstream high-pressure turbine blade has been designed for this study to be coupled with the upstream uncooled nozzle of Arts and Rouvroit [1992]. The computational domain is first restricted to a pitch-line section that includes no centrifugal forces (linear sliding mesh). The stage geometry is intended to study the fundamental nozzle/blade interaction in a computationally cost-efficient manner. A blade/nozzle count of 2:1 is designed to maintain computational periodic boundary conditions for the coupled problem. Next, the geometry is extended to a fully 3D domain with endwalls to understand the impact of secondary flow structures. A set of systematic computational studies is presented to understand the impact of turbulence on the nozzle and downstream blade boundary layer development, the resulting heat transfer, and downstream wake mixing in the absence of cooling. Doing so provides a much better understanding of stage mixing losses and wall heat transfer, which, in turn, can allow for improved engine performance. Computational studies are performed in Fluent using the WALE (Wall-Adapting Local Eddy-viscosity), IDDES (Improved Delayed Detached Eddy Simulation), and SST (Shear Stress Transport) models.

  11. Does computer use affect the incidence of distal arm pain? A one-year prospective study using objective measures of computer use

    DEFF Research Database (Denmark)

    Mikkelsen, Sigurd; Lassen, Christina Funch; Vilstrup, Imogen

    2012-01-01

    PURPOSE: To study how objectively recorded mouse and keyboard activity affects distal arm pain among computer workers. METHODS: Computer activities were recorded among 2,146 computer workers. For 52 weeks, mouse and keyboard time, sustained activity, speed and micropauses were recorded with a software program installed on the participants' computers. Participants reported weekly pain scores via the software program for elbow, forearm and wrist/hand, as well as in a questionnaire at baseline and 1-year follow-up. Associations between pain development and computer work were examined for three pain ... were not risk factors for acute pain, nor did they modify the effects of mouse or keyboard time. Computer usage parameters were not associated with prolonged or chronic pain. A major limitation of the study was low keyboard times. CONCLUSION: Computer work was not related to the development ...

  12. A Comparison of the Learning Outcomes of Traditional Lecturing with that of Computer-Based Learning in two Optometry Courses

    Directory of Open Access Journals (Sweden)

    H Kangari

    2009-07-01

    Full Text Available Background and purpose: The literature on distance education has provided different reports about the effectiveness of traditional lecture-based settings versus computer-based study settings. This study is an attempt to compare the learning outcomes of traditional lecture-based teaching with those of computer-based learning in the optometry curriculum. Methods: Two courses in the optometry curriculum, Optometry I, with 24 students, and Optometry II, with 27 students, were used in this study. In each course, the students were randomly divided into two groups. In each scheduled class session, one group randomly attended the lecture, while the other studied at the computer stations. The same content was presented to both groups, and at the end of each session the same quiz was given to both. In the next session, the groups switched places. This process continued for four weeks. The quizzes were scored and a paired t-test was used to examine any difference. The data were analyzed with SPSS 15 software. Results: The mean score for the Optometry I lecture setting was 3.36±0.59 and for the Optometry I computer-based setting 3.27±0.63; for the Optometry II lecture setting it was 3.22±0.57 and for the Optometry II computer-based setting 2.85±0.69. The paired-sample t-test performed on the scores revealed no statistically significant difference between the two settings, although the mean scores were slightly higher in the lecture settings. Conclusion: Since this study reveals that the learning outcomes of traditional lecture-based settings and computer-based study are not significantly different, the lecture sessions can be safely replaced by computer-based study sessions. Further practice in the computer-based setting might reveal better outcomes. Key words: LECTURING, COMPUTER BASED LEARNING, DISTANCE EDUCATION
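The analysis above is a paired t-test on matched quiz scores; a minimal hand-rolled version on made-up scores (illustrative numbers, not the study's data) looks like:

```python
import math

# Paired t-test: each student has a lecture score and a computer-based
# score for the same content, so we test whether the mean of the
# per-student differences is zero. Scores below are invented.
lecture  = [3.5, 3.0, 3.8, 2.9, 3.4, 3.1]
computer = [3.3, 3.1, 3.6, 2.8, 3.5, 3.0]

diffs = [a - b for a, b in zip(lecture, computer)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
t_stat = mean_d / math.sqrt(var_d / n)  # compare to t distribution, df = n-1
```

With df = n − 1 = 5, |t| must exceed about 2.571 for significance at the 5% level; small mean differences like these fall well short, mirroring the study's null result.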

  13. Revealing fatigue damage evolution in unidirectional composites for wind turbine blades using x-ray computed tomography

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    ... Thereby, it will be possible to lower the cost of energy for wind-energy-based electricity. In the presented work, a lab-source x-ray computed tomography system (Zeiss Xradia 520 Versa) has been used in connection with ex-situ fatigue testing of uni-directional composites in order to identify fibre ... (compared to other comparable x-ray studies) have been used in order to ensure a representative test volume during the ex-situ fatigue testing. Using the ability of x-ray computed tomography to zoom into regions of interest non-destructively, the fatigue damage evolution in a repeatedly ex-situ fatigue-loaded test ... improving the fatigue resistance of non-crimp fabric used in the wind turbine industry can be made.

  14. Mechanical influences on morphogenesis of the knee joint revealed through morphological, molecular and computational analysis of immobilised embryos.

    Directory of Open Access Journals (Sweden)

    Karen A Roddy

    2011-02-01

    Full Text Available Very little is known about the regulation of morphogenesis in synovial joints. Mechanical forces generated from muscle contractions are required for normal development of several aspects of normal skeletogenesis. Here we show that biophysical stimuli generated by muscle contractions impact multiple events during chick knee joint morphogenesis influencing differential growth of the skeletal rudiment epiphyses and patterning of the emerging tissues in the joint interzone. Immobilisation of chick embryos was achieved through treatment with the neuromuscular blocking agent Decamethonium Bromide. The effects on development of the knee joint were examined using a combination of computational modelling to predict alterations in biophysical stimuli, detailed morphometric analysis of 3D digital representations, cell proliferation assays and in situ hybridisation to examine the expression of a selected panel of genes known to regulate joint development. This work revealed the precise changes to shape, particularly in the distal femur, that occur in an altered mechanical environment, corresponding to predicted changes in the spatial and dynamic patterns of mechanical stimuli and region specific changes in cell proliferation rates. In addition, we show altered patterning of the emerging tissues of the joint interzone with the loss of clearly defined and organised cell territories revealed by loss of characteristic interzone gene expression and abnormal expression of cartilage markers. This work shows that local dynamic patterns of biophysical stimuli generated from muscle contractions in the embryo act as a source of positional information guiding patterning and morphogenesis of the developing knee joint.

  15. Mechanical Influences on Morphogenesis of the Knee Joint Revealed through Morphological, Molecular and Computational Analysis of Immobilised Embryos

    Science.gov (United States)

    Roddy, Karen A.; Prendergast, Patrick J.; Murphy, Paula

    2011-01-01

    Very little is known about the regulation of morphogenesis in synovial joints. Mechanical forces generated from muscle contractions are required for normal development of several aspects of normal skeletogenesis. Here we show that biophysical stimuli generated by muscle contractions impact multiple events during chick knee joint morphogenesis influencing differential growth of the skeletal rudiment epiphyses and patterning of the emerging tissues in the joint interzone. Immobilisation of chick embryos was achieved through treatment with the neuromuscular blocking agent Decamethonium Bromide. The effects on development of the knee joint were examined using a combination of computational modelling to predict alterations in biophysical stimuli, detailed morphometric analysis of 3D digital representations, cell proliferation assays and in situ hybridisation to examine the expression of a selected panel of genes known to regulate joint development. This work revealed the precise changes to shape, particularly in the distal femur, that occur in an altered mechanical environment, corresponding to predicted changes in the spatial and dynamic patterns of mechanical stimuli and region specific changes in cell proliferation rates. In addition, we show altered patterning of the emerging tissues of the joint interzone with the loss of clearly defined and organised cell territories revealed by loss of characteristic interzone gene expression and abnormal expression of cartilage markers. This work shows that local dynamic patterns of biophysical stimuli generated from muscle contractions in the embryo act as a source of positional information guiding patterning and morphogenesis of the developing knee joint. PMID:21386908

  16. NASA Computational Case Study: The Flight of Friendship 7

    Science.gov (United States)

    Simpson, David G.

    2012-01-01

    In this case study, we learn how to compute the position of an Earth-orbiting spacecraft as a function of time. As an exercise, we compute the position of John Glenn's Mercury spacecraft Friendship 7 as it orbited the Earth during the third flight of NASA's Mercury program.
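The case-study exercise can be sketched with a circular-orbit approximation (an assumption for illustration; Friendship 7's actual orbit was mildly elliptical):

```python
import math

# Position of a spacecraft in a circular Earth orbit as a function of
# time, in the orbital plane. Circularity is a simplification;
# Friendship 7's orbit was mildly elliptical (~160 x 260 km altitude).
MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6        # mean Earth radius, m

def orbit_position(altitude_m, t_s):
    """In-plane (x, y) position at time t for a circular orbit."""
    r = R_EARTH + altitude_m
    omega = math.sqrt(MU / r**3)   # mean motion, rad/s
    theta = omega * t_s            # angle swept since t = 0
    return r * math.cos(theta), r * math.sin(theta)

def orbital_period(altitude_m):
    r = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(r**3 / MU)

# At ~230 km mean altitude the period is roughly 89 minutes, close to
# Friendship 7's reported ~88.5-minute orbit.
period_min = orbital_period(230e3) / 60
```

The full case study would add the orbital elements and Earth's rotation to get ground-track positions, but the two-body relations above are the core of the computation.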

  17. Acting without seeing: eye movements reveal visual processing without awareness.

    Science.gov (United States)

    Spering, Miriam; Carrasco, Marisa

    2015-04-01

    Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. Here, we review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movement. Such dissociations reveal situations in which eye movements are sensitive to particular visual features that fail to modulate perceptual reports. We also discuss neurophysiological, neuroimaging, and clinical studies supporting the role of subcortical pathways for visual processing without awareness. Our review links awareness to perceptual-eye movement dissociations and furthers our understanding of the brain pathways underlying vision and movement with and without awareness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Using NCLab-karel to improve computational thinking skill of junior high school students

    Science.gov (United States)

    Kusnendar, J.; Prabawa, H. W.

    2018-05-01

    Increasing human interaction with technology and the increasingly complex development of the digital world make computer science education an interesting theme to study. Previous studies on computer literacy and competency reveal that Indonesian teachers in general have fairly high computational skill, but their use of that skill is limited to a few applications. This results in limited and minimal computer-related learning for students. On the other hand, computer science education is often considered unrelated to real-world solutions. This paper attempts to address the utilization of NCLab-Karel in shaping computational thinking in students; such computational thinking is believed to help students learn about technology. Implementation of Karel provides evidence that it is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computing mindset in students.
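NCLab-Karel itself is a web environment, but the flavor of its stepwise, algorithmic tasks can be sketched in a few lines of Python (a hypothetical mini-Karel, not NCLab's actual API):

```python
# A tiny Karel-style robot on a grid, in the spirit of Karel-based
# teaching tools (hypothetical sketch; not NCLab's actual API).
DIRS = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # east, north, west, south

class Karel:
    def __init__(self):
        self.x = self.y = 0
        self.facing = 0            # index into DIRS, starts facing east

    def move(self):
        dx, dy = DIRS[self.facing]
        self.x, self.y = self.x + dx, self.y + dy

    def turn_left(self):
        self.facing = (self.facing + 1) % 4

# Walk three cells east, turn, and step north: the kind of decomposed,
# sequential task used to exercise computational thinking.
k = Karel()
for _ in range(3):
    k.move()
k.turn_left()
k.move()
```

Students reason about state (position, heading) and sequencing, which is the computational-thinking skill the study targets.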

  19. A new computational method for studies of 3-D dislocation-precipitate interactions in reactor steels

    International Nuclear Information System (INIS)

    Takahashi, A.; Ghoniem, N.M.

    2008-01-01

    To enable computational design of advanced steels for reactor pressure vessels and core structural components, we present a new computational method for studies of the interaction between dislocations and precipitates. The method is based on three-dimensional parametric dislocation dynamics, Eshelby's inclusion and inhomogeneity solutions, and boundary and volume element numerical models. Results from this new method compare successfully with recent molecular dynamics (MD) simulations, showing good agreement with atomistic results. The method is first applied to the investigation of the critical shear stress (CSS) of precipitates sheared by successive dislocation cuttings. The simulations reveal that the CSS is reduced when dislocations cut precipitates, and that it can be as low as half the original value for a completely sheared precipitate. The influence of precipitate geometry and the ratio of precipitate-to-matrix elastic shear modulus on the CSS is presented, and the dependence of the interaction stress between dislocations and precipitates on their relative geometry is discussed. Finally, an extension of the method to incorporate the dislocation core contribution to the CSS is highlighted. (author)

  20. Computer mapping as an aid in air-pollution studies: Montreal region study

    Energy Technology Data Exchange (ETDEWEB)

    Granger, J M

    1972-01-01

    Through the use of computer-mapping programs, an operational technique has been designed which allows an almost-instant appraisal of the intensity of atmospheric pollution in an urban region on the basis of epiphytic sensitivity. The epiphytes considered are essentially lichens and mosses growing on trees. This study was applied to the Montreal region, with 349 sampling stations distributed nearly uniformly. Computer graphics of the findings are included in the appendix.

  1. Study on GPU Computing for SCOPE2 with CUDA

    International Nuclear Information System (INIS)

    Kodama, Yasuhiro; Tatsumi, Masahiro; Ohoka, Yasunori

    2011-01-01

    For improving the safety and cost effectiveness of nuclear power plants, a core calculation code, SCOPE2, has been developed, which adopts detailed calculation models such as the multi-group nodal SP3 transport method in three-dimensional pin-by-pin geometry to achieve high predictability. However, it is difficult to apply the code to loading pattern optimizations, since it requires much longer computation time than codes based on the nodal diffusion method that is widely used in core design calculations. In this study, we investigated the possibility of accelerating SCOPE2 with GPU computing, which has been recognized as one of the most promising directions of high performance computing. In a previous study with an experimental programming framework, converting the algorithms into forms suited to GPU computation proved tremendously difficult because of the complexity of the algorithms and restrictions in implementation. In this study, to overcome this complexity, we utilized the CUDA programming environment provided by NVIDIA, a versatile and flexible extension of the C/C++ languages. It was confirmed that we could obtain high performance without degradation of maintainability through test implementation of GPU kernels for neutron diffusion/simplified P3 equation solvers. (author)
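The solvers mentioned above are dominated by regular stencil updates, which is what makes them amenable to GPU kernels: each cell's new value reads only its neighbours, so every cell can be updated in its own thread. A toy one-group, 1-D diffusion Jacobi sweep in plain Python (illustrative of that structure only; not SCOPE2's pin-by-pin SP3 scheme):

```python
# Toy one-group, 1-D neutron diffusion solve by Jacobi iteration:
# -D phi'' + sigma_a * phi = source, zero-flux boundaries.
# Each interior update depends only on the previous sweep's values,
# so all cells can be updated in parallel (one GPU thread per cell).
D, sigma_a, source = 1.0, 0.5, 1.0   # diffusion coeff, absorption, source
n, h = 50, 0.1                       # interior cells, mesh pitch

phi = [0.0] * (n + 2)                # flux, boundaries held at zero
for sweep in range(5000):
    new = phi[:]
    for i in range(1, n + 1):
        # Discretised balance for cell i, solved for phi_i
        new[i] = (D / h**2 * (phi[i - 1] + phi[i + 1]) + source) / (
            2 * D / h**2 + sigma_a)
    if max(abs(a - b) for a, b in zip(new, phi)) < 1e-8:
        phi = new
        break
    phi = new
```

In a CUDA version the inner loop body becomes the kernel; the surrounding sweep loop and convergence test stay on the host.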

  2. Ambient belonging: how stereotypical cues impact gender participation in computer science.

    Science.gov (United States)

    Cheryan, Sapna; Plaut, Victoria C; Davies, Paul G; Steele, Claude M

    2009-12-01

    People can make decisions to join a group based solely on exposure to that group's physical environment. Four studies demonstrate that the gender difference in interest in computer science is influenced by exposure to environments associated with computer scientists. In Study 1, simply changing the objects in a computer science classroom from those considered stereotypical of computer science (e.g., Star Trek poster, video games) to objects not considered stereotypical of computer science (e.g., nature poster, phone books) was sufficient to boost female undergraduates' interest in computer science to the level of their male peers. Further investigation revealed that the stereotypical objects broadcast a masculine stereotype that discouraged women's sense of ambient belonging and subsequent interest in the environment (Studies 2, 3, and 4) but had no similar effect on men (Studies 3, 4). This masculine stereotype prevented women's interest from developing even in environments entirely populated by other women (Study 2). Objects can thus come to broadcast stereotypes of a group, which in turn can deter people who do not identify with these stereotypes from joining that group.

  3. Application of Cloud Computing Technology in Universities. Case Study: Faculty of Information Technology, UKDW

    OpenAIRE

    Kurniawan, Erick

    2015-01-01

    Cloud computing technology is a new paradigm in the delivery of computing services. Cloud computing has many advantages over conventional systems. This article discusses cloud computing architecture in general, along with several examples of cloud computing services and their benefits in a university environment. The case study considered is the application of cloud computing services in the Faculty of Information Technology at UKDW.

  4. Landmark Study Reveals Antarctic Glacier's Long History of Retreat

    OpenAIRE

    Kuska, Dale M.

    2016-01-01

    Faculty Showcase Archive article. Approved for public release; distribution is unlimited. A major study, released in late November in the journal “Nature,” reveals the history of retreat of the massive Pine Island Glacier (PIG) in western Antarctica, widely considered one of the largest contributors to global sea-level rise.

  5. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications, in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
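The abstract does not spell out its algorithm, but marker-based snapshots in the style of Chandy and Lamport illustrate the core mechanism such protocols build on: each node records its own state when it first sees a marker, forwards the marker, and logs messages still in flight on its channels. A minimal two-node simulation (a hypothetical sketch, not the paper's protocol):

```python
from collections import deque

# Marker-based distributed snapshot over FIFO channels, simulated in
# one process with two nodes (Chandy-Lamport style sketch; the paper's
# own protocol may differ). Node state is an integer "work count".
MARKER = "MARKER"

class Node:
    def __init__(self, state):
        self.state = state
        self.snapshot = None       # recorded local state
        self.recording = False     # capturing in-flight messages?
        self.channel_log = []      # messages recorded as in flight

    def start_snapshot(self, out_channel):
        self.snapshot = self.state
        self.recording = True
        out_channel.append(MARKER)

    def receive(self, msg, out_channel):
        if msg == MARKER:
            if self.snapshot is None:    # first marker: record and forward
                self.snapshot = self.state
                out_channel.append(MARKER)
            self.recording = False       # incoming channel fully recorded
        else:
            self.state += msg
            if self.recording:
                self.channel_log.append(msg)

a, b = Node(10), Node(20)
ab, ba = deque(), deque()   # FIFO channels a->b and b->a

ab.append(5)                # a message in flight before the snapshot
a.start_snapshot(ab)        # a records 10, marker queued behind the 5
while ab:
    b.receive(ab.popleft(), ba)
while ba:
    a.receive(ba.popleft(), ab)
```

The invariant that makes the snapshot consistent: recorded node states plus recorded in-flight messages equal the true global total (here 10 + 20 + 5 = 35), even though no node ever pauses the system.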

  6. Mind the gap: an attempt to bridge computational and neuroscientific approaches to study creativity

    Science.gov (United States)

    Wiggins, Geraint A.; Bhattacharya, Joydeep

    2014-01-01

    Creativity is the hallmark of human cognition and is behind every innovation, scientific discovery, piece of music, artwork, and idea that has shaped our lives, from ancient times until today. Yet scientific understanding of creative processes is quite limited, mostly due to the traditional belief that considers creativity a mysterious puzzle, a paradox, defying empirical enquiry. Recently, there has been increasing interest in revealing the neural correlates of human creativity. Though many of these studies, pioneering in nature, help demystify creativity, the field is still dominated by popular beliefs associating creativity with “right brain thinking”, “divergent thinking”, “altered states” and so on (Dietrich and Kanso, 2010). In this article, we discuss a computational framework for creativity based on Baars’ Global Workspace Theory (GWT; Baars, 1988) enhanced with mechanisms based on information theory. Next we propose a neurocognitive architecture of creativity with a strong focus on various facets (i.e., unconscious thought theory, mind wandering, spontaneous brain states) of un/pre-conscious brain responses. Our principal argument is that pre-conscious creativity happens prior to conscious creativity, and the proposed computational model may provide a mechanism by which this transition is managed. This integrative approach, albeit unconventional, will hopefully stimulate future neuroscientific studies of the inscrutable phenomenon of creativity. PMID:25104930

  7. Revisiting dibenzothiophene thermochemical data: Experimental and computational studies

    International Nuclear Information System (INIS)

    Freitas, Vera L.S.; Gomes, Jose R.B.; Ribeiro da Silva, Maria D.M.C.

    2009-01-01

    Thermochemical data of dibenzothiophene were studied in the present work by experimental techniques and computational calculations. The standard (p° = 0.1 MPa) molar enthalpy of formation, at T = 298.15 K, in the gaseous phase, was determined from the enthalpies of combustion and sublimation, obtained by rotating-bomb calorimetry in oxygen and by Calvet microcalorimetry, respectively. This value was compared with estimates from G3(MP2)//B3LYP computations and with other results available in the literature.

  8. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  9. An Experimental Study into the use of computers for teaching of ...

    African Journals Online (AJOL)

    This study was an experimental study which sought to establish how English language teachers used computers for teaching composition writing at Prince Edward High School in Harare. The findings of the study show that computers were rarely used in the teaching of composition despite the observation that the school ...

  10. Information Warfare: Issues Associated with the Defense of DOD Computers and Computer Networks

    National Research Council Canada - National Science Library

    Franklin, Derek

    2002-01-01

    ... that may threaten the critical information pathways of the armed forces. An analysis of the history of computer information warfare reveals that there was an embarrassing lack of readiness and defense...

  11. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    Science.gov (United States)

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies of the accuracy of volumetry software have been performed using phantoms with artificial nodules. Such phantom studies are limited, however, in their ability to reproduce nodules accurately and in the required variety of sizes and densities. We therefore propose a new approach using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement between computer-simulated and phantom nodules with regard to the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software proved accurate to within 20% for nodules >5 mm whose density differed from the background (lung) CT value by 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We conclude that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
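
    The paper measures the scanner's actual point spread function; as an illustrative stand-in, the sketch below blurs an ideal spherical nodule with an isotropic Gaussian PSF and measures its volume by half-maximum thresholding. Grid size, radius, and sigma are arbitrary choices, not values from the study:

    ```python
    import numpy as np

    def gaussian_blur_fft(vol, sigma):
        """Isotropic Gaussian blur applied in the frequency domain."""
        freqs = np.meshgrid(*[np.fft.fftfreq(n) for n in vol.shape], indexing="ij")
        k2 = sum(f**2 for f in freqs)
        transfer = np.exp(-2.0 * (np.pi * sigma) ** 2 * k2)  # FT of a Gaussian
        return np.real(np.fft.ifftn(np.fft.fftn(vol) * transfer))

    # Ideal spherical "nodule" of radius 8 voxels on a 64^3 grid
    n, radius, sigma = 64, 8.0, 1.5
    ax = np.arange(n) - n / 2
    X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
    nodule = ((X**2 + Y**2 + Z**2) <= radius**2).astype(float)

    simulated = gaussian_blur_fft(nodule, sigma)     # PSF-blurred image
    measured_vol = int((simulated >= 0.5).sum())     # half-maximum thresholding
    true_vol = int(nodule.sum())
    rel_error = (measured_vol - true_vol) / true_vol
    ```

    For a sphere large relative to the PSF width, the half-max surface sits close to the true boundary, so the relative volume error stays small; simulated nodules can then be pasted into real scans, as the authors describe.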

  12. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  13. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students have met two major barriers that prevent them from being effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantages of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling the online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and

  14. Changes in bone macro- and microstructure in diabetic obese mice revealed by high resolution microfocus X-ray computed tomography

    Science.gov (United States)

    Kerckhofs, G.; Durand, M.; Vangoitsenhoven, R.; Marin, C.; van der Schueren, B.; Carmeliet, G.; Luyten, F. P.; Geris, L.; Vandamme, K.

    2016-10-01

    High resolution microfocus X-ray computed tomography (HR-microCT) was employed to characterize the structural alterations of the cortical and trabecular bone in a mouse model of obesity-driven type 2 diabetes (T2DM). C57Bl/6J mice were randomly assigned for 14 weeks to either a control diet-fed (CTRL) or a high fat diet (HFD)-fed group developing obesity, hyperglycaemia and insulin resistance. The HFD group showed an increased trabecular thickness and a decreased trabecular number compared to CTRL animals. Midshaft tibia intracortical porosity was assessed at two spatial image resolutions. At the 2 μm scale, no change was observed in the intracortical structure. At the 1 μm scale, a decrease in the cortical vascular porosity of the HFD bone was evidenced. The study of a group of 8-week-old animals, corresponding to animals at the start of the diet challenge, revealed that the decreased vascular porosity was T2DM-dependent and not related to the ageing process. Our results offer an unprecedented ultra-characterization of the T2DM-compromised skeletal micro-architecture and highlight a previously unreported T2DM-related decrease in the cortical vascular porosity, potentially affecting bone health and fragility. Additionally, they provide some insights into the technical challenges facing the assessment of rodent bone structure using HR-microCT imaging.

  16. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  17. Study Of Visual Disorders In Egyptian Computer Operators

    International Nuclear Information System (INIS)

    Al-Awadi, M.Y.; Awad Allah, H.; Hegazy, M. T.; Naguib, N.; Akmal, M.

    2012-01-01

    The aim of the study was to evaluate the probable effects of exposure to electromagnetic waves radiated from visual display terminals (VDTs) on some visual functions. 300 computer operators working in different institutes were selected randomly. They were asked to fill in a pre-tested questionnaire (written in Arabic) after giving their verbal consent. Among them, one hundred fifty operators exposed to visual display terminals were selected for the clinical study (group I). The control group included one hundred fifty age-matched participants working in fields without exposure to visual display terminals (group II). None of the chosen individuals suffered from apparent health problems or diseases that could affect their visual condition. All exposed candidates used LCD VDTs of 15-inch, 17-inch, and larger sizes. Data entry and analysis were done using SPSS version 17.0, applying appropriate statistical methods. The results showed that, among the 150 exposed subjects, there was a highly significant occurrence of dryness and a highly significant association between the occurrence of asthenopia and background variables (working hours using computers). Of the exposed subjects, 92% complained of tired eyes and eye strain, 37.33% of dry or sore eyes, 68% of headache, 68% of blurred distant vision, 45.33% of asthenopia, and 89.33% of neck, shoulder and back aches. In the control group, 18% complained of tired eyes, 21.33% of dry eyes and 12.67% of neck, shoulder and back aches. It could be concluded that the prevalence of computer vision syndrome was quite high among computer operators.

  18. Gender Differences in Computer Ethics among Business Administration Students

    Directory of Open Access Journals (Sweden)

    Ali ACILAR

    2010-12-01

    Full Text Available Because of the various benefits and advantages that computers and the Internet offer, these technologies have become an essential part of our daily life, and dependence on them has been increasing continuously and rapidly. Computers and the Internet have also become important instructional tools in academic environments. Although their pervasive use has many benefits for almost everyone, it has also increased the use of these technologies for illegal purposes or unethical activities such as spamming, making illegal copies of software, violations of privacy, hacking and computer viruses. The main purpose of this study is to explore gender differences in computer ethics among Business Administration students and examine their attitudes towards ethical use of computers. Results from 248 students in the Department of Business Administration at a public university in Turkey reveal that significant differences exist between male and female students’ attitudes towards ethical use of computers.

  19. Geoid-to-Quasigeoid Separation Computed Using the GRACE/GOCE Global Geopotential Model GOCO02S - A Case Study of Himalayas and Tibet

    Directory of Open Access Journals (Sweden)

    Mohammad Bagherbandi; Robert Tenzer

    2013-01-01

    Full Text Available The geoid-to-quasigeoid correction has traditionally been computed approximately as a function of the planar Bouguer gravity anomaly and the topographic height. Recent numerical studies based on newly developed theoretical models, however, indicate that computing this correction using the approximate formula yields large errors, especially in mountainous regions with computation points at high elevations. In this study we investigate these approximation errors over a study area comprising the Himalayas and Tibet, where this correction reaches its global maximum. Since GPS-leveling and terrestrial gravity datasets in this part of the world are not (freely) available, global gravitational models (GGMs) are used to compute this correction utilizing the expressions for a spherical harmonic analysis of the gravity field. The computation can be done using the GGM coefficients taken from the Earth Gravitational Model 2008 (EGM08) complete to degree 2160 of spherical harmonics. Recent studies based on a regional accuracy assessment of GGMs have shown that the combined GRACE/GOCE solutions provide a substantial improvement of the Earth's gravity field at medium wavelengths of spherical harmonics compared to EGM08. We address this aspect in the numerical analysis by comparing the gravity field quantities computed using the satellite-only combined GRACE/GOCE model GOCO02S against the EGM08 results. The numerical results reveal that errors in the geoid-to-quasigeoid correction computed using the approximate formula can reach as much as ~1.5 m. We also demonstrate that the expected improvement of the GOCO02S gravity field quantities at medium wavelengths (approximately between degrees 100 and 250) compared to EGM08 is as much as ±60 mGal and ±0.2 m in terms of gravity anomalies and geoid/quasigeoid heights, respectively.
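
    The classical approximation the authors test expresses the geoid-to-quasigeoid separation through the planar Bouguer anomaly and the height, N − ζ ≈ Δg_B·H/γ̄. A minimal sketch (the sign convention and the illustrative input values are assumptions, not figures from the paper):

    ```python
    def geoid_quasigeoid_separation(bouguer_anomaly_mgal, height_m, gamma=9.81):
        """Classical planar approximation: N - zeta ~= dg_B * H / gamma.

        bouguer_anomaly_mgal : planar Bouguer gravity anomaly [mGal]
        height_m             : topographic height of the point [m]
        gamma                : mean normal gravity [m/s^2]
        """
        dg = bouguer_anomaly_mgal * 1e-5   # 1 mGal = 1e-5 m/s^2
        return dg * height_m / gamma       # separation in metres

    # Illustrative Tibetan-plateau-like values: strongly negative Bouguer
    # anomaly at high elevation gives a separation of a couple of metres.
    sep = geoid_quasigeoid_separation(-500.0, 5000.0)
    ```

    With a separation of this magnitude, the ~1.5 m error the authors report for the approximate formula is clearly not negligible, which motivates the rigorous spherical-harmonic evaluation.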

  20. Student Study Choices in the Principles of Economics: A Case Study of Computer Usage

    OpenAIRE

    Grimes, Paul W.; Sanderson, Patricia L.; Ching, Geok H.

    1996-01-01

    Principles of Economics students at Mississippi State University were provided the opportunity to use computer assisted instruction (CAI) as a supplemental study activity. Students were free to choose the extent of their computer work. Throughout the course, weekly surveys were conducted to monitor the time each student spent with their textbook, computerized tutorials, workbook, class notes, and study groups. The surveys indicated that only a minority of the students actively pursued CAI....

  1. Revealing the programming process

    DEFF Research Database (Denmark)

    Bennedsen, Jens; Caspersen, Michael Edelgaard

    2005-01-01

    One of the most important goals of an introductory programming course is that the students learn a systematic approach to the development of computer programs. Revealing the programming process is an important part of this; however, textbooks do not address the issue -- probably because...... the textbook medium is static and therefore ill-suited to expose the process of programming. We have found that process recordings in the form of captured narrated programming sessions are a simple, cheap, and efficient way of providing the revelation.We identify seven different elements of the programming...

  2. Novel Polyurethane Matrix Systems Reveal a Particular Sustained Release Behavior Studied by Imaging and Computational Modeling.

    Science.gov (United States)

    Campiñez, María Dolores; Caraballo, Isidoro; Puchkov, Maxim; Kuentz, Martin

    2017-07-01

    The aim of the present work was to better understand the drug-release mechanism from sustained release matrices prepared with two new polyurethanes, using a novel in silico formulation tool based on 3-dimensional cellular automata. For this purpose, the two polymers and theophylline as model drug were used to prepare binary matrix tablets. Each formulation was simulated in silico, and its release behavior was compared to the experimental drug release profiles. Furthermore, the polymer distributions in the tablets were imaged by scanning electron microscopy (SEM) and the changes produced by the tortuosity were quantified and verified using experimental data. The obtained results showed that the polymers exhibited a surprisingly high ability to control drug release at low excipient concentrations (only 10% w/w of excipient controlled the release of drug for almost 8 h). The mesoscopic in silico model helped to reveal how the novel biopolymers control drug release. The mechanism was found to be a special geometrical arrangement of the excipient particles, which creates an almost continuous barrier surrounding the drug in a very effective way, comparable to lipid or waxy excipients but with the advantages of much higher compactability, stability, and absence of excipient polymorphism.
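
    The in silico tool itself is not given in the abstract; a toy 2-D cellular automaton in the same spirit, where insoluble excipient cells obstruct an inward-moving dissolution front, illustrates why even modest excipient fractions slow release. Grid size, update rule, and fractions are illustrative assumptions, not the paper's model:

    ```python
    import random

    DRUG, EXCIPIENT, VOID = 0, 1, 2

    def simulate_release(excipient_frac, n=40, steps=10, seed=0):
        """Fraction of drug released after `steps` dissolution iterations.

        A drug cell dissolves when it touches the tablet exterior or an
        already dissolved (void) cell; excipient cells never dissolve, so
        they act as barriers that increase the tortuosity of the front.
        """
        rng = random.Random(seed)
        grid = [[EXCIPIENT if rng.random() < excipient_frac else DRUG
                 for _ in range(n)] for _ in range(n)]
        total_drug = sum(row.count(DRUG) for row in grid)
        released = 0
        for _ in range(steps):
            dissolving = []
            for i in range(n):
                for j in range(n):
                    if grid[i][j] != DRUG:
                        continue
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ii, jj = i + di, j + dj
                        outside = not (0 <= ii < n and 0 <= jj < n)
                        if outside or grid[ii][jj] == VOID:
                            dissolving.append((i, j))
                            break
            for i, j in dissolving:     # synchronous update of the front
                grid[i][j] = VOID
                released += 1
        return released / total_drug

    fast = simulate_release(0.0)   # pure drug matrix: front moves freely
    slow = simulate_release(0.4)   # 40% insoluble excipient: tortuous front
    ```

    In the obstacle-free matrix the front advances one cell per step, while randomly placed excipient forces detours and isolates pockets of drug, so the released fraction at the same time point drops, which is the qualitative barrier effect the study describes.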

  3. Computed Tomography Study Of Complicated Bacterial Meningitis ...

    African Journals Online (AJOL)

    To monitor the structural intracranial complications of bacterial meningitis using computed tomography (CT) scans. A retrospective study of the medical and radiological records of patients who underwent CT scanning over a 4-year period. A University Teaching Hospital in a developing country. Thirty-three patients with clinically and ...

  4. Context-aware computing and self-managing systems

    CERN Document Server

    Dargie, Waltenegus

    2009-01-01

    Bringing together an extensively researched area with an emerging research issue, Context-Aware Computing and Self-Managing Systems presents the core contributions of context-aware computing in the development of self-managing systems, including devices, applications, middleware, and networks. The expert contributors reveal the usefulness of context-aware computing in developing autonomous systems that have practical application in the real world.The first chapter of the book identifies features that are common to both context-aware computing and autonomous computing. It offers a basic definit

  5. Brain-computer interfacing under distraction: an evaluation study

    DEFF Research Database (Denmark)

    Brandl, Stephanie; Frølich, Laura; Höhne, Johannes

    2016-01-01

    Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach...

  6. Computer-aided proofs for multiparty computation with active security

    DEFF Research Database (Denmark)

    Haagh, Helene; Karbyshev, Aleksandr; Oechsner, Sabine

    2018-01-01

    Secure multi-party computation (MPC) is a general cryptographic technique that allows distrusting parties to compute a function of their individual inputs, while only revealing the output of the function. It has found applications in areas such as auctioning, email filtering, and secure...... teleconference. Given its importance, it is crucial that the protocols are specified and implemented correctly. In the programming language community it has become good practice to use computer proof assistants to verify correctness proofs. In the field of cryptography, EasyCrypt is the state of the art proof...... public-key encryption, signatures, garbled circuits and differential privacy. Here we show for the first time that it can also be used to prove security of MPC against a malicious adversary. We formalize additive and replicated secret sharing schemes and apply them to Maurer's MPC protocol for secure...

  7. Tablet computers and forensic and correctional psychological assessment: A randomized controlled study.

    Science.gov (United States)

    King, Christopher M; Heilbrun, Kirk; Kim, Na Young; McWilliams, Kellie; Phillips, Sarah; Barbera, Jessie; Fretz, Ralph

    2017-10-01

    Mobile computing technology presents various possibilities and challenges for psychological assessment. Within forensic and correctional psychology, assessment of justice-involved persons facilitated by such technology has not been empirically examined. Accordingly, this randomized controlled experiment involved administering questionnaires about risk-needs, treatment readiness, and opinions of computerized technology to a large (N = 212) and diverse sample of individuals under custodial correctional supervision, using either a tablet computer or traditional paper-and-pencil materials. Results revealed that participants in the paper-and-pencil condition completed the packet of questionnaires faster but omitted items more frequently. Older participants and those with lower levels of education tended to take longer to complete the tablet-administered measures. The tablet format was rated as more usable irrespective of demographic and personal characteristics, and most participants across the 2 conditions indicated that they would prefer to use computerized technology to complete psychological testing. Administration format did not have a clear effect on attitudes toward correctional rehabilitation services. Noteworthy for researchers is the substantial time saved and the absence of practical problems in the tablet condition. Implications for practitioners include the general usability of the devices, their appeal to incarcerated persons, and the potential for tablets to facilitate clinical and administrative tasks with corrections clients. Considering the novel nature of this study, its promising results, and its limitations, future research in this area is warranted. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    Science.gov (United States)

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study of the acceptance of computer systems in physicians' practices. 11,000 returned questionnaires from physicians--users and nonusers--were analysed. We found that most physicians used their computers in a limited way, i.e., as a device for accounting. The level of utilisation differed between men and women, west and east, and young and old. The study also analysed the computer-use behaviour of gynaecologists. Two-thirds of all nonusers do not intend to use a computer in the future.

  9. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children.

    Science.gov (United States)

    Segev, Aviv; Mimouni-Bloch, Aviva; Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. In a cross-sectional study, 185 parents and children aged 3-18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23-8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07-2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99-1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by additional studies, future research should integrate

  10. Spectroscopic and computational studies of ionic clusters as models of solvation and atmospheric reactions

    Science.gov (United States)

    Kuwata, Keith T.

    Ionic clusters are useful as model systems for the study of fundamental processes in solution and in the atmosphere. Their structure and reactivity can be studied in detail using vibrational predissociation spectroscopy, in conjunction with high level ab initio calculations. This thesis presents the applications of infrared spectroscopy and computation to a variety of gas-phase cluster systems. A crucial component of the process of stratospheric ozone depletion is the action of polar stratospheric clouds (PSCs) in converting the reservoir species HCl and chlorine nitrate (ClONO2) to photochemically labile compounds. Quantum chemistry was used to explore one possible mechanism by which this activation is effected: Cl- + ClONO2 → Cl2 + NO3- (1). Correlated ab initio calculations predicted that the direct reaction of chloride ion with ClONO2 is facile, which was confirmed in an experimental kinetics study. In the reaction a weakly bound intermediate Cl2-NO3- is formed, with ~70% of the charge localized on the nitrate moiety. This enables the Cl2-NO3- cluster to be well solvated even in bulk solution, allowing (1) to be facile on PSCs. Quantum chemistry was also applied to the hydration of nitrosonium ion (NO+), an important process in the ionosphere. The calculations, in conjunction with an infrared spectroscopy experiment, revealed the structure of the gas-phase clusters NO+(H2O)n. The large degree of covalent interaction between NO+ and the lone pairs of the H2O ligands is contrasted with the weak electrostatic bonding between iodide ion and H2O. Finally, the competition between ion solvation and solvent self-association is explored for the gas-phase clusters Cl-(H2O)n and Cl-(NH3)n. For the case of water, vibrational predissociation spectroscopy reveals less hydrogen bonding among H2O ligands than predicted by ab initio calculations. Nevertheless, for n ≥ 5, cluster structure is dominated by water-water interactions, with Cl- only partially solvated by the

  11. Computer-Assisted Spanish-Composition Survey--1986.

    Science.gov (United States)

    Harvey, T. Edward

    1986-01-01

    A survey of high school and higher education teachers' (N=208) attitudes regarding the use of computers for Spanish-composition instruction revealed that: the lack of foreign-character support remains the major frustration; most teachers used Apple or IBM computers; and there was mixed opinion regarding the real versus the expected benefits of…

  12. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  13. Detection of the posterior superior alveolar artery in the lateral sinus wall using computed tomography/cone beam computed tomography: a prevalence meta-analysis study and systematic review.

    Science.gov (United States)

    Varela-Centelles, P; Loira-Gago, M; Seoane-Romero, J M; Takkouche, B; Monteiro, L; Seoane, J

    2015-11-01

    A systematic search of MEDLINE, Embase, and Proceedings Web of Science was undertaken to assess the prevalence of the posterior superior alveolar artery (PSAA) in the lateral sinus wall in sinus lift patients, as identified using computed tomography (CT)/cone beam computed tomography (CBCT). For inclusion, the article had to report PSAA detection in the bony wall using CT and/or CBCT in patients with subsinus edentulism. Studies on post-mortem findings, mixed samples (living and cadaveric), those presenting pooled results only, or studies performed for a sinus pathology were excluded. Heterogeneity was checked using an adapted version of the DerSimonian and Laird Q test, and quantified by calculating the proportion of the total variance due to between-study variance (Ri statistic). Eight hundred and eleven single papers were reviewed and filtered according to the inclusion/exclusion criteria. Ten studies were selected (1647 patients and 2740 maxillary sinuses (study unit)). The pooled prevalence of PSAA was 62.02 (95% confidence interval (CI) 46.33-77.71). CBCT studies detected PSAA more frequently (78.12, 95% CI 61.25-94.98) than CT studies (51.19, 95% CI 42.33-60.05). Conventional CT revealed thicker arteries than CBCT. It is concluded that PSAA detection is more frequent when CBCT explorations are used. Additional comparative studies controlling for potential confounding factors are needed to ascertain the actual diagnostic value of radiographic explorations for assessing the PSAA prior to sinus floor elevation procedures. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
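
    The pooled prevalence and heterogeneity statistics described above follow the standard DerSimonian and Laird random-effects recipe. A minimal sketch of that recipe follows; the prevalence figures and sample sizes used in the test are illustrative, not the review's actual study-level data:

```python
import math

def dersimonian_laird(props, ns):
    """Pool study-level proportions with a DerSimonian-Laird
    random-effects model (illustrative sketch only).

    props: per-study prevalence estimates in [0, 1]; ns: sample sizes.
    Returns (pooled, lo95, hi95).
    """
    # Within-study variance of a proportion: p(1-p)/n
    v = [p * (1 - p) / n for p, n in zip(props, ns)]
    w = [1.0 / vi for vi in v]                      # fixed-effect weights
    fixed = sum(wi * p for wi, p in zip(w, props)) / sum(w)
    # Cochran's Q heterogeneity statistic and between-study variance tau^2
    q = sum(wi * (p - fixed) ** 2 for wi, p in zip(w, props))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(props) - 1)) / c)
    # Random-effects weights fold tau^2 into every study's variance
    w_re = [1.0 / (vi + tau2) for vi in v]
    pooled = sum(wi * p for wi, p in zip(w_re, props)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se
```

    The review's Ri statistic goes one step further, expressing the share of total variance that is between-study variance; the sketch stops at tau^2.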

  14. Using Computational and Mechanical Models to Study Animal Locomotion

    OpenAIRE

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locom...

  15. Factors affecting the adoption of cloud computing: an exploratory study

    OpenAIRE

    Morgan, Lorraine; Conboy, Kieran

    2013-01-01

    While it is widely acknowledged that cloud computing has the potential to transform a large part of the IT industry, issues surrounding the adoption of cloud computing have received relatively little attention. Drawing on three case studies of service providers and their customers, this study will contribute to the existing cloud technologies literature that does not address the complex and multifaceted nature of adoption. The findings are analyzed using the adoption of innov...

  16. Decoding Computer Games: Studying “Special Operation 85”

    Directory of Open Access Journals (Sweden)

    Bahareh Jalalzadeh

    2009-11-01

    Full Text Available Like other media, computer games convey messages that have two dimensions: explicit and implicit. Studying computer games semiologically and comparing them with narrative structures, the present study attempts to uncover the messages they convey. We have therefore studied and decoded “Special Operation 85” as a semiological text. Results show that the game's features, such as its naming, the interests and motivations of the people involved, and the events narrated, all serve the producers' goals of introducing and publicizing Iranian-Islamic cultural values. Although this feature makes “Special Operation 85” a unique game, it fails in its attempt to produce a mythical personage in the Iranian-Islamic cultural context.

  17. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety-relevant tasks. The experience gained with the control rod positioning processor confirms that computers are no less reliable than conventional instrumentation and control systems for comparable tasks. The examination and evaluation of computers for safety-relevant tasks can be done by programme analysis or by statistical evaluation of the operating experience. Programme analysis is recommended for seldom-used and well-structured programmes. For programmes with a long cumulated operating time, a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process-controlling computers or microprocessors can be qualified for safety-relevant tasks without undue effort. (orig./HP) [de
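
    The statistical evaluation of operating experience recommended above can be sketched as follows. The zero-failure assumption and the 95% confidence level are illustrative choices, not details taken from the original analysis:

```python
import math

def failure_rate_upper_bound(operating_years, conf=0.95):
    """One-sided upper confidence bound on a constant failure rate
    given zero observed failures over the cumulated operating time.

    With zero failures in T years, P(no failure) = exp(-rate * T),
    so the bound solves exp(-rate * T) = 1 - conf.
    """
    return -math.log(1.0 - conf) / operating_years

# ~82 cumulated failure-free years, as in the abstract above
bound = failure_rate_upper_bound(82)   # roughly 0.037 failures per year
```

    Doubling the failure-free operating time halves the bound, which is why long cumulated operating times make statistical evaluation attractive.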

  18. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    Science.gov (United States)

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  19. Electromagnetic computation methods for lightning surge protection studies

    CERN Document Server

    Baba, Yoshihiro

    2016-01-01

    This book is the first to consolidate current research and to examine the theories of electromagnetic computation methods in relation to lightning surge protection. The authors introduce and compare existing electromagnetic computation methods such as the method of moments (MOM), the partial element equivalent circuit (PEEC), the finite element method (FEM), the transmission-line modeling (TLM) method, and the finite-difference time-domain (FDTD) method. The application of FDTD method to lightning protection studies is a topic that has matured through many practical applications in the past decade, and the authors explain the derivation of Maxwell's equations required by the FDTD, and modeling of various electrical components needed in computing lightning electromagnetic fields and surges with the FDTD method. The book describes the application of FDTD method to current and emerging problems of lightning surge protection of continuously more complex installations, particularly in critical infrastructures of e...

  20. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    Science.gov (United States)

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…

  1. Computational foundations of the visual number sense.

    Science.gov (United States)

    Stoianov, Ivilin Peev; Zorzi, Marco

    2017-01-01

    We provide an emergentist perspective on the computational mechanism underlying numerosity perception, its development, and the role of inhibition, based on our deep neural network model. We argue that the influence of continuous visual properties does not challenge the notion of number sense, but reveals limit conditions for the computation that yields invariance in numerosity perception. Alternative accounts should be formalized in a computational model.

  2. Experimental and computational development of a natural breast phantom for dosimetry studies

    International Nuclear Information System (INIS)

    Nogueira, Luciana B.; Campos, Tarcisio P.R.

    2013-01-01

    This paper describes the experimental and computational development of a natural breast phantom, anthropomorphic and anthropometric, for dosimetry studies of breast brachytherapy and teletherapy. The natural breast phantom developed corresponds to the fibroadipose breasts of women aged 30 to 50 years, of radiographically medium density. The experimental breast phantom was constituted of three tissue-equivalents (TEs): glandular TE, adipose TE and skin TE. These TEs were developed according to the chemical composition of the human breast and present an equivalent radiological response to exposure. Once constructed, the experimental breast phantom was mounted on a thorax phantom previously developed by the NRI/UFMG research group. The computational breast phantom was then constructed by performing computed tomography (CT) in axial slices of the chest phantom. From the images generated by CT, a voxel model of the thorax phantom was developed with the SISCODES computational program, the computational breast phantom being represented by the same TEs as the experimental breast phantom. The images generated by CT also allowed the radiological equivalence of the tissues to be evaluated. The breast phantom is being used in experimental dosimetry studies of both breast brachytherapy and teletherapy. Dosimetry studies with the MCNP-5 code using the computational model of the breast phantom are in progress. (author)

  3. Defragging Computer/Videogame Implementation and Assessment in the Social Studies

    Science.gov (United States)

    McBride, Holly

    2014-01-01

    Students in this post-industrial technological age require opportunities for the acquisition of new skills, especially in the marketplace of innovation. A pedagogical strategy that is becoming more and more popular within social studies classrooms is the use of computer and video games as enhancements to everyday lesson plans. Computer/video games…

  4. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report describes the architectural design and performance measurement of a parallel computing environment for Monte Carlo simulation of particle therapy planning, built on a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed a speed approximately 28 times that of a single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)
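
    The speedup reported above rests on Monte Carlo simulation being embarrassingly parallel: independent, separately seeded sample chunks can run on separate cloud instances and be combined afterwards. A toy sketch of that structure (estimating pi rather than dose; the chunk counts and seeds are invented):

```python
import random

def mc_pi_chunk(n_samples, seed):
    """One independent, seeded chunk of a Monte Carlo estimate of pi.

    Chunks share no state, so each could run on a separate cloud
    instance; the same structure underlies parallel Monte Carlo
    dose simulation.
    """
    rng = random.Random(seed)
    return sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

def mc_pi_parallel(n_total, n_chunks):
    """Split the work into chunks, then combine the partial hit counts."""
    per_chunk = n_total // n_chunks
    hits = sum(mc_pi_chunk(per_chunk, seed) for seed in range(n_chunks))
    return 4.0 * hits / (per_chunk * n_chunks)
```

    Because the chunks never communicate, the wall-clock time ideally scales down with the number of instances, which is what the 28-fold speedup exploits.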

  5. Adaptation to High Ethanol Reveals Complex Evolutionary Pathways.

    Directory of Open Access Journals (Sweden)

    Karin Voordeckers

    2015-11-01

    Full Text Available Tolerance to high levels of ethanol is an ecologically and industrially relevant phenotype of microbes, but the molecular mechanisms underlying this complex trait remain largely unknown. Here, we use long-term experimental evolution of isogenic yeast populations of different initial ploidy to study adaptation to increasing levels of ethanol. Whole-genome sequencing of more than 30 evolved populations and over 100 adapted clones isolated throughout this two-year evolution experiment revealed how a complex interplay of de novo single nucleotide mutations, copy number variation, ploidy changes, mutator phenotypes, and clonal interference led to a significant increase in ethanol tolerance. Although the specific mutations differ between different evolved lineages, application of a novel computational pipeline, PheNetic, revealed that many mutations target functional modules involved in stress response, cell cycle regulation, DNA repair and respiration. Measuring the fitness effects of selected mutations introduced in non-evolved ethanol-sensitive cells revealed several adaptive mutations that had previously not been implicated in ethanol tolerance, including mutations in PRT1, VPS70 and MEX67. Interestingly, variation in VPS70 was recently identified as a QTL for ethanol tolerance in an industrial bio-ethanol strain. Taken together, our results show how, in contrast to adaptation to some other stresses, adaptation to a continuous complex and severe stress involves interplay of different evolutionary mechanisms. In addition, our study reveals functional modules involved in ethanol resistance and identifies several mutations that could help to improve the ethanol tolerance of industrial yeasts.

  6. TEACHERS’ COMPUTER SELF-EFFICACY AND THEIR USE OF EDUCATIONAL TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Vehbi TUREL

    2014-10-01

    Full Text Available This study examined the use of educational technology by primary and subject teachers (i.e. secondary and high school teachers) in a small town in the eastern part of Turkey in the spring of 2012. The study examined the primary, secondary and high school teachers' personal and computer-related (demographic) characteristics, their computer self-efficacy perceptions, their computer skill level in certain software, their frequency of computer use for teaching, administrative and communication purposes, and their educational technology preferences for preparation and teaching. All primary, secondary and high school teachers in the town were given the questionnaires to complete; 158 teachers (n=158) completed and returned them. The study was mostly quantitative and partly qualitative. The quantitative results were analysed with SPSS (mean, standard deviation, frequency, percentage, ANOVA). The qualitative data were analysed by examining the participants' responses to the open-ended questions and focusing on the themes shared among the responses. The results reveal that the teachers consider their computer self-efficacy perceptions to be good, their level in certain programs good, and that they often use computers for a wide range of purposes. There are also statistical differences in their computer self-efficacy perceptions, frequency of computer use for certain purposes, and computer level in certain programs across different independent variables.

  7. Recurrent laryngeal nerve paralysis: a laryngographic and computed tomographic study

    International Nuclear Information System (INIS)

    Agha, F.P.

    1983-01-01

    Vocal cord paralysis is a relatively common entity, usually resulting from a pathologic process of the vagus nerve or its recurrent laryngeal branch. It is rarely caused by intralaryngeal lesions. Fourteen patients with recurrent laryngeal nerve paralysis (RLNP) were evaluated by laryngography, computed tomography (CT), or both. In the evaluation of the paramedian cord, CT was limited in its ability to differentiate between tumor and RLNP as the cause of the fixed cord, but it yielded more information than laryngography on the structural abnormalities of the larynx and the pre-epiglottic and paralaryngeal spaces. Laryngography revealed distinct features of RLNP and is the procedure of choice for evaluation of functional abnormalities of the larynx until further experience with faster CT scanners and dynamic scanning of the larynx is gained

  8. Prospective pilot study of a tablet computer in an Emergency Department.

    Science.gov (United States)

    Horng, Steven; Goss, Foster R; Chen, Richard S; Nathanson, Larry A

    2012-05-01

    The recent availability of low-cost tablet computers can facilitate bedside information retrieval by clinicians. To evaluate the effect of physician tablet use in the Emergency Department. Prospective cohort study comparing physician workstation usage with and without a tablet. 55,000 visits/year Level 1 Emergency Department at a tertiary academic teaching hospital. 13 emergency physicians (7 Attendings, 4 EM3s, and 2 EM1s) worked a total of 168 scheduled shifts (130 without and 38 with tablets) during the study period. Physician use of a tablet computer while delivering direct patient care in the Emergency Department. The primary outcome measure was the time spent using the Emergency Department Information System (EDIS) at a computer workstation per shift. The secondary outcome measure was the number of EDIS logins at a computer workstation per shift. Clinician use of a tablet was associated with a 38 min (17-59) decrease in time spent per shift using the EDIS at a computer workstation. Tablet use was also associated with a reduction in the number of times physicians logged into a computer workstation and a reduction in the amount of time they spent there using the EDIS. The presumed benefit is that decreasing time at a computer workstation increases physician availability at the bedside. However, this association will require further investigation. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. A study of Computing doctorates in South Africa from 1978 to 2014

    Directory of Open Access Journals (Sweden)

    Ian D Sanders

    2015-12-01

    Full Text Available This paper studies the output of South African universities in terms of computing-related doctorates in order to determine trends in numbers of doctorates awarded and to identify strong doctoral study research areas. Data collected from a variety of sources relating to Computing doctorates conferred since the late 1970s was used to compare the situation in Computing with that of all doctorates. The number of Computing doctorates awarded has increased considerably over the period of study. Nearly three times as many doctorates were awarded in the period 2010–2014 as in 2000–2004. The universities producing the most Computing doctorates were either previously “traditional” universities or comprehensive universities formed by amalgamating a traditional research university with a technikon. Universities of technology have not yet produced many doctorates as they do not have a strong research tradition. The analysis of topic keywords using ACM Computing classifications is preliminary but shows that professional issues are dominant in Information Systems, models are often built in Computer Science and several topics, including computing in education, are evident in both IS and CS. The relevant data is in the public domain but access is difficult as record keeping was generally inconsistent and incomplete. In addition, electronic databases at universities are not easily searchable and access to HEMIS data is limited. The database built for this paper is more inclusive in terms of discipline-related data than others.

  10. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    Science.gov (United States)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.

  11. A computational study of high entropy alloys

    Science.gov (United States)

    Wang, Yang; Gao, Michael; Widom, Michael; Hawk, Jeff

    2013-03-01

    As a new class of advanced materials, high-entropy alloys (HEAs) exhibit a wide variety of excellent materials properties, including high strength, reasonable ductility with appreciable work-hardening, corrosion and oxidation resistance, wear resistance, and outstanding diffusion-barrier performance, especially at elevated and high temperatures. In this talk, we will explain our computational approach to the study of HEAs that employs the Korringa-Kohn-Rostoker coherent potential approximation (KKR-CPA) method. The KKR-CPA method uses Green's function technique within the framework of multiple scattering theory and is uniquely designed for the theoretical investigation of random alloys from the first principles. The application of the KKR-CPA method will be discussed as it pertains to the study of structural and mechanical properties of HEAs. In particular, computational results will be presented for AlxCoCrCuFeNi (x = 0, 0.3, 0.5, 0.8, 1.0, 1.3, 2.0, 2.8, and 3.0), and these results will be compared with experimental information from the literature.

  12. A comparative study: use of a Brain-computer Interface (BCI) device by people with cerebral palsy in interaction with computers.

    Science.gov (United States)

    Heidrich, Regina O; Jensen, Emely; Rebelo, Francisco; Oliveira, Tiago

    2015-01-01

    This article presents a comparative study of people with cerebral palsy and healthy controls, of various ages, using a Brain-computer Interface (BCI) device. The research is qualitative in its approach; the researchers worked with observational case studies. People with cerebral palsy and healthy controls were evaluated in Portugal and in Brazil. The study aimed to evaluate the device in order to determine whether people with cerebral palsy could interact with the computer, and whether their performance is similar to that of healthy controls when using the Brain-computer Interface. Ultimately, it was found that there are no significant differences between people with cerebral palsy in the two countries, nor between the populations without cerebral palsy (healthy controls).

  13. Characteristic findings of computed tomography in cerebral metastatic malignant melanomas

    International Nuclear Information System (INIS)

    Kukita, Chikashige; Nose, Tadao; Nakagawa, Kunio; Tomono, Yuji; Enomoto, Takao; Hashikawa, Masanori; Egashira, Taihei; Maki, Yutaka

    1986-01-01

    Four cases with metastatic cerebral melanoma were studied by means of computed tomography (CT). Two cases were male, and the other two were female, with an average age of 55 years. Their primary lesions were on the chest wall in two cases, around the calcaneus in one, and around the genitalia in one. All cases died within 6 months after the metastatic brain lesions were found. Necropsies were carried out in two cases. CT revealed high-density areas in all cases, and contrast studies showed an enhancement of the lesions, as has previously been reported. On the other hand, autopsied cases revealed neither fresh nor old intratumoral bleedings such as a scattered focus of hemosiderin. These findings suggest that the high-density tumoral shadows in CT are probably not intratumoral bleedings due to a bleeding tendency of the tumors, as some authors have previously supposed. We mentioned some other factors contributing to the high density of the melanoma on computed tomograms. (author)

  14. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems.

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Today, due to developing communicative technologies, computer games and other audio-visual media as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents results in the public uncertainties about plausible harmful effects of these games. This study aimed to investigate the correlation between computer games and behavioral problems on male guidance school students. This was a descriptive-correlative study on 384 randomly chosen male guidance school students. They were asked to answer the researcher's questionnaire about computer games and Achenbach's Youth Self-Report (YSR). The results of this study indicated that there was about 95% direct significant correlation between the amount of playing games among adolescents and anxiety/depression, withdrawn/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, and attention problems. In addition, there was a significant correlation between the students' place of living and their parents' job, and using computer games. Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents.

  15. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background Today, due to developing communicative technologies, computer games and other audio-visual media as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents results in the public uncertainties about plausible harmful effects of these games. This study aimed to investigate the correlation between computer games and behavioral problems on male guidance school students. Methods This was a descriptive-correlative study on 384 randomly chosen male guidance school students. They were asked to answer the researcher's questionnaire about computer games and Achenbach’s Youth Self-Report (YSR). Findings The Results of this study indicated that there was about 95% direct significant correlation between the amount of playing games among adolescents and anxiety/depression, withdrawn/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, and attention problems. In addition, there was a significant correlation between the students’ place of living and their parents’ job, and using computer games. Conclusion Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents. PMID:24494157

  16. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration

    2017-01-01

    The Higgs boson physics is one of the most important and promising fields of study in modern High Energy Physics. To perform precision measurements of the Higgs boson properties, fast and efficient instruments of Monte Carlo event simulation are required. Due to the increasing amount of data and the growing complexity of the simulation software tools, the computing resources currently available for Monte Carlo simulation on the LHC GRID are not sufficient. One of the possibilities to address this shortfall of computing resources is the use of institutes' computer clusters, commercial computing resources and supercomputers. In this paper, a brief description of the Higgs boson physics, the Monte Carlo generation and event simulation techniques are presented. A description of modern high performance computing systems and tests of their performance are also discussed. These studies have been performed on the Worldwide LHC Computing Grid and Kurchatov Institute Data Processing Center, including Tier...

  17. Tracing monadic computations and representing effects

    Directory of Open Access Journals (Sweden)

    Maciej Piróg

    2012-02-01

    Full Text Available In functional programming, monads are supposed to encapsulate computations, effectfully producing the final result, but keeping to themselves the means of acquiring it. For various reasons, we sometimes want to reveal the internals of a computation. To make that possible, in this paper we introduce monad transformers that add the ability to automatically accumulate observations about the course of execution as an effect. We discover that if we treat the resulting trace as the actual result of the computation, we can find new functionality in existing monads, notably when working with non-terminating computations.
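
    The idea of accumulating observations about the course of execution as an effect, and treating the resulting trace as part of the result, can be sketched outside the paper's monad-transformer setting. The following is a hypothetical Python analogue of a writer monad; the names Traced and halve are invented for illustration:

```python
class Traced:
    """Minimal writer-monad-style wrapper: a value paired with an
    execution trace accumulated across bound computations."""

    def __init__(self, value, trace=()):
        self.value, self.trace = value, tuple(trace)

    @staticmethod
    def unit(value):
        # Inject a plain value with an empty trace
        return Traced(value)

    def bind(self, f):
        # Run the next step and append its observations to our trace
        nxt = f(self.value)
        return Traced(nxt.value, self.trace + nxt.trace)

def halve(x):
    # An effectful step that records what it did
    return Traced(x // 2, ["halve(%d)" % x])

result = Traced.unit(12).bind(halve).bind(halve)
# result.value is 3; result.trace records both halving steps
```

    Treating result.trace as the actual result, in the spirit of the paper, turns an opaque computation into an inspectable one without changing the steps themselves.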

  18. An Exploratory Study of Pauses in Computer-Assisted EFL Writing

    Science.gov (United States)

    Xu, Cuiqin; Ding, Yanren

    2014-01-01

    The advance of computer input log and screen-recording programs over the last two decades has greatly facilitated research into the writing process in real time. Using Inputlog 4.0 and Camtasia 6.0 to record the writing process of 24 Chinese EFL writers in an argumentative task, this study explored L2 writers' pausing patterns in computer-assisted…

  19. Musculoskeletal Problems Associated with University Students Computer Users: A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Rakhadani PB

    2017-07-01

    Full Text Available While several studies have examined the prevalence and correlates of musculoskeletal problems among university students, scanty information exists in the South African context. The objective of this study was to determine the prevalence, causes and consequences of musculoskeletal problems among University of Venda student computer users. This cross-sectional study involved 694 university students at the University of Venda. A self-designed questionnaire was used to collect information on sociodemographic characteristics, problems associated with computer use, and causes of musculoskeletal problems associated with computer use. The majority (84.6%) of the participants use the computer for internet access, word processing (20.3%), and games (18.7%). The students reported neck pain when using the computer (52.3%); shoulder (47.0%), finger (45.0%), lower back (43.1%), general body pain (42.9%), elbow (36.2%), wrist (33.7%), hip and foot (29.1%) and knee (26.2%). Reported causes of musculoskeletal pain associated with computer usage were: sitting position, low chairs, long periods spent on the computer, uncomfortable laboratory chairs, and stress. Eye problems (51.9%), muscle cramp (344.0%), headache (45.3%), blurred vision (38.0%), feeling of illness (39.9%) and missed lectures (29.1%) were consequences of musculoskeletal problems linked to computer use. The majority of students reported having mild pain (43.7%), moderate (24.2%), and severe (8.4%) pain. Years of computer use were significantly associated with neck, shoulder and wrist pain. Using the computer for internet access was significantly associated with neck pain (OR=0.60; 95% CI 0.40-0.93); games with neck (OR=0.60; 95% CI 0.40-0.85) and hip/foot pain (OR=0.60; 95% CI 0.40-0.92); programming with elbow (OR=1.78; 95% CI 1.10-2.94) and wrist pain (OR=2.25; 95% CI 1.36-3.73); while word processing was significantly associated with lower back pain (OR=1.45; 95% CI 1.03-2.04). Undergraduate study had a significant association with elbow pain (OR=2
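
    The odds ratios with 95% confidence intervals quoted above are standard 2x2-table statistics. A minimal sketch of how such figures are computed (the cell counts in the test are invented, not the survey's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% confidence interval from a 2x2 table.

    a, b: exposed with / without the outcome;
    c, d: unexposed with / without the outcome.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) sums the reciprocals of all four cells
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

    An interval that excludes 1.0 corresponds to a statistically significant association, which is the criterion the abstract's "significantly associated" statements reflect.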

  20. Computer use and addiction in Romanian children and teenagers--an observational study.

    Science.gov (United States)

    Chiriţă, V; Chiriţă, Roxana; Stefănescu, C; Chele, Gabriela; Ilinca, M

    2006-01-01

    The computer has provided some wonderful opportunities for our children. Although research on the effects of children's computer use is still ambiguous, some initial indications of positive and negative effects are beginning to emerge. Children commonly use computers for playing games, completing school assignments, email, and connecting to the Internet. This may sometimes come at the expense of other activities such as homework or normal social interchange. Although most children seem to correct the problem naturally, parents and educators must monitor for signs of misuse. Studies of general computer users suggest that some children may experience psychological problems such as social isolation, depression, loneliness, and time mismanagement related to their computer use and failure at school. The purpose of this study was to investigate issues related to computer use by school students aged 11 to 18. The survey included a representative sample of 439 school students aged 11 to 18, drawn from 3 gymnasium schools and 5 high schools in Iaşi, Romania. The students answered a questionnaire comprising 34 questions related to computer activities. The children's parents answered a second questionnaire on the same subject. Most questions asked respondents to rate on a scale the frequency of occurrence of a certain event or issue; some questions solicited an open answer or a choice from a list. These were aimed at highlighting: (1) the frequency of computer use by the students; (2) the interference of excessive use with school performance and social life; (3) the identification of a possible computer addiction. The data were processed using the SPSS statistics software, version 11.0. Results show that the school students prefer to spend a considerable amount of time with their computers, over 3 hours/day. More than 65.7% of the students have a computer at home. More than 70% of the parents admit they do not, or only occasionally

  1. [The Psychomat computer complex for psychophysiologic studies].

    Science.gov (United States)

    Matveev, E V; Nadezhdin, D S; Shemsudov, A I; Kalinin, A V

    1991-01-01

    The authors analyze the design principles of a computerized psychophysiological system for universal use. They show the effectiveness of computer technology as a combination of the universal computation and control capabilities of a personal computer equipped with problem-oriented, specialized facilities for stimulus presentation and detection of the test subject's reactions. They define the hardware and software configuration of the microcomputer psychophysiological system "Psychomat", describe its functional capabilities and basic medico-technical characteristics, and review organizational issues in the maintenance of its full-scale production.

  2. Studies in Mathematics, Volume 22. Studies in Computer Science.

    Science.gov (United States)

    Pollack, Seymour V., Ed.

    The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…

  3. Development of highly potent melanogenesis inhibitor by in vitro, in vivo and computational studies

    Directory of Open Access Journals (Sweden)

    Abbas Q

    2017-07-01

    Full Text Available Qamar Abbas,1 Zaman Ashraf,2 Mubashir Hassan,1 Humaira Nadeem,3 Muhammad Latif,4 Samina Afzal,5 Sung-Yum Seo1 1Department of Biology, College of Natural Sciences, Kongju National University, Gongju, Republic of Korea; 2Department of Chemistry, Allama Iqbal Open University, Islamabad, 3Riphah Institute of Pharmaceutical Sciences, Riphah International University, Islamabad, Pakistan; 4Center for Genetics and Inherited Diseases, Taibah University, Almadinah Almunawwarah, Kingdom of Saudi Arabia; 5Faculty of Pharmacy, Bahauddin Zakria University, Multan, Pakistan Abstract: The present work describes the synthesis of a few hydroxylated amide derivatives as melanogenesis inhibitors. In vitro, in vivo and computational studies proved that compound 6d is a highly potent melanogenesis inhibitor compared to the standard kojic acid. The title amides 4a–e and 6a–e were synthesized following simple reaction routes with excellent yields. Most of the synthesized compounds exhibited good mushroom tyrosinase inhibitory activity, but compound 6d showed excellent activity (IC50 0.15 µM) compared to the standard kojic acid (IC50 16.69 µM). Lineweaver–Burk plots were used to determine the kinetic mechanism, and it was found that compounds 4c and 6d showed non-competitive inhibition while 6a and 6b showed mixed-type inhibition. The kinetic mechanism further revealed that compound 6d formed an irreversible complex with the target enzyme tyrosinase. The Ki values determined for compounds 4c, 6a, 6b and 6d are 0.188, 0.84, 2.20 and 0.217 µM, respectively. Results of human tyrosinase inhibitory activity in A375 human melanoma cells showed that compound 6d exhibited 91.9% inhibitory activity at a concentration of 50 µg/mL. In vivo cytotoxicity evaluation of compound 6d in zebrafish embryos showed that it is non-toxic to zebrafish. A melanin depigmentation assay performed in zebrafish indicated that compound 6d possessed greater potential in decreasing melanin contents

  4. Differences in prevalence of self-reported musculoskeletal symptoms among computer and non-computer users in a Nigerian population: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Ayanniyi O

    2010-08-01

    Full Text Available Abstract Background Literature abounds on the prevalent nature of Self-Reported Musculoskeletal Symptoms (SRMS) among computer users, but studies that actually compare these with non-computer users are meagre, thereby reducing the strength of the evidence. This study compared the prevalence of SRMS between computer and non-computer users and assessed the risk factors associated with SRMS. Methods A total of 472 participants comprising equal numbers of age- and sex-matched computer and non-computer users were assessed for the presence of SRMS. Information concerning musculoskeletal symptoms and discomforts of the neck, shoulders, upper back, elbows, wrists/hands, low back, hips/thighs, knees and ankles/feet was obtained using the standardized Nordic questionnaire. Results The prevalence of SRMS was significantly higher in the computer users than in the non-computer users, both over the past 7 days (χ² = 39.11, p = 0.001) and over the past 12 months (χ² = 53.56, p = 0.001). The odds of reporting musculoskeletal symptoms were lowest for participants above the age of 40 years (OR = 0.42, 95% CI = 0.31-0.64 over the past 7 days; OR = 0.61, 95% CI = 0.47-0.77 over the past 12 months) and were also reduced in female participants. Increasing daily hours and accumulated years of computer use and tasks of data processing and designs/graphics were significantly (p Conclusion The prevalence of SRMS was significantly higher in the computer users than in the non-computer users, and younger age, being male, working longer hours daily, increasing years of computer use, data entry tasks and computer designs/graphics were the significant risk factors for reporting musculoskeletal symptoms among the computer users. Computer use may explain the increased prevalence of SRMS among the computer users.
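Odds ratios with Wald 95% confidence intervals, like those quoted in the records above, are computed from a 2×2 exposure table; a minimal sketch (the cell counts below are invented for illustration, not taken from the study):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR) is sqrt of the summed reciprocal counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative counts only: 10/20 exposed cases/controls, 5/20 unexposed.
or_, lo, hi = odds_ratio_ci(10, 20, 5, 20)   # OR = (10*20)/(20*5) = 2.0
```

An interval that excludes 1.0 corresponds to the "significant" associations reported in the abstracts.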

  5. Computed tomography of the pancreas

    International Nuclear Information System (INIS)

    Kolmannskog, F.; Kolbenstvedt, A.; Aakhus, T.; Bergan, A.; Fausa, O.; Elgjo, K.

    1980-01-01

    The findings by computed tomography in 203 cases of suspected pancreatic tumours, pancreatitis or peripancreatic abnormalities were evaluated. The appearances of the normal and the diseased pancreas are described. Computed tomography is highly accurate in detecting pancreatic masses, but cannot differentiate neoplastic from inflammatory disease. The only reliable signs of pancreatic carcinoma are a focal mass in the pancreas together with liver metastases. When a pancreatic mass is revealed by computed tomography, CT-guided fine-needle aspiration biopsy of the pancreas is recommended. Thus the need for more invasive diagnostic procedures and explorative laparotomy may be avoided in some patients. (Auth.)

  6. Computed tomography study of otitis media

    International Nuclear Information System (INIS)

    Bahia, Paulo Roberto Valle; Marchiori, Edson

    1997-01-01

    The computed tomography (CT) findings of 89 patients clinically suspected of having otitis media were studied in this work. The results were compared with the clinical diagnosis, otoscopy, surgical findings and previous data. Our analysis covered seven patients with acute otitis media and 83 patients with chronic otitis media. The patients with acute otitis media underwent CT examinations to evaluate possible spread to the central nervous system. The diagnosis of cholesteatoma, its extension and complications was the main indication for the chronic otitis media studies. The main findings of cholesteatomatous otitis were occupation of the epitympanum, bony wall destruction and ossicular chain erosion. CT demonstrated high sensitivity in diagnosing cholesteatoma. (author)

  7. Conventional versus computer-navigated TKA: a prospective randomized study.

    Science.gov (United States)

    Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques

    2017-06-01

    The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in one group of patients (NAV) and of the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer-navigated surgery would improve implant alignment, functional scores and survival of the implant compared to the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent score (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery Knee Score and the Western Ontario McMaster University Index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI or side of implantation. Three patients of the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed statistically significantly better KSS scores, wider ROM and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle and tibial slope on post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staphylococcus aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100%. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significant

  8. Studies to reveal the nature of interactions between catalase and curcumin using computational methods and optical techniques.

    Science.gov (United States)

    Mofidi Najjar, Fayezeh; Ghadari, Rahim; Yousefi, Reza; Safari, Naser; Sheikhhasani, Vahid; Sheibani, Nader; Moosavi-Movahedi, Ali Akbar

    2017-02-01

    Curcumin is an important antioxidant compound, and is widely reported as an effective component for reducing complications of many diseases. However, the detailed mechanisms of its activity remain poorly understood. We found that curcumin can significantly increase the catalase activity of BLC (bovine liver catalase). The mechanism of curcumin action was investigated using a computational method. We suggest that curcumin may activate BLC by modifying the bottleneck of its narrow channel. The molecular dynamics simulation data showed that placing curcumin on the structure of the enzyme can increase the size of the bottleneck in the narrow channel of BLC, and readily allow the access of substrate to the active site. Because of the increase in the distance between amino acids of the bottleneck in the presence of curcumin, the entrance space for substrate increased from 250 Å³ to 440 Å³. In addition, the increase in emission of intrinsic fluorescence of BLC in the presence of curcumin demonstrated changes in the tertiary structure of catalase, and the possibility of less quenching. We also used circular dichroism (CD) spectropolarimetry to determine how curcumin may alter the enzyme secondary structure. Catalase spectra in the presence of various concentrations of curcumin showed an increase in the amount of α-helix content. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Computational and experimental study of the cluster size distribution in MAPLE

    International Nuclear Information System (INIS)

    Leveugle, Elodie; Zhigilei, Leonid V.; Sellinger, Aaron; Fitz-Gerald, James M.

    2007-01-01

    A combined experimental and computational study is performed to investigate the origin and characteristics of the surface features observed in SEM images of thin polymer films deposited by matrix-assisted pulsed laser evaporation (MAPLE). Analysis of high-resolution SEM images of the surface morphologies of films deposited at different fluences reveals that the mass distributions of the surface features are well described by a power law, Y(N) ∝ N^(-t), with exponent t ≈ 1.6. Molecular dynamics simulations of the MAPLE process predict a similar size distribution for large clusters observed in the ablation plume. A weak dependence of the cluster size distributions on fluence and target composition suggests that the power-law cluster size distribution may be a general characteristic of the ablation plume generated as a result of an explosive decomposition of a target region overheated above the limit of its thermodynamic stability. Based on the simulation results, we suggest that the ejection of large matrix-polymer clusters, followed by evaporation of the volatile matrix, is responsible for the formation of the surface features observed in the polymer films deposited in MAPLE experiments.
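A power-law exponent of the kind quoted for the cluster mass distribution is the slope of a straight-line fit in log-log space; a toy sketch on synthetic, noise-free counts (not the paper's data):

```python
import numpy as np

# Synthetic histogram following Y(N) ∝ N^(-t) with t = 1.6 exactly.
sizes = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0])
counts = 1000.0 * sizes ** -1.6

# Least-squares line in log-log coordinates; the slope estimates -t.
slope, intercept = np.polyfit(np.log(sizes), np.log(counts), 1)
t_est = -slope
```

With real, noisy histogram data the same fit gives an estimate of t rather than the exact value.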

  10. Reliability and Availability of Cloud Computing

    CERN Document Server

    Bauer, Eric

    2012-01-01

    A holistic approach to service reliability and availability of cloud computing Reliability and Availability of Cloud Computing provides IS/IT system and solution architects, developers, and engineers with the knowledge needed to assess the impact of virtualization and cloud computing on service reliability and availability. It reveals how to select the most appropriate design for reliability diligence to assure that user expectations are met. Organized in three parts (basics, risk analysis, and recommendations), this resource is accessible to readers of diverse backgrounds and experience le

  11. Helicopter fuselage drag - combined computational fluid dynamics and experimental studies

    Science.gov (United States)

    Batrakov, A.; Kusyumov, A.; Mikhailov, S.; Pakhov, V.; Sungatullin, A.; Valeev, M.; Zherekhov, V.; Barakos, G.

    2015-06-01

    In this paper, wind tunnel experiments are combined with Computational Fluid Dynamics (CFD) aiming to analyze the aerodynamics of realistic fuselage configurations. A development model of the ANSAT aircraft and an early model of the AKTAI light helicopter were employed. Both models were tested at the subsonic wind tunnel of KNRTU-KAI for a range of Reynolds numbers and pitch and yaw angles. The force balance measurements were complemented by particle image velocimetry (PIV) investigations for the cases where the experimental force measurements showed substantial unsteadiness. The CFD results were found to be in fair agreement with the test data and revealed some flow separation at the rear of the fuselages. Once confidence on the CFD method was established, further modifications were introduced to the ANSAT-like fuselage model to demonstrate drag reduction via small shape changes.

  12. Computational models reveal a passive mechanism for cell migration in the crypt.

    Directory of Open Access Journals (Sweden)

    Sara-Jane Dunn

    Full Text Available Cell migration in the intestinal crypt is essential for the regular renewal of the epithelium, and the continued upward movement of cells is a key characteristic of healthy crypt dynamics. However, the driving force behind this migration is unknown. Possibilities include mitotic pressure, active movement driven by motility cues, or negative pressure arising from cell loss at the crypt collar. It is possible that a combination of factors together coordinates migration. Here, three different computational models are used to provide insight into the mechanisms that underpin cell movement in the crypt, by examining the consequences of eliminating cell division on cell movement. Computational simulations agree with existing experimental results, confirming that migration can continue in the absence of mitosis. Importantly, however, simulations allow us to infer mechanisms that are sufficient to generate cell movement, which is not possible through experimental observation alone. The results produced by the three models agree and suggest that cell loss due to apoptosis and extrusion at the crypt collar relieves cell compression below, allowing cells to expand and move upwards. This finding suggests that future experiments should focus on the role of apoptosis and cell extrusion in controlling cell migration in the crypt.
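The passive mechanism the models point to (cell loss at the collar relieving compression below) can be caricatured as an overdamped one-dimensional spring chain; everything here is an illustrative toy under assumed parameters, not one of the paper's three models:

```python
# Toy 1D "crypt column": cells as points joined by unit-stiffness springs
# with rest length L0, bottom cell fixed, column initially compressed.
# Removing the top cell (extrusion at the crypt collar) and relaxing lets
# every remaining cell drift upward, i.e. movement without mitosis.
L0 = 1.0
n = 10
compressed = [0.8 * i for i in range(n)]   # spacing 0.8 < L0

def relax(x, pin_top, steps=5000, dt=0.05):
    """Overdamped explicit-Euler relaxation of the spring chain."""
    x = list(x)
    for _ in range(steps):
        f = [0.0] * len(x)
        for i in range(len(x) - 1):
            s = (x[i + 1] - x[i]) - L0     # spring extension
            f[i] += s
            f[i + 1] -= s
        f[0] = 0.0                         # crypt base is fixed
        if pin_top:
            f[-1] = 0.0                    # crowded collar blocks the top
        x = [xi + dt * fi for xi, fi in zip(x, f)]
    return x

before = relax(compressed, pin_top=True)    # stays compressed
after = relax(before[:-1], pin_top=False)   # top cell removed, column expands
```

In the relaxed `after` state every surviving cell sits higher than it did in `before`, consistent with the "relief of compression" reading of the simulations.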

  13. Mirror neurons and imitation: a computationally guided review.

    Science.gov (United States)

    Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael

    2006-04-01

    Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.

  14. A parametric study of a solar calcinator using computational fluid dynamics

    International Nuclear Information System (INIS)

    Fidaros, D.K.; Baxevanou, C.A.; Vlachos, N.S.

    2007-01-01

    In this work a horizontal rotating solar calcinator is studied numerically using computational fluid dynamics. The specific solar reactor is a 10 kW model designed and used for efficiency studies. The numerical model is based on the solution of the Navier-Stokes equations for the gas flow, and on Lagrangean dynamics for the discrete particles. All necessary mathematical models were developed and incorporated into a computational fluid dynamics model with the influence of turbulence simulated by a two-equation (RNG k-ε) model. The efficiency of the reactor was calculated for different thermal inputs, feed rates, rotational speeds and particle diameters. The numerically computed degrees of calcination compared well with equivalent experimental results

  15. Multi-level computational chemistry study on hydrogen recombination catalyst of off-gas treatment system

    International Nuclear Information System (INIS)

    Hatakeyama, Nozomu; Ise, Mariko; Inaba, Kenji

    2011-01-01

    In order to reveal the deactivation mechanism of the hydrogen recombination catalyst of the off-gas treatment system, we investigated it using multi-level computational chemistry simulation methods. The recombiner apparatus is modeled by a numerical mesh system in axial coordinates, and the unsteady advection and reaction rate equations are solved using a finite difference method. The chemical reactions are formulated to represent adsorption-desorption of hydrogen and oxygen on the Pt catalyst, and the time development of the coverage factors of Pt is solved numerically. The computational simulations successfully reproduce behaviors very similar to those observed in experiments, such as the increase of the inversion rate of H2 to H2O, the temperature distributions along the flow direction, dependencies on experimental conditions, and so on. Thus Pt poisoning is considered to cause the deactivation of the hydrogen recombination catalyst. To clarify the poisoning mechanism, a molecular-level simulation is applied to the system of Pt on boehmite attacked by a cyclic siloxane, which has been detected in experiments and is considered one of the poisoning species. The simulation shows ring-opening of the cyclic siloxane on Pt, then attachment of the two ends of the chain-like siloxane to Pt and boehmite, respectively, so that finally the recombination reaction is prevented. This may be the first study to find out the detailed dynamical mechanism of hydrogen recombination catalyst poisoning with cyclic siloxane. (author)
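The coverage-factor rate equations described above can be caricatured with Langmuir-type toy kinetics in which a fixed "poison" coverage blocks surface sites; the rate constants and the site-blocking term are illustrative assumptions, not the paper's actual model:

```python
# Toy steady-state surface kinetics: H and O adsorb on free Pt sites and
# recombine; a fixed poison coverage theta_p (e.g. deposited siloxane)
# blocks sites. Explicit-Euler integration to (near) steady state.
def steady_rate(theta_p, k_ads=1.0, k_rec=5.0, dt=1e-3, steps=50000):
    th_h = th_o = 0.0
    for _ in range(steps):
        free = 1.0 - th_h - th_o - theta_p   # fraction of open sites
        r = k_rec * th_h * th_o              # recombination rate
        th_h += dt * (k_ads * free * free - r)
        th_o += dt * (k_ads * free * free - r)
    return k_rec * th_h * th_o

clean = steady_rate(0.0)       # fresh catalyst
poisoned = steady_rate(0.5)    # half the sites blocked by poison
```

Even in this crude sketch, blocking half the sites cuts the steady recombination rate to a quarter (the rate scales with the square of the unpoisoned site fraction), echoing the deactivation behavior attributed to siloxane poisoning.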

  16. Computational Study and Analysis of Structural Imperfections in 1D and 2D Photonic Crystals

    Energy Technology Data Exchange (ETDEWEB)

    Maskaly, Karlene Rosera [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2005-06-01

    increasing RMS roughness. Again, the homogenization approximation is able to predict these results. The problem of surface scratches on 1D photonic crystals is also addressed. Although the reflectivity decreases are smaller in this study, up to a 15% change in reflectivity is observed in certain scratched photonic crystal structures. However, this reflectivity change can be significantly decreased by adding a low-index protective coating to the surface of the photonic crystal. Again, application of homogenization theory to these structures confirms its predictive power for this type of imperfection as well. Additionally, the problem of circular pores in 2D photonic crystals is investigated, showing that almost a 50% change in reflectivity can occur for some structures. Furthermore, this study reveals trends that are consistent with the 1D simulations: parameter changes that increase the absolute reflectivity of the photonic crystal will also increase its tolerance to structural imperfections. Finally, experimental reflectance spectra from roughened 1D photonic crystals are compared to the results predicted computationally in this thesis. Both the computed and experimental spectra correlate favorably, validating the findings presented herein.
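Baseline reflectivity calculations for an idealized (unroughened) 1D stack are commonly done with the transfer-matrix method at normal incidence; a sketch under assumed refractive indices, not the thesis's actual code:

```python
import cmath

def reflectance(layers, lam, n0=1.0, ns=1.5):
    """Normal-incidence reflectance of a lossless 1D multilayer on a
    substrate of index ns, via the characteristic-matrix method.
    layers = [(index, thickness), ...], thickness in units of lam."""
    m00, m01, m10, m11 = 1.0, 0.0, 0.0, 1.0
    for n, d in layers:
        delta = 2 * cmath.pi * n * d / lam   # layer phase thickness
        c, s = cmath.cos(delta), cmath.sin(delta)
        a = (c, 1j * s / n, 1j * n * s, c)   # layer characteristic matrix
        m00, m01, m10, m11 = (m00 * a[0] + m01 * a[2], m00 * a[1] + m01 * a[3],
                              m10 * a[0] + m11 * a[2], m10 * a[1] + m11 * a[3])
    b = m00 + m01 * ns
    cc = m10 + m11 * ns
    r = (n0 * b - cc) / (n0 * b + cc)        # amplitude reflection coefficient
    return abs(r) ** 2

# Quarter-wave stack of 8 high/low index pairs at the design wavelength.
lam0 = 1.0
nH, nL = 2.3, 1.5   # assumed indices, for illustration only
stack = [(nH, lam0 / (4 * nH)), (nL, lam0 / (4 * nL))] * 8
R = reflectance(stack, lam0)
```

Such a stack reflects essentially all light at the design wavelength; roughness and scratch studies of the kind described measure how far real structures fall below this ideal value.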

  17. Influence of the Pixel Sizes of Reference Computed Tomography on Single-photon Emission Computed Tomography Image Reconstruction Using Conjugate-gradient Algorithm.

    Science.gov (United States)

    Okuda, Kyohei; Sakimoto, Shota; Fujii, Susumu; Ida, Tomonobu; Moriyama, Shigeru

    The use of the computed tomography (CT) coordinate system as the frame of reference in single-photon emission computed tomography (SPECT) reconstruction is one of the advanced characteristics of the xSPECT reconstruction system. The aim of this study was to reveal the influence of this high-resolution frame of reference on xSPECT reconstruction. A 99mTc line-source phantom and a National Electrical Manufacturers Association (NEMA) image-quality phantom were scanned using the SPECT/CT system. xSPECT reconstructions were performed with reference CT images of different display field-of-view (DFOV) and pixel sizes. The pixel sizes of the reconstructed xSPECT images were close to 2.4 mm, the size at which the projection data were originally acquired, even when the reference CT resolution was varied. The full width at half maximum (FWHM) of the line source, the absolute recovery coefficient, and the background variability of the image-quality phantom were independent of the DFOV size of the reference CT images. The results of this study revealed that the image quality of reconstructed xSPECT images is not influenced by the resolution of the frame of reference used in SPECT reconstruction.
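An FWHM measurement like the line-source one above can be read off a sampled profile by interpolating the two half-maximum crossings; a sketch on a synthetic Gaussian profile (the pixel size and Gaussian width below are made up, not the study's values):

```python
import numpy as np

def fwhm_mm(profile, pixel_mm):
    """FWHM of a single-peaked line profile, in mm, by linear
    interpolation of the two half-maximum crossings."""
    y = np.asarray(profile, dtype=float)
    half = y.max() / 2.0
    idx = np.where(y >= half)[0]
    i0, i1 = idx[0], idx[-1]
    # interpolate the crossing position on each side of the peak
    left = i0 - (y[i0] - half) / (y[i0] - y[i0 - 1])
    right = i1 + (y[i1] - half) / (y[i1] - y[i1 + 1])
    return (right - left) * pixel_mm

# Synthetic Gaussian line-source profile: true FWHM = 2*sqrt(2*ln 2)*sigma.
x = np.arange(64)
sigma = 3.0
profile = np.exp(-0.5 * ((x - 32.0) / sigma) ** 2)
w = fwhm_mm(profile, pixel_mm=2.4)
```

The linear interpolation keeps the estimate sub-pixel accurate, which is what makes the FWHM a usable resolution metric even at a coarse 2.4 mm pixel size.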

  18. Using Volunteer Computing to Study Some Features of Diagonal Latin Squares

    Science.gov (United States)

    Vatutin, Eduard; Zaikin, Oleg; Kochemazov, Stepan; Valyaev, Sergey

    2017-12-01

    This study concerns several features of diagonal Latin squares (DLSs) of small order. The authors suggest an algorithm for computing the minimal and maximal numbers of transversals of DLSs. According to this algorithm, all DLSs of a particular order are generated, and for each square all its transversals and diagonal transversals are constructed. The algorithm was implemented and applied to DLSs of order at most 7 on a personal computer. The experiment for order 8 was performed in the volunteer computing project Gerasim@home. In addition, the problem of finding pairs of orthogonal DLSs of order 10 was considered and reduced to the Boolean satisfiability problem. The obtained problem turned out to be very hard; it was therefore decomposed into a family of subproblems. To solve the problem, the volunteer computing project SAT@home was used. As a result, several dozen pairs of the described kind were found.
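The transversal-counting step can be brute-forced for small orders by trying every choice of one cell per row and column; a sketch (the order-4 square below is one concrete diagonal Latin square chosen for illustration, not taken from the paper):

```python
from itertools import permutations

def count_transversals(square):
    """Count transversals of a Latin square: selections of n cells, one
    per row and one per column, whose n symbols are all distinct.
    Brute force over column permutations, so only feasible for small
    orders (cf. the paper's personal-computer regime of order <= 7)."""
    n = len(square)
    return sum(
        len({square[r][c] for r, c in enumerate(cols)}) == n
        for cols in permutations(range(n))
    )

# An order-4 diagonal Latin square: rows, columns and both main
# diagonals each contain all four symbols.
dls4 = [
    [0, 1, 2, 3],
    [2, 3, 0, 1],
    [3, 2, 1, 0],
    [1, 0, 3, 2],
]
t = count_transversals(dls4)
```

Minimal and maximal transversal counts over all DLSs of an order are then obtained by running this count inside the generation loop; the factorial growth of `permutations` is what pushes order 8 and beyond onto volunteer computing.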

  19. Vertebral Pneumaticity in the Ornithomimosaur Archaeornithomimus (Dinosauria: Theropoda) Revealed by Computed Tomography Imaging and Reappraisal of Axial Pneumaticity in Ornithomimosauria.

    Directory of Open Access Journals (Sweden)

    Akinobu Watanabe

    Full Text Available Among extant vertebrates, pneumatization of postcranial bones is unique to birds, with few known exceptions in other groups. Through reduction in bone mass, this feature is thought to benefit flight capacity in modern birds, but its prevalence in non-avian dinosaurs of variable sizes has generated competing hypotheses on the initial adaptive significance of postcranial pneumaticity. To better understand the evolutionary history of postcranial pneumaticity, studies have surveyed its distribution among non-avian dinosaurs. Nevertheless, the degree of pneumaticity in the basal coelurosaurian group Ornithomimosauria remains poorly known, despite their potential to greatly enhance our understanding of the early evolution of pneumatic bones along the lineage leading to birds. Historically, the identification of postcranial pneumaticity in non-avian dinosaurs has been based on examination of external morphology, and few studies thus far have focused on the internal architecture of pneumatic structures inside the bones. Here, we describe the vertebral pneumaticity of the ornithomimosaur Archaeornithomimus with the aid of X-ray computed tomography (CT) imaging. Complementary examination of external and internal osteology reveals (1) highly pneumatized cervical vertebrae with an elaborate configuration of interconnected chambers within the neural arch and the centrum; (2) anterior dorsal vertebrae with pneumatic chambers inside the neural arch; (3) apneumatic sacral vertebrae; and (4) a subset of proximal caudal vertebrae with limited pneumatic invasion into the neural arch. Comparisons with other theropod dinosaurs suggest that ornithomimosaurs primitively exhibited a plesiomorphic theropod condition for axial pneumaticity that was extended among later taxa, such as Archaeornithomimus and the large-bodied Deinocheirus. This finding corroborates the notion that evolutionary increases in vertebral pneumaticity occurred in parallel among independent lineages of bird

  20. Mobile Learning According to Students of Computer Engineering and Computer Education: A Comparison of Attitudes

    Directory of Open Access Journals (Sweden)

    Deniz Mertkan GEZGIN

    2018-01-01

    Full Text Available Mobile learning has started to perform an increasingly significant role in improving learning outcomes in education. Successful and efficient implementation of m-learning in higher education, as with all educational levels, depends on users' acceptance of this technology. This study focuses on investigating the attitudes of undergraduate students of the Computer Engineering (CENG) and Computer Education and Instructional Technology (CEIT) departments of a Turkish public university towards m-learning from three perspectives: gender, area of study, and mobile device ownership. Using a correlational survey method, a Mobile Learning Attitude Scale (MLAS) was administered to 531 students, analysis of which revealed a positive attitude to m-learning in general. A further investigation of the aforementioned three variables showed a more positive attitude for female students in terms of usability, for CEIT students in terms of advantages, usability and independence, and for those owning a mobile device in terms of usability. An important implication of the findings, among others, is supplementing the Computer Engineering curriculum with elective courses on the fundamentals of mobile learning, and/or the design and development of m-learning software, so as to create, in the long run, more specialized and complementary teams of trained CENG and CEIT graduates in the m-learning sector.

  1. A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class

    Science.gov (United States)

    Yuen, Timothy T.; Robbins, Kay A.

    2014-01-01

    Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…

  2. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    Science.gov (United States)

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies, from non-computer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among older adults with low computer proficiency. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
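Reliability figures like the Cronbach's α values above come from the ratio of summed item variances to total-score variance; a minimal sketch on made-up scores (not the CPQ data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    x = np.asarray(scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Made-up example: three items that rank five respondents identically
# (perfect internal consistency, so alpha should come out as 1.0).
scores = np.array([
    [1, 2, 1],
    [2, 3, 2],
    [3, 4, 3],
    [4, 5, 4],
    [5, 6, 5],
])
a = cronbach_alpha(scores)
```

Real item responses are never perfectly consistent, so observed values such as the CPQ's .98 fall below this ceiling.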

  3. Computational studies on energetic properties of nitrogen-rich ...

    Indian Academy of Sciences (India)

    Computational studies on energetic properties of nitrogen-rich energetic materials with ditetrazoles. Li Xiao-Hong and Zhang Rui-Zhou. College of Physics and Engineering, Henan University of Science and Technology, Luoyang 471 003, China; Luoyang Key Laboratory of Photoelectric Functional Materials, ...

  4. Understanding initial undergraduate expectations and identity in computing studies

    Science.gov (United States)

    Kinnunen, Päivi; Butler, Matthew; Morgan, Michael; Nylen, Aletta; Peters, Anne-Kathrin; Sinclair, Jane; Kalvala, Sara; Pesonen, Erkki

    2018-03-01

    There is growing appreciation of the importance of understanding the student perspective in Higher Education (HE) at both institutional and international levels. This is particularly important in Science, Technology, Engineering and Mathematics subjects such as Computer Science (CS) and Engineering, in which industry needs are high but so are student dropout rates. An important factor to consider is the management of students' initial expectations of university study and career. This paper reports on a study of CS first-year students' expectations across three European countries, using qualitative data from student surveys and essays. Expectation is examined from both short-term (topics to be studied) and long-term (career goals) perspectives. Tackling these issues will help paint a picture of computing education through students' eyes and explore their vision of its role, and their own, in society. It will also help educators prepare students more effectively for university study and improve the student experience.

  5. Availability and Overlap of Quality Computer Science Journal Holdings in Selected University Libraries in Malaysia

    OpenAIRE

    Zainab, A.N.; Ng, S.L.

    2003-01-01

    The study reveals the availability status of quality journals in the field of computer science held in the libraries of the University of Malaya, (UM), University of Science Malaysia (USM), University of Technology Malaysia (UTM), National University of Malaysia (UKM) and University Putra Malaysia (UPM). These universities are selected since they offer degree programmes in computer science. The study also investigates the degree of overlaps and unique titles in the five libraries. The Univers...

  6. Quantitative proteomic study of Aspergillus fumigatus secretome revealed deamidation of secretory enzymes.

    Science.gov (United States)

    Adav, Sunil S; Ravindran, Anita; Sze, Siu Kwan

    2015-04-24

    Aspergillus sp. plays an essential role in lignocellulosic biomass recycling and is also exploited as cell factories for the production of industrial enzymes. This study profiled the secretome of Aspergillus fumigatus when grown with cellulose, xylan and starch by high throughput quantitative proteomics using isobaric tags for relative and absolute quantification (iTRAQ). Post translational modifications (PTMs) of proteins play a critical role in protein functions. However, our understanding of the PTMs in secretory proteins is limited. Here, we present the identification of PTMs such as deamidation of secreted proteins of A. fumigatus. This study quantified diverse groups of extracellular secreted enzymes and their functional classification revealed cellulases and glycoside hydrolases (32.9%), amylases (0.9%), hemicellulases (16.2%), lignin degrading enzymes (8.1%), peptidases and proteases (11.7%), chitinases, lipases and phosphatases (7.6%), and proteins with unknown function (22.5%). The comparison of quantitative iTRAQ results revealed that cellulose and xylan stimulates expression of specific cellulases and hemicellulases, and their abundance level as a function of substrate. In-depth data analysis revealed deamidation as a major PTM of key cellulose hydrolyzing enzymes like endoglucanases, cellobiohydrolases and glucosidases. Hemicellulose degrading endo-1,4-beta-xylanase, monosidases, xylosidases, lignin degrading laccase, isoamyl alcohol oxidase and oxidoreductases were also found to be deamidated. The filamentous fungi play an essential role in lignocellulosic biomass recycling and fungal strains belonging to Aspergillus were also exploited as cell factories for the production of organic acids, pharmaceuticals, and industrially important enzymes. In this study, extracellular proteins secreted by thermophilic A. fumigatus when grown with cellulose, xylan and starch were profiled using isobaric tags for relative and absolute quantification (iTRAQ) by

  7. Study on cranial computed tomography in infants and children with central nervous system disorders, 2

    International Nuclear Information System (INIS)

    Kumanomidou, Yoshiaki

    1980-01-01

    110 patients with cerebral palsy were studied by cranial computed tomography (CT) and electroencephalography (EEG), with the following results: 1) Abnormal brain findings on CT were present in 69% of the spastic quadriplegia type, 75% of the spastic hemiplegia type, 23% of the athetotic type and 50% of the mixed type. 2) Most patients with spastic quadriplegia showed diffuse cerebral atrophy on CT, and patients with spastic hemiplegia mostly showed hemispheric cerebral atrophy contralateral to the motor paralysis. Most patients with athetosis had normal CT findings, but a few showed slight diffuse cerebral atrophy. 3) The more severe the patients' mental retardation, the more frequent and more severe were the CT abnormalities. 4) Patients with epileptic seizures showed CT abnormalities more often than patients without seizures. 5) There was a good correlation between abnormal background activity on EEG and CT abnormalities, and their laterality coincided in most cases. 6) The side of seizure discharges on EEG matched the side of CT abnormalities in 1/3 to 1/2 of patients, but the localization of seizure discharges corresponded to that of the CT abnormalities in only 11% of cases. (author)

  8. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    Science.gov (United States)

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  9. Computers and internet in dental education system of Kerala, South India: A multicentre study

    Directory of Open Access Journals (Sweden)

    Kanakath Harikumar

    2015-01-01

    Computers and the internet have exerted a tremendous effect on dental education programs all over the world. A multicentre study was done to assess trends in computer and internet usage among dental students and faculty members across the South Indian state of Kerala. A total of 347 subjects participated in the study. All participants were highly competent in the use of computers and the internet. 72.3% of the study subjects preferred hard-copy textbooks to PDF-format books. 81.3% thought that the internet was a useful adjunct to dental education. 73.8% opined that computers and the internet could never be a replacement for conventional classroom teaching. Efforts should be made to provide greater infrastructure with regard to computers and the internet, such as Wi-Fi and free, unlimited internet access for all students and faculty members.

  10. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    Integration of case study approach, project design and computer modeling in managerial accounting education ... Journal of Fundamental and Applied Sciences ... in the Laboratory of Management Accounting and Controlling Systems at the ...

  11. Analysis of Climatic and Environmental Changes Using CLEARS Web-GIS Information-Computational System: Siberia Case Study

    Science.gov (United States)

    Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.

    2011-12-01

    Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using a specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, a computational kernel, a specialized web portal implementing web mapping application logic, and a graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing and visualization. At present a number of georeferenced datasets are available for processing, including two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 and ERA Interim Reanalysis, meteorological observation data for the territory of the former USSR, and others. Firstly, using the functionality of the computational kernel and approved statistical methods, it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of the 20th and beginning of the 21st centuries are provided by the ERA-40/ERA Interim Reanalyses and the APHRODITE JMA Reanalysis, respectively; namely, those reanalyses are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed on the basis of the developed information-computational system reveals fine spatial and temporal details in heterogeneous patterns obtained for the region earlier. Dynamics of bioclimatic indices determining climate change impact on the structure and functioning of regional vegetation cover was investigated as well. The analysis shows significant positive trends in growing season length, accompanied by a statistically significant increase in the sum of growing degree days and total

  12. Acting without seeing: Eye movements reveal visual processing without awareness

    Science.gov (United States)

    Spering, Miriam; Carrasco, Marisa

    2015-01-01

    Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. We review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movements. Such dissociations reveal situations in which eye movements are sensitive to particular visual features that fail to modulate perceptual reports. We also discuss neurophysiological, neuroimaging and clinical studies supporting the role of subcortical pathways for visual processing without awareness. Our review links awareness to perceptual-eye movement dissociations and furthers our understanding of the brain pathways underlying vision and movement with and without awareness. PMID:25765322

  13. Two Studies Examining Argumentation in Asynchronous Computer Mediated Communication

    Science.gov (United States)

    Joiner, Richard; Jones, Sarah; Doherty, John

    2008-01-01

    Asynchronous computer mediated communication (CMC) would seem to be an ideal medium for supporting development in student argumentation. This paper investigates this assumption through two studies. The first study compared asynchronous CMC with face-to-face discussions. The transactional and strategic level of the argumentation (i.e. measures of…

  14. Emerging Trends in Heart Valve Engineering: Part IV. Computational Modeling and Experimental Studies.

    Science.gov (United States)

    Kheradvar, Arash; Groves, Elliott M; Falahatpisheh, Ahmad; Mofrad, Mohammad K; Hamed Alavi, S; Tranquillo, Robert; Dasi, Lakshmi P; Simmons, Craig A; Jane Grande-Allen, K; Goergen, Craig J; Baaijens, Frank; Little, Stephen H; Canic, Suncica; Griffith, Boyce

    2015-10-01

    In this final portion of an extensive review of heart valve engineering, we focus on the computational methods and experimental studies related to heart valves. The discussion begins with a thorough review of computational modeling and the governing equations of fluid and structure interaction. We then move on to multiscale and disease-specific modeling. Finally, advanced methods related to in vitro testing of heart valves are reviewed. This section of the review series is intended to illustrate the application of computational methods and experimental studies, and their interrelation, for studying heart valves.
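    As a hedged illustration of the governing equations such a review typically covers, a generic incompressible fluid-structure interaction formulation (not the review's specific notation) can be written as:

```latex
% Fluid: incompressible Navier--Stokes on the fluid domain
\rho_f\!\left(\frac{\partial \mathbf{u}}{\partial t}
  + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u},
\qquad \nabla\cdot\mathbf{u} = 0

% Structure: elastodynamics for the valve displacement \mathbf{d}
\rho_s \frac{\partial^{2}\mathbf{d}}{\partial t^{2}}
  = \nabla\cdot\boldsymbol{\sigma}_s(\mathbf{d})

% Interface conditions: velocity continuity and traction balance
\mathbf{u} = \frac{\partial \mathbf{d}}{\partial t},
\qquad \boldsymbol{\sigma}_f\,\mathbf{n} = \boldsymbol{\sigma}_s\,\mathbf{n}
```

    Here the interface conditions couple the fluid velocity to the moving valve leaflet and balance the stresses across the shared boundary.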

  15. Pulmonary artery aneurysm in Behcet's disease: helical computed tomography study

    International Nuclear Information System (INIS)

    Munoz, J.; Caballero, P.; Olivera, M. J.; Cajal, M. L.; Caniego, J. L.

    2000-01-01

    Behcet's disease is a vasculitis of unknown etiology that affects arteries and veins of different sizes and can be associated with pulmonary artery aneurysms. We report the case of a patient with Behcet's disease and a pulmonary artery aneurysm who was studied by means of plain chest X-ray, helical computed tomography and pulmonary arteriography. Helical computed tomography is a reliable technique for the diagnosis and follow-up of these patients. (Author) 9 refs

  16. Student Engagement with Computer-Generated Feedback: A Case Study

    Science.gov (United States)

    Zhang, Zhe

    2017-01-01

    In order to benefit from feedback on their writing, students need to engage effectively with it. This article reports a case study on student engagement with computer-generated feedback, known as automated writing evaluation (AWE) feedback, in an EFL context. Differing from previous studies that explored commercially available AWE programs, this…

  17. Computing with memory for energy-efficient robust systems

    CERN Document Server

    Paul, Somnath

    2013-01-01

    This book analyzes energy and reliability as major challenges faced by designers of computing frameworks in the nanometer technology regime.  The authors describe the existing solutions to address these challenges and then reveal a new reconfigurable computing platform, which leverages high-density nanoscale memory for both data storage and computation to maximize the energy-efficiency and reliability. The energy and reliability benefits of this new paradigm are illustrated and the design challenges are discussed. Various hardware and software aspects of this exciting computing paradigm are de

  18. Using Computational and Mechanical Models to Study Animal Locomotion

    Science.gov (United States)

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  19. Doctors' experience with handheld computers in clinical practice: qualitative study.

    Science.gov (United States)

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-05-15

    To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. 54 doctors who did or did not use handheld computers. Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care.

  20. Students and Taxes: a Privacy-Preserving Study Using Secure Computation

    Directory of Open Access Journals (Sweden)

    Bogdanov Dan

    2016-07-01

    We describe the use of secure multi-party computation for performing a large-scale privacy-preserving statistical study on real government data. In 2015, statisticians from the Estonian Center of Applied Research (CentAR) conducted a big data study to look for correlations between working during university studies and failing to graduate in time. The study was conducted by linking the database of individual tax payments from the Estonian Tax and Customs Board and the database of higher education events from the Ministry of Education and Research. Data collection, preparation and analysis were conducted using the Sharemind secure multi-party computation system, which provided end-to-end cryptographic protection to the analysis. Using ten million tax records and half a million education records in the analysis, this is the largest cryptographically private statistical study ever conducted on real data.
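    The basic primitive behind such secure multi-party computation systems is secret sharing. A minimal sketch of additive secret sharing (an illustration only, not the actual protocol of the system named in the record; the income values are invented):

```python
# Minimal sketch of additive secret sharing: each record value is split
# into random shares so no single party learns the underlying data, yet
# sums can be computed on shares and only the aggregate is revealed.
import random

MOD = 2 ** 32  # all share arithmetic happens in the ring Z_MOD

def share(value, n_parties=3):
    """Split value into n_parties random shares that sum to value mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    return sum(shares) % MOD

records = [1200, 850, 990]               # hypothetical per-person values
sharings = [share(v) for v in records]   # each record split across 3 parties

# Each party sums its own column of shares; no party sees raw values,
# yet the reconstructed aggregate equals the true total.
party_sums = [sum(col) % MOD for col in zip(*sharings)]
total = reconstruct(party_sums)          # equals sum(records)
```

    The linearity of the scheme is what makes aggregate statistics computable without decrypting any individual record.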

  1. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: A cross-sectional study

    Science.gov (United States)

    2010-01-01

    Background The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Methods Mailed survey with nationally representative sample of 12 to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. Results To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position, 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds (p ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability. PMID:20064250
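    A hedged sketch of the kind of association the study's logistic regression quantifies: with a single binary predictor, the fitted odds ratio equals the sample odds ratio from a 2x2 table. All counts below are invented for illustration, not the survey's results.

```python
# Odds ratio from a hypothetical 2x2 table: instruction received vs.
# computer-associated complaints reported. Counts are invented.
import math

# rows: whether respondents received ergonomic instructions
# cols: whether they reported computer-associated complaints
instructed     = {"complaints": 120, "no_complaints": 380}
not_instructed = {"complaints": 150, "no_complaints": 250}

odds_instructed = instructed["complaints"] / instructed["no_complaints"]
odds_not        = not_instructed["complaints"] / not_instructed["no_complaints"]

odds_ratio = odds_instructed / odds_not  # < 1 would suggest a protective association
log_or = math.log(odds_ratio)            # the logistic-regression coefficient for this predictor
```

    With these invented counts the odds ratio falls below 1, i.e. the instructed group reports fewer complaints; a full analysis would additionally adjust for covariates such as computer use time.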

  2. Computer vision syndrome: a study of knowledge and practices in university students.

    Science.gov (United States)

    Reddy, S C; Low, C K; Lim, Y P; Low, L L; Mardina, F; Nursaleha, M P

    2013-01-01

    Computer vision syndrome (CVS) is a condition in which a person experiences one or more eye symptoms as a result of prolonged work on a computer. The aims were to determine the prevalence of CVS symptoms and the knowledge and practices of computer use among students studying in different universities in Malaysia, and to evaluate the association of various factors in computer use with the occurrence of symptoms. In a cross-sectional questionnaire survey study, data was collected from college students regarding demography, use of spectacles, duration of daily continuous computer use, symptoms of CVS, preventive measures taken to reduce the symptoms, use of a radiation filter on the computer screen, and lighting in the room. A total of 795 students, aged between 18 and 25 years, from five universities in Malaysia were surveyed. The prevalence of CVS symptoms (one or more) was found to be 89.9%; the most disturbing symptom was headache (19.7%) followed by eye strain (16.4%). Students who used a computer for more than 2 hours per day experienced significantly more symptoms of CVS (p=0.0001). Looking at far objects in between work was significantly (p=0.0008) associated with a lower frequency of CVS symptoms. The use of a radiation filter on the screen (p=0.6777) did not help in reducing CVS symptoms. Ninety percent of university students in Malaysia experienced symptoms related to CVS, seen more often in those who used a computer for more than 2 hours continuously per day. © NEPjOPH.

  3. Excessive computer game playing: evidence for addiction and aggression?

    Science.gov (United States)

    Grüsser, S M; Thalemann, R; Griffiths, M D

    2007-04-01

    Computer games have become an ever-increasing part of many adolescents' day-to-day lives. Coupled with this phenomenon, reports of excessive gaming (computer game playing), labeled "computer/video game addiction", have been discussed in the popular press as well as in recent scientific research. The aim of the present study was to investigate the addictive potential of gaming as well as the relationship between excessive gaming and aggressive attitudes and behavior. A sample comprising 7069 gamers answered two questionnaires online. The data revealed that 11.9% of participants (840 gamers) fulfilled diagnostic criteria of addiction concerning their gaming behavior, while there is only weak evidence for the assumption that aggressive behavior is interrelated with excessive gaming in general. The results of this study support the assumption that playing games even without monetary reward can meet the criteria of addiction. Hence, the addictive potential of gaming should be taken into consideration with regard to prevention and intervention.

  4. Iranian EFL Teachers' Sense of Professional Identity and their Computer Literacy

    Directory of Open Access Journals (Sweden)

    Toktam Abtahi

    2016-03-01

    This study examines Iranian EFL teachers' sense of professional identity and their computer literacy. To this end, 718 EFL teachers from different cities in Iran filled out job satisfaction, occupational commitment, and computer literacy questionnaires. SPSS software was employed to summarize the collected data. Independent-samples t-tests and Pearson product-moment correlations were run to check the level of significance. For qualitative data collection, five open-ended questions were added to the end of the job satisfaction questionnaire. The obtained answers were categorized and the frequency of each category was calculated. The results revealed that computer literacy has a significant relation with continuance commitment, job satisfaction, and gender. The results further suggested that teachers' computer literacy provided an encouraging base for their professional identity.
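    As a hedged illustration of the Pearson product-moment correlation the study ran, a from-scratch computation on invented score pairs (not the study's data):

```python
# Pearson product-moment correlation computed from first principles.
# The literacy/commitment score pairs below are hypothetical.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

literacy   = [55, 62, 70, 48, 81, 66]  # hypothetical computer-literacy scores
commitment = [50, 60, 72, 45, 78, 64]  # hypothetical continuance-commitment scores
r = pearson(literacy, commitment)      # near +1: scores rise together
```

    A value of r near +1, as with these toy pairs, is the pattern consistent with the reported positive relation between computer literacy and commitment.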

  5. How Computer Music Modeling and Retrieval Interface with Music-And-Meaning Studies

    DEFF Research Database (Denmark)

    Grund, Cynthia M.

    2007-01-01

    Inspired by the interest generated as the result of a panel discussion dealing with cross-disciplinarity and computer music modeling and retrieval (CMMR) at CMMR 2005 - "Play!" - in Pisa, a panel discussion on current issues of a cross-disciplinary character has been organized for ICMC07/CMMR 2007. Eight panelists will be dealing with two questions: a) What are current issues within music-and-meaning studies, the examination of which mandates the development of new techniques within computer music modeling and retrieval? and b) Do current techniques within computer music modeling and retrieval give rise to new questions within music-and-meaning studies?

  6. CoCo the colorful history of Tandy's underdog computer

    CERN Document Server

    Pitre, Boisy G

    2013-01-01

    CoCo: The Colorful History of Tandy's Underdog Computer is the first book to document the complete history of the Tandy Color Computer (CoCo), a popular 8-bit PC series from the 1980s that competed against the era's biggest names, including the Apple II, IBM PC, and Commodore 64. The book takes you inside the interesting stories and people behind this unique, underdog computer.Both noted computer science and technology advocates, authors Pitre and Loguidice reveal the story of a pivotal period in the home computing revolution from the perspective of Tandy's CoCo. As these computers were sold i

  7. Comparative Analysis on the Utilization of Computers | Nkata ...

    African Journals Online (AJOL)

    The findings reveal, among other things, that the extent of computer usability in the two universities differed significantly. It was concluded that the level of computer utilization at UNIPORT is higher than at RUST. It was recommended that periodic pre- and post-qualification seminars be organized for the 2 university ...

  8. Computed tomography in renal trauma

    International Nuclear Information System (INIS)

    Brueck, W.; Eisenberger, F.; Buck, J.

    1981-01-01

    In a group of 19 patients suffering from flank trauma and gross hematuria, the diagnostic value of angiography was compared with that of computed tomography. The cases that underwent both tests were found to have the same diagnosis of rupture of the kidney. Typical CT findings in kidney rupture are demonstrated. Whereas angiography presents an exact picture of the arterial system of the kidney, including its injuries, computed tomography reveals the extent of organ lesions by showing extra- and intrarenal hematomas. If surgery is planned, angiography is still mandatory, whereby the indication is largely determined by the clinical findings. Computed tomography, as a non-invasive method, is equally suitable for follow-up. (orig.) [de]

  9. Synthesis, characterization and computational study of the newly synthetized sulfonamide molecule

    Science.gov (United States)

    Murthy, P. Krishna; Suneetha, V.; Armaković, Stevan; Armaković, Sanja J.; Suchetan, P. A.; Giri, L.; Rao, R. Sreenivasa

    2018-02-01

    A new compound, N-(2,5-dimethyl-4-nitrophenyl)-4-methylbenzenesulfonamide (NDMPMBS), has been derived from 2,5-dimethyl-4-nitroaniline and 4-methylbenzene-1-sulfonyl chloride. The structure was characterized by SCXRD studies and spectroscopic tools. The compound crystallized in the monoclinic crystal system with the P21/c space group: a = 10.0549 Å, b = 18.967 Å, c = 8.3087 Å, β = 103.18° and Z = 4. The type and nature of the intermolecular interactions in the crystal state, investigated by 3D Hirshfeld surfaces and 2D fingerprint plots, revealed that the title compound is stabilized by several interactions. The structural and electronic properties of the title compound have been calculated at the DFT/B3LYP/6-311++G(d,p) level of theory. Computationally obtained spectral data were compared with experimental results, showing excellent mutual agreement. Assignment of each vibrational wave number was done on the basis of the potential energy distribution (PED). Investigation of local reactivity descriptors encompassed visualization of molecular electrostatic potential (MEP) and average local ionization energy (ALIE) surfaces, visualization of Fukui functions, natural bond orbital (NBO) analysis, bond dissociation energies for hydrogen abstraction (H-BDE) and radial distribution functions (RDF) after molecular dynamics (MD) simulations. MD simulations were also used to investigate the interaction of the NDMPMBS molecule with the 1WKR and 3ETT proteins.

  10. Studies on reversibility of cerebral atrophy in alcoholics by computed tomography

    International Nuclear Information System (INIS)

    Nakamura, Kiyoshi; Kikuchi, Yoshito; Sanga, Kenji; Nakamura, Kazuyoshi; Kawamura, Toshiaki; Domon, Yuji; Mitamura, Akira; Hayashi, Yu; Ogata, Motoyo.

    1987-01-01

    Only sparse data exist concerning objectively quantified reversibility of cerebral atrophy (CA) in alcoholics by computed tomography (CT). This study explored reversible CA changes from measurements of the area seen on repeated CT scans, acquired over a period of 6 - 191 weeks in 44 alcoholics. In the group of abstinent alcoholics, significant recovery of CA was observed in all 6 regions of interest (ROI) at the second CT scan, as compared with the first CT scan. Measurements obtained from CT revealed aggravation of CA in the bilateral anterior horns, right cella media, and frontal subarachnoid space in the group of alcoholics who continued drinking. The rate of recovery was significantly higher the longer the interval between the first and second CT scans and the longer the duration of abstinence. For the frontal area, recovery tended to be better when the degree of CA at the first CT scan was more severe. There was no significant correlation between the rate of recovery and age or the duration of drinking habit in any of the ROIs. The onset of CA recovery after abstinence is likely to depend on the brain site; the cella media seemed to be the first to recover from CA. (Namekawa, K.)

  11. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

    Effective implementation of a pallet pooling system needs a strong information platform for support. Through analysis of existing pallet pooling information platforms (PPIP), the paper points out that existing studies of PPIP are mainly based on traditional IT infrastructures and technologies, which have software, hardware, resource utilization, and process restrictions. Because the advantages of cloud computing technology, such as strong computing power, high flexibility, and low cost, meet the requirements of a PPIP well, this paper gives a PPIP architecture of two parts based on cloud computing: the user client and the cloud services. The cloud services include three layers: IaaS, PaaS, and SaaS. A method for deploying a PPIP based on cloud computing is proposed finally.

  12. Hydrogen/deuterium exchange mass spectrometry and computational modeling reveal a discontinuous epitope of an antibody/TL1A interaction.

    Science.gov (United States)

    Huang, Richard Y-C; Krystek, Stanley R; Felix, Nathan; Graziano, Robert F; Srinivasan, Mohan; Pashine, Achal; Chen, Guodong

    2018-01-01

    TL1A, a tumor necrosis factor-like cytokine, is a ligand for the death domain receptor DR3. TL1A, upon binding to DR3, can stimulate lymphocytes and trigger secretion of proinflammatory cytokines. Therefore, blockade of the TL1A/DR3 interaction may be a potential therapeutic strategy for autoimmune and inflammatory diseases. Recently, the anti-TL1A monoclonal antibody 1 (mAb1), with strong potency in blocking the TL1A/DR3 interaction, was identified. Here, we report on the use of hydrogen/deuterium exchange mass spectrometry (HDX-MS) to obtain molecular-level details of mAb1's binding epitope on TL1A. HDX coupled with electron-transfer dissociation MS provided residue-level epitope information. The HDX dataset, in combination with solvent accessible surface area (SASA) analysis and computational modeling, revealed a discontinuous epitope within the predicted interaction interface of TL1A and DR3. The epitope regions span a distance within the approximate size of the variable domains of mAb1's heavy and light chains, indicating that mAb1 uses a unique mechanism of action to block the TL1A/DR3 interaction.
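    For context, HDX-MS epitope mapping rests on comparing per-peptide deuterium uptake with and without the antibody bound: regions protected from exchange upon binding are candidate epitope segments. A minimal sketch of the standard fractional-uptake calculation (the masses below are illustrative, not data from this study):

    ```python
    # Fractional deuterium uptake of a peptide in an HDX-MS experiment:
    #   uptake = (m_t - m_0) / (m_full - m_0)
    # where m_0 is the undeuterated centroid mass, m_t the centroid mass
    # at exchange time t, and m_full the maximally deuterated control.
    # Values are invented for illustration.

    def fractional_uptake(m_t: float, m_0: float, m_full: float) -> float:
        """Return deuterium uptake as a fraction of the fully exchanged control."""
        return (m_t - m_0) / (m_full - m_0)

    # Epitope regions show reduced uptake when the antibody is bound,
    # because binding protects backbone amides from exchange.
    free = fractional_uptake(m_t=1012.8, m_0=1010.0, m_full=1014.0)   # 0.7
    bound = fractional_uptake(m_t=1011.2, m_0=1010.0, m_full=1014.0)  # 0.3
    assert free > bound  # protection upon binding flags a candidate epitope
    ```
    
    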

  13. Computed tomographic study of hormone-secreting microadenomas

    International Nuclear Information System (INIS)

    Hemminghytt, S.; Kalkhoff, R.K.; Daniels, D.L.; Williams, A.L.; Grogan, J.P.; Haughton, V.M.

    1983-01-01

    A review was made of the computed tomographic (CT) studies of 33 patients with hormone-secreting microadenomas that had been verified by transsphenoidal surgery and endocrinologic evaluation. In previous studies in small series of patients, the CT appearance of pituitary microadenomas has been reported as hypodense, isodense, and hyperdense. In this study, CT showed a region of diminished enhancement and usually an enlarged pituitary gland in cases of prolactin-secreting adenomas. HGH- or ACTH-secreting adenomas were less consistently hypodense. It is concluded that hypodensity and enlargement of the pituitary gland are the most useful criteria for identification of microadenomas. Some technical factors that may affect the CT appearance of microadenomas and lead to conflicting reports are discussed

  14. Computed tomography of stress fracture

    International Nuclear Information System (INIS)

    Murcia, M.; Brennan, R.E.; Edeiken, J.

    1982-01-01

    An athletic young female developed gradual onset of pain in the right leg. Plain radiographs demonstrated solid periosteal reaction in the tibia compatible with stress fracture. She stopped sport activities but her pain continued. Follow-up radiographs of the tibia revealed changes suspicious for osteoid osteoma. Computed tomography (CT) scan demonstrated periosteal reaction, but in addition, lucent fracture lines in the tibial cortex were evident. CT obviated the need for more invasive diagnostic procedures in this patient. In selected cases CT may be useful to confirm the diagnosis of stress fracture when plain radiographic or routine tomographic studies are not diagnostic. (orig.)

  16. Applying standardized uptake values in gallium-67-citrate single-photon emission computed tomography/computed tomography studies and their correlation with blood test results in representative organs.

    Science.gov (United States)

    Toriihara, Akira; Daisaki, Hiromitsu; Yamaguchi, Akihiro; Yoshida, Katsuya; Isogai, Jun; Tateishi, Ukihide

    2018-05-21

    Recently, semiquantitative analysis using the standardized uptake value (SUV) has been introduced in bone single-photon emission computed tomography/computed tomography (SPECT/CT). Our purposes were to apply an SUV-based semiquantitative analytic method to gallium-67 (67Ga)-citrate SPECT/CT and to evaluate the correlation between the SUV of physiological uptake and blood test results in representative organs. The accuracy of the semiquantitative method was validated using a National Electrical Manufacturers Association body phantom study (sphere-to-background radioactivity ratio of 4:1). Thereafter, 59 patients (34 male and 25 female; mean age, 66.9 years) who had undergone 67Ga-citrate SPECT/CT were retrospectively enrolled in the study. A mean SUV of physiological uptake was calculated for the following organs: the lungs, right atrium, liver, kidneys, spleen, gluteal muscles, and bone marrow. The correlation between physiological uptake and blood test results was evaluated using Pearson's correlation coefficient. The phantom study revealed only a 1% error between theoretical and actual SUVs in the background, suggesting sufficient accuracy of the scatter and attenuation corrections. However, a partial volume effect could not be overlooked, particularly in small spheres with a diameter of less than 28 mm. The highest mean SUV was observed in the liver (range: 0.44-4.64), followed by bone marrow (range: 0.33-3.60), spleen (range: 0.52-2.12), and kidneys (range: 0.42-1.45). There was no significant correlation between hepatic uptake and liver function, renal uptake and renal function, or bone marrow uptake and blood cell count (P>0.05). The physiological uptake in 67Ga-citrate SPECT/CT can be represented as SUVs, which are not significantly correlated with corresponding blood test results.
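    For reference, the body-weight-normalized SUV used in such semiquantitative analyses is conventionally defined as the tissue activity concentration divided by the injected dose per unit body weight. A minimal sketch with invented numbers (not values from the study); assuming a tissue density of 1 g/mL, with concentration in kBq/mL, dose in MBq and weight in kg, the unit factors cancel:

    ```python
    # Body-weight-normalized standardized uptake value (SUV):
    #   SUV = tissue activity concentration / (injected dose / body weight)
    # With concentration in kBq/mL, dose in MBq, and weight in kg
    # (and tissue density ~1 g/mL), the factors of 1000 cancel.

    def suv(concentration_kbq_ml: float, dose_mbq: float, weight_kg: float) -> float:
        """Return the body-weight-normalized SUV (dimensionless)."""
        return concentration_kbq_ml * weight_kg / dose_mbq

    # e.g. a liver ROI at 3.5 kBq/mL after a 185 MBq injection, 70 kg patient
    print(round(suv(3.5, 185.0, 70.0), 2))  # 1.32
    ```
    
    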

  17. A Computer Game-Based Method for Studying Bullying and Cyberbullying

    Science.gov (United States)

    Mancilla-Caceres, Juan F.; Espelage, Dorothy; Amir, Eyal

    2015-01-01

    Even though previous studies have addressed the relation between face-to-face bullying and cyberbullying, none have studied both phenomena simultaneously. In this article, we present a computer game-based method to study both types of peer aggression among youth. Study participants included fifth graders (N = 93) in two U.S. Midwestern middle…

  18. Learning Support Assessment Study of a Computer Simulation for the Development of Microbial Identification Strategies

    Directory of Open Access Journals (Sweden)

    Tristan E. Johnson

    2009-12-01

    Full Text Available This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification, and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using a think-aloud protocol and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage (how the students used the software features) and Problem-Solving Strategy Development (the strategy level students started with and the skill level they achieved when they completed their use of the simulation). Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated improved performance from their repeated use of the simulation.

  19. Biomechanical effects of mobile computer location in a vehicle cab.

    Science.gov (United States)

    Saginus, Kyle A; Marklin, Richard W; Seeley, Patricia; Simoneau, Guy G; Freier, Stephen

    2011-10-01

    The objective of this research is to determine the best location to place a conventional mobile computer supported by a commercially available mount in a light truck cab. U.S. and Canadian electric utility companies are in the process of integrating mobile computers into their fleet vehicle cabs. There are no publications on the effect of mobile computer location in a vehicle cab on biomechanical loading, performance, and subjective assessment. The authors tested four locations of mobile computers in a light truck cab in a laboratory study to determine how location affected muscle activity of the lower back and shoulders; joint angles of the shoulders, elbows, and wrist; user performance; and subjective assessment. A total of 22 participants were tested in this study. Placing the mobile computer closer to the steering wheel reduced low back and shoulder muscle activity. Joint angles of the shoulders, elbows, and wrists were also closer to neutral angle. Biomechanical modeling revealed substantially less spinal compression and trunk muscle force. In general, there were no practical differences in performance between the locations. Subjective assessment indicated that users preferred the mobile computer to be as close as possible to the steering wheel. Locating the mobile computer close to the steering wheel reduces risk of injuries, such as low back pain and shoulder tendonitis. Results from the study can guide electric utility companies in the installation of mobile computers into vehicle cabs. Results may also be generalized to other industries that use trucklike vehicles, such as construction.

  20. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    Energy Technology Data Exchange (ETDEWEB)

    DOE Office of Science, Biological and Environmental Research Program Office (BER),

    2009-09-30

    In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing, combined with new requirements for collaborative data manipulation and analysis, will demand ever-increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  1. Reverse logistics system planning for recycling computer hardware: A case study

    Science.gov (United States)

    Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar

    2014-09-01

    This paper describes the modeling and simulation of reverse logistics networks for the collection of used computers at a company in Selangor. The study focuses on the design of a reverse logistics network for a used-computer recycling operation. The simulation model presented in this work allows the user to analyze the future performance of the network and to understand the complex relationships between the parties involved. The findings from the simulation suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed using the Arena simulation package.
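    The claim that such a model computes processing time and resource utilization in a predictable way can be illustrated with a generic single-server discrete-event sketch (plain Python, not the Arena model from the study; the arrival and service times below are invented):

    ```python
    # Minimal discrete-event sketch of a single-station recycling queue,
    # reporting average flow (processing) time and resource utilization.
    # Generic illustration only; not the model described in the paper.

    def simulate(arrivals, service_time):
        """arrivals: sorted arrival times; one server, FIFO discipline."""
        server_free = 0.0
        busy = 0.0
        flow_times = []
        for t in arrivals:
            start = max(t, server_free)          # wait if the server is busy
            server_free = start + service_time   # service completes here
            busy += service_time
            flow_times.append(server_free - t)   # waiting + service time
        makespan = server_free - arrivals[0]
        return sum(flow_times) / len(flow_times), busy / makespan

    avg_flow, utilization = simulate([0, 1, 2, 3, 4], service_time=1.5)
    print(avg_flow, utilization)  # 2.5 1.0 (server saturated at this load)
    ```
    
    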

  2. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well-studied category of regulatory elements is the category of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research, and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations since, although the function of enhancers is established, their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we comprehensively survey over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  3. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    Science.gov (United States)

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience, approaching subjective behavior as the result of mental computations instantiated in the brain, to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.
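    The abstract's point that confidence is an objective statistical quantity can be made concrete with one standard formalization (a generic sketch, not necessarily the authors' model): for a two-alternative decision with log-likelihood ratio L for the evidence and a uniform prior, the posterior probability that the chosen hypothesis is correct is a logistic function of |L|:

    ```python
    # Confidence as the posterior probability that the chosen of two
    # hypotheses is correct, given log-likelihood ratio L and prior 0.5:
    #   P(correct | choice) = 1 / (1 + exp(-|L|))
    # Generic statistical sketch for illustration.
    import math

    def decision_confidence(log_likelihood_ratio: float) -> float:
        """Posterior probability that the chosen hypothesis is correct."""
        return 1.0 / (1.0 + math.exp(-abs(log_likelihood_ratio)))

    print(decision_confidence(0.0))            # 0.5 (no evidence: chance level)
    print(round(decision_confidence(2.0), 3))  # 0.881 (strong evidence)
    ```
    
    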

  4. Natural Carbonized Sugar as a Low-Temperature Ammonia Sensor Material: Experimental, Theoretical, and Computational Studies.

    Science.gov (United States)

    Ghule, Balaji G; Shaikh, Shoyebmohamad; Ekar, Satish U; Nakate, Umesh T; Gunturu, Krishna Chaitanya; Shinde, Nanasaheb M; Naushad, Mu; Kim, Kwang Ho; O'Dwyer, Colm; Mane, Rajaram S

    2017-12-13

    Carbonized sugar (CS) has been synthesized via microwave-assisted carbonization of market-quality tabletop sugar, a synthesis method chosen for being simple, cost-effective, and eco-friendly. The as-prepared CS has been characterized for its morphology, phase purity, type of porosity, pore-size distribution, and so on. The gas-sensing properties of CS for various oxidizing and reducing gases are demonstrated at ambient temperature, where we observe good selectivity toward ammonia among other gases. The highest ammonia response (50%) of a CS-based sensor was noted at 80 °C for a 100 ppm concentration. The response and recovery times of the CS sensor are 180 and 216 s, respectively. The ammonia-sensing behavior is explained through a plausible theoretical mechanism, which is further supported by computational modeling using density functional theory. The effect of relative humidity on the CS sensor has also been studied at ambient temperature, showing responses of 16% and 62% at the minimum (20%) and maximum (100%) relative humidity, respectively.
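    For context, response figures like the 50% quoted above are conventionally computed from the sensor's resistance in air versus in the target gas; a minimal sketch of this generic definition (the resistances below are invented):

    ```python
    # Chemiresistive gas-sensor response, using the common definition
    #   response (%) = |R_air - R_gas| / R_air * 100
    # where R_air and R_gas are the sensor resistances in air and in the
    # target gas. Illustrative numbers, not measurements from the study.

    def response_percent(r_air: float, r_gas: float) -> float:
        """Return the sensor response as a percentage of baseline resistance."""
        return abs(r_air - r_gas) / r_air * 100.0

    # A resistance drop from 10 MΩ in air to 5 MΩ in ammonia gives a
    # 50% response, matching the magnitude reported in the abstract.
    print(response_percent(10e6, 5e6))  # 50.0
    ```
    
    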

  5. HIGH-PERFORMANCE COMPUTING FOR THE STUDY OF EARTH AND ENVIRONMENTAL SCIENCE MATERIALS USING SYNCHROTRON X-RAY COMPUTED MICROTOMOGRAPHY

    International Nuclear Information System (INIS)

    FENG, H.; JONES, K.W.; MCGUIGAN, M.; SMITH, G.J.; SPILETIC, J.

    2001-01-01

    Synchrotron x-ray computed microtomography (CMT) is a non-destructive method for examination of rock, soil, and other types of samples studied in the earth and environmental sciences. The high x-ray intensities of the synchrotron source make possible the acquisition of tomographic volumes at a high rate, which requires the application of high-performance computing techniques for data reconstruction to produce the three-dimensional volumes, for their visualization, and for data analysis. These problems are exacerbated by the need to share information between collaborators at widely separated locations over both local and wide-area networks. A summary of the CMT technique and examples of applications are given here together with a discussion of the applications of high-performance computing methods to improve the experimental techniques and analysis of the data

  6. HIGH-PERFORMANCE COMPUTING FOR THE STUDY OF EARTH AND ENVIRONMENTAL SCIENCE MATERIALS USING SYNCHROTRON X-RAY COMPUTED MICROTOMOGRAPHY.

    Energy Technology Data Exchange (ETDEWEB)

    FENG,H.; JONES,K.W.; MCGUIGAN,M.; SMITH,G.J.; SPILETIC,J.

    2001-10-12

    Synchrotron x-ray computed microtomography (CMT) is a non-destructive method for examination of rock, soil, and other types of samples studied in the earth and environmental sciences. The high x-ray intensities of the synchrotron source make possible the acquisition of tomographic volumes at a high rate, which requires the application of high-performance computing techniques for data reconstruction to produce the three-dimensional volumes, for their visualization, and for data analysis. These problems are exacerbated by the need to share information between collaborators at widely separated locations over both local and wide-area networks. A summary of the CMT technique and examples of applications are given here together with a discussion of the applications of high-performance computing methods to improve the experimental techniques and analysis of the data.

  7. Computing Cosmic Cataclysms

    Science.gov (United States)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed.

  8. A Reflective Study into Children's Cognition When Making Computer Games

    Science.gov (United States)

    Allsop, Yasemin

    2016-01-01

    In this paper, children's mental activities when making digital games are explored. Where previous studies have mainly focused on children's learning, this study aimed to unfold children's thinking processes for learning when making computer games. As part of an ongoing larger-scale study, which adopts an ethnographic approach, this research…

  9. Asymmetrically increased rib cage uptake on bone scintigraphy: Incidental detection of pleural mesothelioma on single photon emission computed tomography/computed tomography

    International Nuclear Information System (INIS)

    Dhull, Varun Singh; Sharma, Punit; Durgapal, Prashant; Karunanithi, Sellam; Tripathi, Madhavi; Kumar, Rakesh

    2014-01-01

    Follow-up bone scintigraphy (BS) in a patient with carcinoma of the left breast, treated with surgery followed by radiotherapy 12 years earlier, revealed asymmetrically increased radiotracer uptake in left-sided ribs. Since this pattern was atypical for metastatic rib involvement, single photon emission computed tomography/computed tomography (SPECT/CT) of the thorax was done in the same setting, which revealed circumferential nodular left-sided pleural thickening. Biopsy confirmed it to be pleural mesothelioma. Left-sided ribs showed no abnormality on CT, suggesting that the rib uptake was reactive in nature. This pattern of asymmetric rib uptake on BS should be kept in mind and warrants further investigation to determine the underlying pathology

  10. Utility of screening computed tomography of chest, abdomen and pelvis in patients after heart transplantation

    International Nuclear Information System (INIS)

    Dasari, Tarun W.; Pavlovic-Surjancev, Biljana; Dusek, Linda; Patel, Nilamkumar; Heroux, Alain L.

    2011-01-01

    Introduction: Malignancy is a late cause of mortality in heart transplant recipients. It is unknown whether screening computed tomography would lead to early detection of such malignancies or serious vascular anomalies after heart transplantation. Methods: This is a single-center observational study of patients undergoing surveillance computed tomography of the chest, abdomen and pelvis at least 5 years after transplantation. Abnormal findings included pulmonary nodules, lymphadenopathy, intra-thoracic and intra-abdominal masses, and vascular anomalies such as abdominal aortic aneurysm. The clinical follow-up of each of these major abnormal findings is summarized. Results: A total of 63 patients underwent computed tomography of the chest, abdomen and pelvis at least 5 years after transplantation. Of these, 54 (86%) were male and 9 (14%) were female. Mean age was 52 ± 9.2 years. Computed tomography revealed only 1 lung cancer (squamous cell). Nonspecific pulmonary nodules were seen in 6 patients (9.5%). The most common incidental finding was abdominal aortic aneurysm (N = 6 (9.5%)), which necessitated follow-up computed tomography (N = 5) or surgery (N = 1). Mean time from transplantation to detection of abdominal aortic aneurysm was 14.6 ± 4.2 years. Mean age at detection of abdominal aortic aneurysm was 74.5 ± 3.2 years. Conclusion: Screening computed tomography in patients 5 years after transplantation revealed only one malignancy but led to increased detection of abdominal aortic aneurysms. The utility is thus low for detection of malignancy. Based on this study we do not recommend routine computed tomography after heart transplantation.

  11. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration; Velikhov, Vasily; Konoplich, Rostislav

    2017-01-01

    The Higgs boson physics is one of the most important and promising fields of study in modern high energy physics. It is important to note that GRID computing resources are becoming strictly limited due to the increasing amount of statistics required for physics analyses and the unprecedented LHC performance. One possibility to address the shortfall of computing resources is the usage of computer institutes' clusters, commercial computing resources and supercomputers. To perform precision measurements of the Higgs boson properties under these constraints, effective instruments to simulate kinematic distributions of signal events are also highly required. In this talk we give a brief description of the modern distribution reconstruction method called Morphing and perform a few efficiency tests to demonstrate its potential. These studies have been performed on the WLCG and the Kurchatov Institute's Data Processing Center, including a Tier-1 GRID site and a supercomputer as well. We also analyze the CPU efficienc...

  12. Computer-Based Job and Occupational Data Collection Methods: Feasibility Study

    National Research Council Canada - National Science Library

    Mitchell, Judith I

    1998-01-01

    .... The feasibility study was conducted to assess the operational and logistical problems involved with the development, implementation, and evaluation of computer-based job and occupational data collection methods...

  13. Computer users' ergonomics and quality of life - evidence from a developing country.

    Science.gov (United States)

    Ahmed, Ishfaq; Shaukat, Muhammad Zeeshan

    2018-06-01

    This study is aimed at investigating the quality of workplace ergonomics at various Pakistani organizations and the quality of life of computer users working in these organizations. Two hundred and thirty-five computer users (only those employees who have to do most of their job tasks on a computer or laptop, at their office) responded by completing a questionnaire covering workplace ergonomics and quality of life. The findings of the study revealed that the ergonomics at those organizations were poor and unfavourable. The quality of life (both physical and mental health) of respondents was poor for employees who had an unfavourable ergonomic environment. The findings thus highlight an important issue prevalent in Pakistani work settings.

  14. Computer simulation studies in condensed-matter physics 5. Proceedings

    International Nuclear Information System (INIS)

    Landau, D.P.; Mon, K.K.; Schuettler, H.B.

    1993-01-01

    As the role of computer simulations began to increase in importance, we sensed a need for a "meeting place" for both experienced simulators and neophytes to discuss new techniques and results in an environment which promotes extended discussion. As a consequence of these concerns, the Center for Simulational Physics established an annual workshop on Recent Developments in Computer Simulation Studies in Condensed-Matter Physics. This year's workshop was the fifth in this series, and the interest which the scientific community has shown demonstrates quite clearly the useful purpose the series has served. The workshop was held at the University of Georgia, February 17-21, 1992, and these proceedings form a record of the workshop, published with the goal of timely dissemination of the papers to a wider audience. The proceedings are divided into four parts. The first part contains invited papers which deal with simulational studies of classical systems and includes an introduction to some new simulation techniques and special-purpose computers as well. A separate section of the proceedings is devoted to invited papers on quantum systems, including new results for strongly correlated electron and quantum spin models. The third section comprises a single, invited description of a newly developed software shell designed for running parallel programs. The contributed presentations comprise the final chapter. (orig.). 79 figs

  15. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power, highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. The book is accessible to a variety of readers, and little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  16. Radiological management of blunt polytrauma with computed tomography and angiography: an integrated approach

    Energy Technology Data Exchange (ETDEWEB)

    Kurdziel, J.C.; Dondelinger, R.F.; Hemmer, M.

    1987-01-01

    107 polytraumatized patients who had experienced blunt trauma were worked up at admission with computed tomography of the thorax, abdomen and pelvis, following a computed tomography study of the brain: significant lesions were revealed in 98 (90%) patients. 79 (74%) patients showed trauma to the thorax; in 69 (64%) patients abdominal or pelvic trauma was evidenced. No false positive diagnosis was established. 5 traumatic findings were missed. Emergency angiography was indicated in 3 (3%) patients following the computed tomography examination. 3 other trauma patients were submitted directly to angiography without computed tomography examination during the period of this study. Embolization was carried out in 5/6 patients. No thoracotomy was needed. 13 (12%) patients underwent laparotomy following computed tomography. Overall mortality during the hospital stay was 14% (15/107). No patient died from visceral bleeding. Conservative management of blunt polytrauma patients can be advocated in almost 90% of visceral lesions. Computed tomography coupled with angiography and embolization represents an adequate integrated approach to the management of blunt polytrauma patients.

  17. Radiological management of blunt polytrauma with computed tomography and angiography: an integrated approach

    International Nuclear Information System (INIS)

    Kurdziel, J.C.; Dondelinger, R.F.; Hemmer, M.

    1987-01-01

    107 polytraumatized patients who had experienced blunt trauma were worked up at admission with computed tomography of the thorax, abdomen and pelvis following computed tomography study of the brain: significant lesions were revealed in 98 (90%) patients. 79 (74%) patients showed trauma to the thorax, and in 69 (64%) patients abdominal or pelvic trauma was evidenced. No false positive diagnosis was established; 5 traumatic findings were missed. Emergency angiography was indicated in 3 (3%) patients following computed tomography examination. 3 other trauma patients were submitted directly to angiography without computed tomography examination during the period in which this study was completed. Embolization was carried out in 5/6 patients. No thoracotomy was needed. 13 (12%) patients underwent laparotomy following computed tomography. Overall mortality during the hospital stay was 14% (15/107). No patient died from visceral bleeding. Conservative management of blunt polytrauma patients can be advocated in almost 90% of visceral lesions. Computed tomography coupled with angiography and embolization represents an adequate integrated approach to the management of blunt polytrauma patients.

  18. Extreme Scale Computing Studies

    Science.gov (United States)

    2010-12-01

    ...systems that would fall under the Exascale rubric. In this chapter, we first discuss the attributes by which achievement of the label “Exascale” may be... Carrington, and E. Strohmaier. A Genetic Algorithms Approach to Modeling the Performance of Memory-bound Computations. Reno, NV, November 2007. ACM/IEEE... genetic stochasticity (random mating, mutation, etc.). Outcomes are thus stochastic as well, and ecologists wish to ask questions like, “What is the...

  19. A Computer-Supported Method to Reveal and Assess Personal Professional Theories in Vocational Education

    Science.gov (United States)

    van den Bogaart, Antoine C. M.; Bilderbeek, Richel J. C.; Schaap, Harmen; Hummel, Hans G. K.; Kirschner, Paul A.

    2016-01-01

    This article introduces a dedicated, computer-supported method to construct and formatively assess open, annotated concept maps of Personal Professional Theories (PPTs). These theories are internalised, personal bodies of formal and practical knowledge, values, norms and convictions that professionals use as a reference to interpret and acquire…

  20. Implications of Ubiquitous Computing for the Social Studies Curriculum

    Science.gov (United States)

    van Hover, Stephanie D.; Berson, Michael J.; Bolick, Cheryl Mason; Swan, Kathleen Owings

    2004-01-01

    In March 2002, members of the National Technology Leadership Initiative (NTLI) met in Charlottesville, Virginia to discuss the potential effects of ubiquitous computing on the field of education. Ubiquitous computing, or "on-demand availability of task-necessary computing power," involves providing every student with a handheld computer--a…

  1. COMPARATIVE STUDY OF TERTIARY WASTEWATER TREATMENT BY COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Stefania Iordache

    2010-01-01

    Full Text Available The aim of this work is to assess conditions for implementation of a Biological Nutrient Removal (BNR) process in the Wastewater Treatment Plant (WWTP) of Moreni city (Romania). In order to meet increasingly strict environmental regulations, the wastewater treatment plant that was studied must update and modernize its current treatment process. A comparative study was undertaken of the quality of effluents that could be obtained by implementation of biological nutrient removal processes such as A2/O (Anaerobic/Anoxic/Oxic) and VIP (Virginia Plant Initiative) as wastewater tertiary treatments. In order to assess the efficiency of the proposed treatment schemes based on the data monitored at the studied WWTP, computer models of biological nutrient removal configurations based on the A2/O and VIP processes were built. Computer simulation was carried out using a well-known simulator, BioWin by EnviroSim Associates Ltd. The simulation process yielded data that can be used in the design of a tertiary treatment stage at the Moreni WWTP, in order to increase its operating efficiency.

  2. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... computer modeling used as a research method applied in the process ... conclusions discuss the benefits for students who analyzed the ... accounting education process the case study method should not .... providing travel safety information to passengers ... from literature readings with practical problems.

  3. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  4. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  5. Computer Assisted Instruction in Special Education Three Case Studies

    OpenAIRE

    İbrahim DOĞAN; Ömür AKDEMİR

    2015-01-01

    The purpose of this study is to investigate the computer use of three students attending the special education center. Students have mental retardation, hearing problem and physical handicap respectively. The maximum variation sampling is used to select the type of handicap while the convenience sampling is used to select the participants. Three widely encountered handicap types in special education are chosen to select the study participants. The multiple holistic case study design is used i...

  6. Study on computer-aided simulation procedure for multicomponent separating cascade

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro

    1982-11-01

    The present report reviews the author's study on the computer-aided simulation procedure for a multicomponent separating cascade. As a conclusion, two very powerful simulation procedures have been developed for cascades composed of separating elements whose separation factors are very large. They are applicable in cases where interstage flow rates are input variables for the calculation and stage separation factors are given either as constants or as functions of compositions of the up and down streams. As an application of the new procedure, a computer-aided simulation study has been performed for hydrogen isotope separating cascades by porous membrane method. A cascade system configuration is developed and pertinent design specifications are determined in an example case of the feed conditions and separation requirements. (author)

  7. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

    ...communicate their subjective opinions. Keywords: Usability Analysis; CAVE™ (Cave Automatic Virtual Environments); Human Computer Interface (HCI)... the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods...

  8. Comparative study between computed radiography and conventional radiography

    International Nuclear Information System (INIS)

    Noorhazleena Azaman; Khairul Anuar Mohd Salleh; Sapizah Rahim; Shaharudin Sayuti; Arshad Yassin; Abdul Razak Hamzah

    2010-01-01

    In industrial radiography, there are many criteria that need to be considered, based on established standards, to accept or reject a radiographic film. For conventional radiography, we need to consider the optical density, using a densitometer, when viewing the film on the viewer. In computed radiography (CR), however, we need to evaluate and analyze the quality of the digital image through its grey values. Many factors affect digital image quality. One of the factors affecting digital image quality in image processing is the grey value, which is related to contrast resolution. In this work, we performed a grey value measurement study on digital radiography systems and compared it with exposed films in conventional radiography. The test sample was a steel step wedge. We found that contrast resolution is higher in computed radiography than in conventional radiography. (author)

  9. Computer literacy among first year medical students in a developing country: A cross sectional study

    Science.gov (United States)

    2012-01-01

    Background: The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. This study evaluates computer literacy among first year medical students in Sri Lanka. Methods: The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August and September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in 6 domains: common software packages, operating systems, database management and the usage of internet and E-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable, with other independent covariates. Results: The sample size was 181 (response rate 95.3%); 49.7% were male. The majority of students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge through a formal training programme (64.1%), self learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of students (75.7%) expressed their willingness for a formal computer training programme at the faculty. The mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). 47.9% of students had a score less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province and students owning a computer had a significantly higher mean score in comparison to other students (p computer training was the strongest predictor of computer literacy (β = 13.034), followed by using

  10. Enhanced limonene production in cyanobacteria reveals photosynthesis limitations.

    Science.gov (United States)

    Wang, Xin; Liu, Wei; Xin, Changpeng; Zheng, Yi; Cheng, Yanbing; Sun, Su; Li, Runze; Zhu, Xin-Guang; Dai, Susie Y; Rentzepis, Peter M; Yuan, Joshua S

    2016-12-13

    Terpenes are the major secondary metabolites produced by plants, and have diverse industrial applications as pharmaceuticals, fragrances, solvents, and biofuels. Cyanobacteria are equipped with an efficient carbon fixation mechanism, and are ideal cell factories to produce various fuel and chemical products. Past efforts to produce terpenes in photosynthetic organisms have gained only limited success. Here we engineered the cyanobacterium Synechococcus elongatus PCC 7942 to efficiently produce limonene through a modeling-guided study. Computational modeling of limonene flux in response to photosynthetic output revealed the downstream terpene synthase as a key metabolic flux-controlling node in the MEP (2-C-methyl-d-erythritol 4-phosphate) pathway-derived terpene biosynthesis. By enhancing the downstream limonene carbon sink, we achieved an over 100-fold increase in limonene productivity, in contrast to the marginal increase achieved through stepwise metabolic engineering. The establishment of a strong limonene flux revealed potential synergy between photosynthate output and terpene biosynthesis, leading to enhanced carbon flux into the MEP pathway. Moreover, we show that enhanced limonene flux leads to NADPH accumulation and slows down photosynthetic electron flow. Fine-tuning ATP/NADPH toward terpene biosynthesis could be a key parameter in adapting photosynthesis to support biofuel/bioproduct production in cyanobacteria.

  11. The effect of using in computer skills on teachers’ perceived self-efficacy beliefs towards technology integration, attitudes and performance

    Directory of Open Access Journals (Sweden)

    Badrie Mohammad Nour ELDaou

    2016-10-01

    Full Text Available The current study analyzes the relationship between teachers’ perceived self-efficacy and attitudes towards integrating technology into classroom teaching, self-evaluation reports and computer performance results. Pre-post measurement of the Computer Technology Integration Survey (CTIS; Wang et al., 2004) was used to determine the confidence level of 60 science teachers and 12 mixed-major teachers enrolled at the Lebanese University, Faculty of Education, in the academic year 2011-2012. Pre-post measurement of teachers’ attitudes towards using technology was examined using an open and a closed questionnaire. Teachers’ performance was measured by means of the results of their Activeinspire projects using active boards after their third practice of training in computer skills and the Activeinspire program. To accumulate data on teachers’ self-reports, this study uses Robert Reasoner’s five components: feeling of security, feeling of belonging, feeling of identity, feeling of goal, and self-actualization, which teachers used to rate themselves (Reasoner, 1983). The study acknowledged probable impacts of computer training skills on teachers’ self-evaluation reports, effectiveness of computer technology skills, and evaluations of self-efficacy attitudes toward technology integration. Pearson correlation revealed a strong relationship (r = 0.99) between perceived self-efficacy towards technology incorporation and teachers’ self-evaluation reports. Also, the findings of this research revealed that 82.7% of teachers earned high computer technology scores on their Activeinspire projects and 33.3% received excellent grades on the computer performance test. Recommendations and potential research were discussed.

  12. The Effect of Using in Computer Skills on Teachers’ Perceived Self-Efficacy Beliefs Towards Technology Integration, Attitudes and Performance

    Directory of Open Access Journals (Sweden)

    Badrie Mohammad Nour EL-Daou

    2016-07-01

    Full Text Available The current study analyzes the relationship between teachers’ perceived self-efficacy and attitudes towards integrating technology into classroom teaching, self-evaluation reports and computer performance results. Pre-post measurement of the Computer Technology Integration Survey (CTIS; Wang et al., 2004) was used to determine the confidence level of 60 science teachers and 12 mixed-major teachers enrolled at the Lebanese University, Faculty of Education, in the academic year 2011-2012. Pre-post measurement of teachers’ attitudes towards using technology was examined using an open and a closed questionnaire. Teachers’ performance was measured by means of the results of their Activeinspire projects using active boards after their third practice of training in computer skills and the Activeinspire program. To accumulate data on teachers’ self-reports, this study uses Robert Reasoner’s five components: feeling of security, feeling of belonging, feeling of identity, feeling of goal, and self-actualization, which teachers used to rate themselves (Reasoner, 1983). The study acknowledged probable impacts of computer training skills on teachers’ self-evaluation reports, effectiveness of computer technology skills, and evaluations of self-efficacy attitudes toward technology integration. Pearson correlation revealed a strong relationship (r = 0.99) between perceived self-efficacy towards technology incorporation and teachers’ self-evaluation reports. Also, the findings of this research revealed that 82.7% of teachers earned high computer technology scores on their Activeinspire projects and 33.3% received excellent grades on the computer performance test. Recommendations and potential research were discussed.

  13. Incorporating Computers into Classroom: Effects on Learners’ Reading Comprehension in EFL Context

    Directory of Open Access Journals (Sweden)

    Ali Akbar Ansarin

    2017-10-01

    Full Text Available Owing to the importance of computer-assisted reading and considering the prominent role of learners in this respect, the present study investigated: (1) the effects of the computer as a supplemental tool to support and improve Iranian EFL learners’ reading comprehension, in comparison with equivalent non-technological or traditional print-based treatments, and (2) EFL learners’ attitudes and perceptions towards the computer-assisted reading course. To this purpose, 111 randomly selected EFL learners participated in the study. The subjects were divided into a control and an experimental group. Both groups received 10 reading lessons, either through computers or through an instructor-led method. The statistical analysis revealed no significant difference between the learners who had access to reading supports on a computer screen and their counterparts in the traditional reading classes. Learners were also allowed to express their ideas on a 5-point Likert scale. The purpose of the attitude questionnaire was to find out more information about the participants and their experiences with computer-assisted reading. Results of the attitude questionnaire supported the conclusion that computers may enhance EFL learners’ motivation and interest towards learning but do not enhance comprehension. The findings of this study support the view that technology should supplement, not supplant, teachers and that people read less accurately and less comprehensively on screens than on paper.

  14. Computational study of chain transfer to monomer reactions in high-temperature polymerization of alkyl acrylates.

    Science.gov (United States)

    Moghadam, Nazanin; Liu, Shi; Srinivasan, Sriraj; Grady, Michael C; Soroush, Masoud; Rappe, Andrew M

    2013-03-28

    This article presents a computational study of chain transfer to monomer (CTM) reactions in self-initiated high-temperature homopolymerization of alkyl acrylates (methyl, ethyl, and n-butyl acrylate). Several mechanisms of CTM are studied. The effects of the length of live polymer chains and the type of monoradical that initiated the live polymer chains on the energy barriers and rate constants of the involved reaction steps are investigated theoretically. All calculations are carried out using density functional theory. Three types of hybrid functionals (B3LYP, X3LYP, and M06-2X) and four basis sets (6-31G(d), 6-31G(d,p), 6-311G(d), and 6-311G(d,p)) are applied to predict the molecular geometries of the reactants, products and transition states, and the energy barriers. Transition state theory is used to estimate rate constants. The results indicate that abstraction of a hydrogen atom (by live polymer chains) from the methyl group in methyl acrylate, the methylene group in ethyl acrylate, and methylene groups in n-butyl acrylate are the most likely mechanisms of CTM. Also, the rate constants of CTM reactions calculated using M06-2X are in good agreement with those estimated from polymer sample measurements using macroscopic mechanistic models. The rate constant values do not change significantly with the length of live polymer chains. Abstraction of a hydrogen atom by a tertiary radical has a higher energy barrier than abstraction by a secondary radical, which agrees with experimental findings. The calculated and experimental NMR spectra of dead polymer chains produced by CTM reactions are comparable. This theoretical/computational study reveals that CTM occurs most likely via hydrogen abstraction by live polymer chains from the methyl group of methyl acrylate and methylene group(s) of ethyl (n-butyl) acrylate.
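The step from a computed free-energy barrier to a rate constant is the Eyring equation of transition state theory. The sketch below illustrates only that step; the barrier values and the temperature are hypothetical placeholders, not the paper's DFT results:

```python
import math

# Physical constants (SI units)
KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(delta_g_kj_mol: float, temperature_k: float) -> float:
    """Eyring equation: k = (kB*T/h) * exp(-dG_act / (R*T))."""
    prefactor = KB * temperature_k / H
    return prefactor * math.exp(-delta_g_kj_mol * 1000.0 / (R * temperature_k))

# Hypothetical barriers (kJ/mol) at a temperature typical of
# high-temperature acrylate polymerization (~140 C = 413.15 K).
k_secondary = eyring_rate(100.0, 413.15)  # abstraction by a secondary radical
k_tertiary = eyring_rate(110.0, 413.15)   # tertiary radical: higher barrier

# A higher barrier yields a smaller rate constant, consistent with the
# study's finding that tertiary-radical abstraction is slower.
print(k_secondary, k_tertiary)
```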

  15. Computer-Assisted Instruction: A Case Study of Two Charter Schools

    Science.gov (United States)

    Keengwe, Jared; Hussein, Farhan

    2013-01-01

    The purpose of this study was to examine the relationship in achievement gap between English language learners (ELLs) utilizing computer-assisted instruction (CAI) in the classroom, and ELLs relying solely on traditional classroom instruction. The study findings showed that students using CAI to supplement traditional lectures performed better…

  16. Computer vision syndrome and associated factors among medical and engineering students in chennai.

    Science.gov (United States)

    Logaraj, M; Madhupriya, V; Hegde, Sk

    2014-03-01

    Almost all institutions, colleges, universities and homes today use computers regularly. Very little research has been carried out on Indian users, especially college students, regarding the effects of computer use on eye and vision related problems. The aim of this study was to assess the prevalence of computer vision syndrome (CVS) among medical and engineering students and the factors associated with it. A cross-sectional study was conducted among medical and engineering college students of a university situated in the suburban area of Chennai. Students who had used a computer in the month preceding the date of the study were included. The participants were surveyed using a pre-tested structured questionnaire. Among engineering students, the prevalence of CVS was found to be 81.9% (176/215), while among medical students it was found to be 78.6% (158/201). A significantly higher proportion of engineering students, 40.9% (88/215), used computers for 4-6 h/day as compared to medical students, 10% (20/201) (P medical students. Students who used a computer for 4-6 h were at significantly higher risk of developing redness (OR = 1.2, 95% CI = 1.0-3.1, P = 0.04) and burning sensation (OR = 2.1, 95% CI = 1.3-3.1, P computer for less than 4 h. A significant correlation was found between increased hours of computer use and the symptoms of redness, burning sensation, blurred vision and dry eyes. The present study revealed that more than three-fourths of the students complained of at least one symptom of CVS while working on the computer.

  17. Computer-Aided Prototyping Systems (CAPS) within the software acquisition process: a case study

    OpenAIRE

    Ellis, Mary Kay

    1993-01-01

    Approved for public release; distribution is unlimited. This thesis provides a case study which examines the benefits derived from the practice of computer-aided prototyping within the software acquisition process. An experimental prototyping system currently in research is the Computer Aided Prototyping System (CAPS) managed under the Computer Science department of the Naval Postgraduate School, Monterey, California. This thesis determines the qualitative value which may be realized by ...

  18. Steam generator transient studies using a simplified two-fluid computer code

    International Nuclear Information System (INIS)

    Munshi, P.; Bhatnagar, R.; Ram, K.S.

    1985-01-01

    A simplified two-fluid computer code has been used to simulate reactor-side (or primary-side) transients in a PWR steam generator. The disturbances are modelled as ramp inputs for pressure, internal energy and mass flow-rate for the primary fluid. The CPU time for a transient duration of 4 s is approx. 10 min on a DEC-1090 computer system. The results are thermodynamically consistent and encouraging for further studies. (author)

  19. A detailed experimental study of a DNA computer with two endonucleases.

    Science.gov (United States)

    Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz

    2017-07-14

    Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is scaling up its complexity, that is, increasing the number of states of biomolecular automata. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We present a detailed experimental verification of its feasibility. We describe the effects of the number of states, the length of the input data, and nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths, such as ab, aab, aaab, ababa, and on an unaccepted word ba. Moreover, this article presents the reaction optimization and methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.
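The accept/reject behavior described above can be checked against a software model of a finite automaton. The sketch below is a generic DFA simulator with a hypothetical two-state transition table chosen only so that it matches the example words (it accepts any word beginning with 'a'); the study's actual six-state automaton is realized biochemically with endonucleases and a ligase, not in software:

```python
def run_dfa(word, transitions, start, accepting):
    """Run a deterministic finite automaton; missing transitions reject."""
    state = start
    for symbol in word:
        state = transitions.get((state, symbol))
        if state is None:
            return False
    return state in accepting

# Hypothetical two-state toy automaton: accept any word that begins with 'a'.
transitions = {
    ("q0", "a"): "q1",
    ("q1", "a"): "q1",
    ("q1", "b"): "q1",
}

for w in ["ab", "aab", "aaab", "ababa"]:
    print(w, run_dfa(w, transitions, "q0", {"q1"}))  # all accepted
print("ba", run_dfa("ba", transitions, "q0", {"q1"}))  # rejected
```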

  20. Attitude towards Computers and Classroom Management of Language School Teachers

    Directory of Open Access Journals (Sweden)

    Sara Jalali

    2014-07-01

    Full Text Available Computer-assisted language learning (CALL) is the realization of computers in schools and universities, which has potentially enhanced the language learning experience inside classrooms. The integration of these technologies into the classroom demands that teachers adopt a number of classroom management procedures to maintain a more learner-centered and conducive language learning environment. The current study explored the relationship between computer attitudes and the behavior and instructional classroom management approaches implemented by English institute teachers. In so doing, a total of 105 male (n = 27) and female (n = 78) EFL teachers participated in this study. A computer attitude questionnaire adapted from Albirini (2006) and a Behavior and Instructional Management Scale (BIMS) adopted from Martin and Sass (2010) were used to collect the data. The results of the Pearson correlation coefficient revealed that there were no significant relationships between attitude and behavior and instructional management across gender. However, it was found that the more male teachers tend toward using computers in their classes, the more teacher-centered their classes become. In addition, the more female teachers are prone to use computers in their classes, the more student-centered and lenient their classes become.

  1. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron correlation in molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  2. In-cylinder diesel spray combustion simulations using parallel computation: A performance benchmarking study

    International Nuclear Information System (INIS)

    Pang, Kar Mun; Ng, Hoon Kiat; Gan, Suyin

    2012-01-01

    Highlights: ► A performance benchmarking exercise is conducted for diesel combustion simulations. ► The reduced chemical mechanism shows its advantages over base and skeletal models. ► High efficiency and great reduction of CPU runtime are achieved through 4-node solver. ► Increasing ISAT memory from 0.1 to 2 GB reduces the CPU runtime by almost 35%. ► Combustion and soot processes are predicted well with minimal computational cost. - Abstract: In the present study, in-cylinder diesel combustion simulation was performed with parallel processing on an Intel Xeon Quad-Core platform to allow both fluid dynamics and chemical kinetics of the surrogate diesel fuel model to be solved simultaneously on multiple processors. Here, Cartesian Z-Coordinate was selected as the most appropriate partitioning algorithm since it computationally bisects the domain such that the dynamic load associated with fuel particle tracking was evenly distributed during parallel computations. Other variables examined included number of compute nodes, chemistry sizes and in situ adaptive tabulation (ISAT) parameters. Based on the performance benchmarking test conducted, parallel configuration of 4-compute node was found to reduce the computational runtime most efficiently whereby a parallel efficiency of up to 75.4% was achieved. The simulation results also indicated that accuracy level was insensitive to the number of partitions or the partitioning algorithms. The effect of reducing the number of species on computational runtime was observed to be more significant than reducing the number of reactions. Besides, the study showed that an increase in the ISAT maximum storage of up to 2 GB reduced the computational runtime by 50%. Also, the ISAT error tolerance of 10 −3 was chosen to strike a balance between results accuracy and computational runtime. The optimised parameters in parallel processing and ISAT, as well as the use of the in-house reduced chemistry model allowed accurate
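The 75.4% figure quoted above is the standard parallel-efficiency metric: speedup divided by node count. A minimal sketch, with hypothetical wall-clock times chosen only to be consistent with that figure:

```python
def speedup(t_serial: float, t_parallel: float) -> float:
    """How many times faster the parallel run is than the serial run."""
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, n_nodes: int) -> float:
    """Parallel efficiency: speedup per node, 1.0 being ideal scaling."""
    return speedup(t_serial, t_parallel) / n_nodes

t1 = 100.0             # hypothetical single-node runtime (hours)
t4 = t1 / (4 * 0.754)  # 4-node runtime consistent with 75.4% efficiency

print(round(efficiency(t1, t4, 4), 3))  # -> 0.754
```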

  3. Population coding and decoding in a neural field: a computational study.

    Science.gov (United States)

    Wu, Si; Amari, Shun-Ichi; Nakahara, Hiroyuki

    2002-05-01

    This study uses a neural field model to investigate computational aspects of population coding and decoding when the stimulus is a single variable. A general prototype model for the encoding process is proposed, in which neural responses are correlated, with strength specified by a gaussian function of their difference in preferred stimuli. Based on the model, we study the effect of correlation on the Fisher information, compare the performances of three decoding methods that differ in the amount of encoding information being used, and investigate the implementation of the three methods by using a recurrent network. This study not only rediscovers the main results in the existing literature in a unified way, but also reveals important new features, especially when the neural correlation is strong. As the neural correlation of firing becomes larger, the Fisher information decreases drastically. We confirm that as the width of correlation increases, the Fisher information saturates and no longer increases in proportion to the number of neurons. However, we prove that as the width increases further, beyond √2 times the effective width of the tuning function, the Fisher information increases again, and it increases without limit in proportion to the number of neurons. Furthermore, we clarify the asymptotic efficiency of the maximum likelihood inference (MLI) type of decoding methods for correlated neural signals. It shows that when the correlation covers a nonlocal range of the population (except for uniform correlation, or when the noise is extremely small), the MLI type of method, whose decoding error satisfies the Cauchy-type distribution, is not asymptotically efficient. This implies that the variance is no longer adequate to measure decoding accuracy.
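The effect of correlation strength on the Fisher information can be illustrated numerically. The sketch below assumes Gaussian tuning curves, unit noise variance, and a Gaussian limited-range correlation kernel, with illustrative parameter values not taken from the paper; the Fisher information for correlated Gaussian noise is computed as f'ᵀ C⁻¹ f':

```python
import numpy as np

N = 50                             # number of neurons
prefs = np.linspace(-5.0, 5.0, N)  # preferred stimuli
a = 2.0                            # tuning-curve width
b = 2.0                            # noise correlation length
theta = 0.0                        # stimulus value

# Gaussian tuning curves and their derivative with respect to the stimulus
f = np.exp(-(theta - prefs) ** 2 / (2 * a ** 2))
f_prime = f * (prefs - theta) / a ** 2

# Gaussian limited-range correlation kernel over preferred stimuli
K = np.exp(-(prefs[:, None] - prefs[None, :]) ** 2 / (2 * b ** 2))

def fisher(rho: float) -> float:
    """Fisher information f'^T C^{-1} f' with C = (1-rho) I + rho K."""
    C = (1 - rho) * np.eye(N) + rho * K  # unit variance on the diagonal
    return float(f_prime @ np.linalg.solve(C, f_prime))

# Stronger limited-range correlation reduces the Fisher information,
# in line with the result described in the abstract.
print(fisher(0.0), fisher(0.5))
```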

  4. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was assessed through correlations between computer-based subtests and related conventional neuropsychological subtests. Setting: university center for memory disorders. Fifty-two patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, predictive values, positive and negative, were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
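
    The reported predictive values follow from Bayes' rule applied to the stated sensitivity, specificity, and prevalence. A minimal sketch (the function name is ours, not the study's):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Bayes' rule: post-test probabilities from test characteristics."""
    tp = sensitivity * prevalence              # true positives (per unit population)
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)      # (PPV, NPV)

ppv, npv = predictive_values(0.83, 0.96, 0.10)  # values from the study
```

    With these inputs the positive predictive value comes out at about 0.70, matching the figure reported in the abstract.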

  5. Computational study of the fibril organization of polyglutamine repeats reveals a common motif identified in beta-helices.

    Science.gov (United States)

    Zanuy, David; Gunasekaran, Kannan; Lesk, Arthur M; Nussinov, Ruth

    2006-04-21

    The formation of fibril aggregates by long polyglutamine sequences is assumed to play a major role in neurodegenerative diseases such as Huntington's disease. Here, we model peptides rich in glutamine through a series of molecular dynamics simulations. Starting from a rigid nanotube-like conformation, we have obtained a new conformational template that shares structural features of a tubular helix and of a beta-helix conformational organization. Our new model can be described as a super-helical arrangement of flat beta-sheet segments linked by planar turns or bends. Interestingly, our comprehensive analysis of the Protein Data Bank reveals that this is a common motif in beta-helices (termed beta-bend), although it had not previously been identified. The motif is based on the alternation of beta-sheet and helical conformation as the protein sequence is followed from the N to the C terminus (beta-alpha(R)-beta-polyPro-beta). We further identify this motif in the ssNMR structure of the protofibril of the amyloidogenic peptide Abeta(1-40). The recurrence of the beta-bend suggests a general mode of connecting long parallel beta-sheet segments that would allow the growth of partially ordered fibril structures. The design allows the peptide backbone to change direction with a minimal loss of main chain hydrogen bonds. The identification of a coherent organization beyond that of the beta-sheet segments in different folds rich in parallel beta-sheets suggests a higher degree of ordered structure in protein fibrils, in agreement with their low solubility and dense molecular packing.

  6. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    To replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computing mode: cloud computing. Resource scheduling strategy is a key technology in cloud computing. Based on a study of the cloud computing system structure and mode of operation, the paper focuses on the job scheduling and resource allocation problems in cloud computing using the ant colony algorithm, with detailed analysis and design of the...
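
    The scheduling idea can be illustrated with a toy ant colony optimiser that maps independent tasks onto virtual machines to minimise makespan. This is a generic ACO sketch under our own simplifying assumptions (the task lengths, VM speeds, and parameter values are invented for illustration); it is not the algorithm from the paper:

```python
import random

def aco_schedule(task_len, vm_speed, n_ants=20, n_iter=50, rho=0.1, seed=1):
    """Toy ant colony optimisation: assign tasks to VMs, minimising makespan."""
    rng = random.Random(seed)
    n_t, n_v = len(task_len), len(vm_speed)
    tau = [[1.0] * n_v for _ in range(n_t)]           # pheromone per (task, VM)
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ant in range(n_ants):
            load, assign = [0.0] * n_v, []
            for t in range(n_t):
                # desirability = pheromone * heuristic (inverse finish time)
                w = [tau[t][v] / (load[v] + task_len[t] / vm_speed[v])
                     for v in range(n_v)]
                v = rng.choices(range(n_v), weights=w)[0]
                load[v] += task_len[t] / vm_speed[v]
                assign.append(v)
            cost = max(load)                           # makespan of this ant
            if cost < best_cost:
                best, best_cost = assign, cost
        for t in range(n_t):                           # evaporate, then deposit
            for v in range(n_v):
                tau[t][v] *= 1 - rho
            tau[t][best[t]] += 1.0 / best_cost
    return best, best_cost

tasks, speeds = [4.0, 3.0, 2.0, 1.0, 4.0, 3.0], [1.0, 2.0]
assign, makespan = aco_schedule(tasks, speeds)
```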

  7. Computed tomography scanner applied to soil compaction studies

    International Nuclear Information System (INIS)

    Vaz, C.M.P.

    1989-11-01

    The soil compaction problem was studied using a first-generation computed tomography (CT) scanner. This apparatus produces images of soil sample cross sections with a resolution of a few millimeters. We performed the following laboratory and field experiments: basic equipment calibration and resolution studies; measurements of thin compacted soil layers; measurements of soil compaction caused by agricultural tools; stress-strain modelling in confined soil samples at several moisture contents; and characterization of soil bulk density profiles from samples collected in a trench, compared with a cone penetrometer technique. (author)
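
    A typical first step in such studies is a linear calibration between the measured CT attenuation number and gravimetrically determined bulk density. The calibration points below are invented for illustration; only the least-squares procedure is the point:

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical calibration points: mean CT number of reference samples
# of known bulk density (g/cm^3); the values are invented.
ct_numbers = [120.0, 260.0, 410.0, 550.0]
bulk_density = [1.0, 1.2, 1.4, 1.6]
a, b = linear_fit(ct_numbers, bulk_density)
density = a + b * 335.0  # density estimate for a voxel with CT number 335
```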

  8. Tropical pulmonary eosinophilia: a comparative evaluation of plain chest radiography and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sandhu Manavijit; Mukhopadhyay Sima; Sharma, S.K. [All India Inst. of Medical Sciences, New Delhi (India). Dept. of Nuclear Medicine

    1996-02-01

    Plain chest radiography and computed tomography (CT) of the chest were performed on 10 patients with tropical pulmonary eosinophilia (TPE). Chest radiographs revealed bilateral diffuse lesions in the lungs of all the patients with relative sparing of lower lobes in one patient. However, computed tomography revealed bilateral diffuse lung lesions in all of the patients with relative sparing of lower lobes in three patients. In seven (70%) of the 10 patients, CT provided additional information. Computed tomography was found to be superior for the detection of reticulonodular pattern, bronchiectasis, air trapping, calcification and mediastinal adenopathy. No correlation was found between pulmonary function and gas exchange data using CT densities. There was also no correlation between the absolute eosinophil count (AEC) and the radiological severity of lesions. In six patients, high-resolution CT (HRCT) was performed in addition to conventional CT (CCT), and nodularity of lesions was better appreciated in these patients. It is concluded from this study that CT is superior to plain radiography for the evaluation of patients with TPE. 17 refs., 2 tabs., 4 figs.

  9. A comparative study of three-dimensional reconstructive images of temporomandibular joint using computed tomogram

    International Nuclear Information System (INIS)

    Lim, Suk Young; Koh, Kwang Joon

    1993-01-01

    The purpose of this study was to clarify the spatial relationships of the temporomandibular joint (TMJ) and to aid in the diagnosis of temporomandibular disorders. For this study, three-dimensional images of normal temporomandibular joints were reconstructed by a computer image analysis system and by the three-dimensional reconstruction program integrated in computed tomography. The obtained results were as follows: 1. Two-dimensional computed tomograms had better resolution than three-dimensional computed tomograms in the evaluation of the bone structure and the disk of the TMJ. 2. Direct sagittal computed tomograms and coronal computed tomograms had better resolution in the evaluation of the disk of the TMJ. 3. The positional relationship of the disk could be visualized, but the configuration of the disk could not be clearly visualized, on three-dimensional reconstructive CT images. 4. Three-dimensional reconstructive CT images had smoother margins than the three-dimensional images reconstructed by the computer image analysis system, but the images of the latter had better perspective. 5. Three-dimensional reconstructive images showed the spatial relationships of the TMJ articulation better, and the joint space was more clearly visualized on dissection images.

  10. Custom-Made Computer-Aided-Design/Computer-Aided-Manufacturing Biphasic Calcium-Phosphate Scaffold for Augmentation of an Atrophic Mandibular Anterior Ridge

    Directory of Open Access Journals (Sweden)

    Francesco Guido Mangano

    2015-01-01

    Full Text Available This report documents the clinical, radiographic, and histologic outcome of a custom-made computer-aided-design/computer-aided-manufactured (CAD/CAM) scaffold used for the alveolar ridge augmentation of a severely atrophic anterior mandible. Computed tomographic (CT) images of an atrophic anterior mandible were acquired and modified into a 3-dimensional (3D) reconstruction model; this was transferred to a CAD program, where a custom-made scaffold was designed. CAM software generated a set of tool-paths for the manufacture of the scaffold on a computer-numerical-control milling machine into the exact shape of the 3D design. A custom-made scaffold was milled from a synthetic micromacroporous biphasic calcium phosphate (BCP) block. The scaffold closely matched the shape of the defect: this helped to reduce the time for the surgery and contributed to good healing. One year later, newly formed and well-integrated bone was clinically available, and two implants (AnyRidge, MegaGen, Gyeongbuk, South Korea) were placed. The histologic samples retrieved from the implant sites revealed compact mature bone undergoing remodelling, marrow spaces, and newly formed trabecular bone surrounded by residual BCP particles. This study demonstrates that custom-made scaffolds can be fabricated by combining CT scans and CAD/CAM techniques. Further studies on a larger sample of patients are needed to confirm these results.

  11. The traveling salesman problem a computational study

    CERN Document Server

    Applegate, David L; Chvatal, Vasek; Cook, William J

    2006-01-01

    This book presents the latest findings on one of the most intensely investigated subjects in computational mathematics--the traveling salesman problem. It sounds simple enough: given a set of cities and the cost of travel between each pair of them, the problem challenges you to find the cheapest route by which to visit all the cities and return home to where you began. Though seemingly modest, this exercise has inspired studies by mathematicians, chemists, and physicists. Teachers use it in the classroom. It has practical applications in genetics, telecommunications, and neuroscience.
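
    For a handful of cities the problem the book studies can be solved by plain enumeration, which makes the combinatorial explosion easy to appreciate: n cities require (n-1)!/2 distinct tours in the symmetric case. A small self-contained sketch with an invented distance matrix:

```python
from itertools import permutations

def tour_cost(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def brute_force_tsp(dist):
    """Exact TSP by enumeration -- only viable for a handful of cities."""
    n = len(dist)
    best = min(permutations(range(1, n)),
               key=lambda p: tour_cost((0,) + p, dist))
    return (0,) + best, tour_cost((0,) + best, dist)

dist = [  # invented symmetric distance matrix for 4 cities
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
tour, cost = brute_force_tsp(dist)  # optimal cost here is 18
```

    Real instances of the size treated in the book are tackled with the cutting-plane and branch-and-cut machinery the authors describe, not enumeration.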

  12. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    Full Text Available The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, or other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship between the characteristics of the examiners and the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  13. Experimental and Computational Study of Ductile Fracture in Small Punch Tests

    Directory of Open Access Journals (Sweden)

    Betül Gülçimen Çakan

    2017-10-01

    Full Text Available A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  14. Experimental and Computational Study of Ductile Fracture in Small Punch Tests.

    Science.gov (United States)

    Gülçimen Çakan, Betül; Soyarslan, Celal; Bargmann, Swantje; Hähner, Peter

    2017-10-17

    A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  15. Logic as Marr's Computational Level: Four Case Studies.

    Science.gov (United States)

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.

  16. Computer literacy among first year medical students in a developing country: A cross sectional study

    Directory of Open Access Journals (Sweden)

    Ranasinghe Priyanga

    2012-09-01

    Full Text Available Abstract Background The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. This study evaluates computer literacy among first year medical students in Sri Lanka. Methods The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August and September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in 6 domains: common software packages, operating systems, database management and the usage of internet and E-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and the other covariates as independent variables. Results The sample size was 181 (response rate 95.3%); 49.7% were male. The majority of students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge by a formal training programme (64.1%), self-learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of students (75.7%) expressed their willingness to attend a formal computer training programme at the faculty. The mean score on the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). 47.9% of students scored less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province, and students owning a computer had a significantly higher mean score than other students (p < …). Conclusion Sri Lankan medical undergraduates had a low-intermediate level of computer literacy.

  17. Obscure pulmonary masses: bronchial impaction revealed by CT

    International Nuclear Information System (INIS)

    Pugatch, R.D.; Gale, M.E.

    1983-01-01

    Dilated bronchi impacted with mucus or tumor are recognized on standard chest radiographs because they are surrounded by aerated pulmonary parenchyma. When imaged in different projections, these lesions produce a variety of appearances that are generally familiar. This report characterizes less familiar computed tomographic (CT) findings in eight patients with pathologic bronchial distension of congenital, neoplastic, or infectious etiologies and correlates them with chest films. In seven patients, CT readily revealed dilated bronchi and/or regional lung hypodensity. In four of these cases, CT led to the initial suspicion of dilated bronchi. CT should be used early in the evaluation of atypical pulmonary mass lesions or to confirm suspected bronchial impaction because of the high probability it will reveal diagnostic features

  18. Is the bipyridyl thorium metallocene a low-valent thorium complex? A combined experimental and computational study

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Wenshan; Lukens, Wayne W.; Zi, Guofu; Maron, Laurent; Walter, Marc D.

    2012-01-12

    Bipyridyl thorium metallocenes [η5-1,2,4-(Me3C)3C5H2]2Th(bipy) (1) and [η5-1,3-(Me3C)2C5H3]2Th(bipy) (2) have been investigated by magnetic susceptibility and computational studies. The magnetic susceptibility data reveal that 1 and 2 are not diamagnetic; rather, they behave as temperature independent paramagnets (TIPs). To rationalize this observation, density functional theory (DFT) and complete active space SCF (CASSCF) calculations have been undertaken, which indicated that Cp2Th(bipy) has indeed a Th(IV)(bipy2-) ground state (f0d0(bipy2-), S = 0), but the open-shell singlet (f0d1(bipy•-), S = 0) (almost degenerate with its triplet congener) is lying only 9.2 kcal/mol higher in energy. Complexes 1 and 2 react cleanly with Ph2CS to give [η5-1,2,4-(Me3C)3C5H2]2Th[(bipy)(SCPh2)] (3) and [η5-1,3-(Me3C)2C5H3]2Th[(bipy)(SCPh2)] (4), respectively, in quantitative conversions. Since no intermediates were observed experimentally, this reaction was also studied computationally. Coordination of Ph2CS to 2 in its S = 0 ground state is not possible, but Ph2CS can coordinate to 2 in its triplet state (S = 1), upon which a single electron transfer (SET) from the (bipy2-) fragment to Ph2CS followed by C-C coupling takes place.

  19. Preoperative irradiation of an extracerebral cavernous hemangioma in the middle fossa. Follow-up study with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, S; Kurihara, M; Mori, K [Nagasaki Univ. (Japan). School of Medicine; Amamoto, Y

    1981-02-01

    This is a report of a case of extracerebral cavernous hemangioma in the middle fossa in which total removal was carried out after radiotherapy. Follow-up studies with computed tomography during and after irradiation are presented. A 44-year-old housewife complained of decreased vision in both eyes and paresis of the left upper and lower limbs. CT scan revealed a slightly high density area in the right middle cranial fossa which was markedly enhanced with contrast media. Right carotid angiography demonstrated a large avascular mass in the right middle fossa; no feeding artery or draining vein was visualized except a faint irregular stain in the venous phase. An attempt at total removal of the tumor failed because of extensive hemorrhage from the tumor. Histological examination revealed a cavernous hemangioma. Irradiation with a total dose of 5000 rads was delivered. After irradiation, CT scan revealed a marked decrease in the size and EMI number of the tumor. At this stage, a hypervascular mass lesion with feeding arteries was noted on conventional angiography. Tumor stain was also visualized on prolonged injection angiography. In the second operation, removal of the tumor was performed without any difficulty, and hemorrhage was controlled easily by electrocoagulation. Histology revealed a marked narrowing of vessels with an increase in connective tissue. In the central part of the specimen, findings of coagulation necrosis and intraluminal thrombus formation were noted, which were attributed to the influence of radiation. It is concluded that in the case of an extracerebral cavernous hemangioma with massive hemorrhage, preoperative irradiation of 3000 - 5000 rads is a method of choice.

  20. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    Science.gov (United States)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  1. Reciprocal Questioning and Computer-based Instruction in Introductory Auditing: Student Perceptions.

    Science.gov (United States)

    Watters, Mike

    2000-01-01

    An auditing course used reciprocal questioning (Socratic method) and computer-based instruction. Separate evaluations by 67 students revealed a strong aversion to the Socratic method; students expected professors to lecture. They showed a strong preference for the computer-based assignment. (SK)

  2. Handheld computers for self-administered sensitive data collection: A comparative study in Peru

    Directory of Open Access Journals (Sweden)

    Hughes James P

    2008-03-01

    Full Text Available Abstract Background Low-cost handheld computers (PDAs) potentially represent an efficient tool for collecting sensitive data in surveys. The goal of this study is to evaluate the quality of sexual behavior data collected with handheld computers in comparison with paper-based questionnaires. Methods A PDA-based program for data collection was developed using Open-Source tools. In two cross-sectional studies, we compared data concerning sexual behavior collected with paper forms to data collected with PDA-based forms in Ancon (Lima). Results The first study enrolled 200 participants (18–29 years). General agreement between data collected with paper format and handheld computers was 86%. Agreement for categorical variables was between 70.5% and 98.5% (Kappa: 0.43–0.86), while agreement for numeric variables was between 57.1% and 79.8% (Spearman: 0.76–0.95). Agreement and correlation were higher in those who had completed at least high school than in those with less education. The second study enrolled 198 participants. Rates of responses to sensitive questions were similar between both kinds of questionnaires. However, the numbers of inconsistencies (p = 0.0001) and missing values (p = 0.001) were significantly higher in paper questionnaires. Conclusion This study showed the value of the use of handheld computers for collecting sensitive data, since a high level of agreement between paper and PDA responses was reached. In addition, a lower number of inconsistencies and missing values were found with the PDA-based system. This study has demonstrated that it is feasible to develop a low-cost application for handheld computers, and that PDAs are feasible alternatives for collecting field data in a developing country.
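
    The agreement statistics quoted above are standard; Cohen's kappa, for instance, corrects raw agreement for agreement expected by chance. A minimal sketch on an invented paper-vs-PDA answer set:

```python
def cohens_kappa(pairs):
    """Cohen's kappa: chance-corrected agreement for two paired ratings."""
    n = len(pairs)
    cats = {c for pair in pairs for c in pair}
    po = sum(a == b for a, b in pairs) / n                   # observed agreement
    pe = sum((sum(a == c for a, _ in pairs) / n) *
             (sum(b == c for _, b in pairs) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

# Invented paper-vs-PDA answers to one yes/no question
pairs = [("y", "y"), ("y", "y"), ("n", "n"), ("n", "y"), ("n", "n"),
         ("y", "y"), ("n", "n"), ("y", "n"), ("y", "y"), ("n", "n")]
kappa = cohens_kappa(pairs)  # 0.8 observed agreement, kappa = 0.6
```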

  3. Heavy Lift Vehicle (HLV) Avionics Flight Computing Architecture Study

    Science.gov (United States)

    Hodson, Robert F.; Chen, Yuan; Morgan, Dwayne R.; Butler, A. Marc; Sdhuh, Joseph M.; Petelle, Jennifer K.; Gwaltney, David A.; Coe, Lisa D.; Koelbl, Terry G.; Nguyen, Hai D.

    2011-01-01

    A NASA multi-Center study team was assembled from LaRC, MSFC, KSC, JSC and WFF to examine potential flight computing architectures for a Heavy Lift Vehicle (HLV) to better understand avionics drivers. The study examined Design Reference Missions (DRMs) and vehicle requirements that could impact the vehicle's avionics. The study considered multiple self-checking and voting architectural variants and examined reliability, fault-tolerance, mass, power, and redundancy management impacts. Furthermore, a goal of the study was to develop the skills and tools needed to rapidly assess additional architectures should requirements or assumptions change.
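
    A voting architecture of the kind the study weighs reduces, at its core, to redundant channels computing independently and a voter forwarding the majority value while flagging disagreement to redundancy management. A generic sketch of that core (our illustration, not NASA flight code):

```python
def majority_vote(channels):
    """Forward the majority value from redundant channels; raise on no majority."""
    counts = {}
    for v in channels:
        counts[v] = counts.get(v, 0) + 1
    winner, n = max(counts.items(), key=lambda kv: kv[1])
    if n <= len(channels) // 2:
        # a real system would flag this to fault detection, isolation, recovery
        raise ValueError("no majority: channel disagreement")
    return winner

out = majority_vote([0x2A, 0x2A, 0x7F])  # one faulty channel is out-voted
```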

  4. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    Science.gov (United States)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  5. Study on the application of mobile internet cloud computing platform

    Science.gov (United States)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    Advances in computer technology have promoted the application of the cloud computing platform, which is in essence a resource-service model that meets users' needs for different resources through adjustment and exchange across multiple aspects. Cloud computing offers advantages in many respects: it reduces the difficulty of operating the system and makes it easy for users to search, acquire and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization of computer technology has driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing distributes computations across a large number of distributed computers and thereby implements a connected service spanning multiple computers. Digital libraries, as a typical representative of cloud computing applications, can therefore be used to analyze the key technologies of cloud computing.

  6. Large-scale computations on histology images reveal grade-differentiating parameters for breast cancer

    Directory of Open Access Journals (Sweden)

    Katsinis Constantine

    2006-10-01

    Full Text Available Abstract Background Tumor classification is inexact and largely dependent on the qualitative pathological examination of the images of the tumor tissue slides. In this study, our aim was to develop an automated computational method to classify Hematoxylin and Eosin (H&E) stained tissue sections based on cancer tissue texture features. Methods Image processing of histology slide images was used to detect and identify adipose tissue, extracellular matrix, morphologically distinct cell nuclei types, and the tubular architecture. The texture parameters derived from image analysis were then applied to classify images in a supervised classification scheme, using the histologic grade of a testing set as guidance. Results The histologic grade assigned by pathologists to invasive breast carcinoma images strongly correlated with both the presence and extent of cell nuclei with dispersed chromatin and the architecture, specifically the extent of presence of tubular cross sections. The two parameters that differentiated tumor grade found in this study were (1) the number density of cell nuclei with dispersed chromatin and (2) the number density of tubular cross sections, identified through image processing as white blobs surrounded by a continuous string of cell nuclei. Classification based on subdivisions of a whole slide image containing a high concentration of cancer cell nuclei consistently agreed with the grade classification of the entire slide. Conclusion The automated image analysis and classification presented in this study demonstrate the feasibility of developing clinically relevant classification of histology images based on micro-texture. This method provides pathologists an invaluable quantitative tool for evaluating the components of the Nottingham system for breast tumor grading, avoiding intra-observer variability and thus increasing the consistency of the decision-making process.
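
    The "number density" features described above reduce, computationally, to counting segmented objects per unit area. A toy connected-component count on a binary mask (the mask values and the 4-connectivity choice are our illustrative assumptions, not the paper's pipeline):

```python
def count_blobs(grid):
    """Count 4-connected foreground blobs (e.g. detected nuclei) in a binary mask."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                blobs += 1
                stack = [(r, c)]            # iterative flood fill
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and grid[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return blobs

grid = [  # invented binary segmentation mask
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 1, 0],
]
density = count_blobs(grid) / (len(grid) * len(grid[0]))  # blobs per unit area
```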

  7. Computational Studies of Ionic Liquids

    National Research Council Canada - National Science Library

    Boatz, Jerry

    2004-01-01

    The structures and relative energies of the six possible N-protonated structures of the 1,5-diamino-1,2,3,4-tetrazolium cation have been computed at the B3LYP(3)/6-311G(d,p) and MP2/6-311G(d,p) levels of theory...

  8. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses at different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  9. Computational Study of Hypersonic Boundary Layer Stability on Cones

    Science.gov (United States)

    Gronvall, Joel Edwin

    Due to the complex nature of boundary layer laminar-turbulent transition in hypersonic flows and the resultant effect on the design of re-entry vehicles, there remains considerable interest in developing a deeper understanding of the underlying physics. To that end, the use of experimental observations and computational analysis in a complementary manner will provide the greatest insights. It is the intent of this work to provide such an analysis for two ongoing experimental investigations. The first focuses on the hypersonic boundary layer transition experiments for a slender cone that are being conducted at JAXA's free-piston shock tunnel HIEST facility. Of particular interest are the measurements of disturbance frequencies associated with transition at high enthalpies. The computational analysis provided for these cases included two-dimensional CFD mean flow solutions for use in boundary layer stability analyses. The disturbances in the boundary layer were calculated using the linear parabolized stability equations. Estimates for transition locations, comparisons of measured and computed disturbance frequencies, and a determination of the type of disturbances present were made. It was found that for the cases where the disturbances were measured at locations where the flow was still laminar but nearly transitional, the highly amplified disturbances showed reasonable agreement with the computations. Additionally, an investigation of the effects of finite-rate chemistry and vibrational excitation on flows over cones was conducted for a set of theoretical operational conditions at the HIEST facility. The second study focuses on transition in three-dimensional hypersonic boundary layers, and for this the cone-at-angle-of-attack experiments being conducted at the Boeing/AFOSR Mach-6 quiet tunnel at Purdue University were examined. Specifically, the effect of surface roughness on the development of the stationary crossflow instability is investigated.
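Linear stability workflows of this kind typically end in an e^N-type amplification estimate: the streamwise growth rates produced by the stability equations are integrated into an N-factor, and transition is assumed where N reaches an empirical threshold. A rough sketch with invented growth-rate data and an assumed critical value of N = 9:

```python
def n_factor(x, growth_rate):
    """Integrate the local amplification rate over x (trapezoid rule)
    to obtain the logarithmic amplitude ratio N(x) = ln(A/A0)."""
    N = [0.0]
    for i in range(1, len(x)):
        N.append(N[-1] + 0.5 * (growth_rate[i] + growth_rate[i - 1]) * (x[i] - x[i - 1]))
    return N

# Synthetic streamwise stations and growth rates (illustrative only)
x = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
sigma = [0.0, 10.0, 20.0, 25.0, 28.0, 30.0]
N = n_factor(x, sigma)
# Estimated transition location: first station where N exceeds the assumed threshold
x_tr = next((xi for xi, Ni in zip(x, N) if Ni >= 9.0), None)
```

The threshold itself is facility-dependent; quiet tunnels such as the Purdue facility support higher critical N than conventional noisy tunnels.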

  10. Advances in computer applications in radioactive tracer studies of the circulation

    International Nuclear Information System (INIS)

    Wagner, H.N. Jr.; Klingensmith, W.C. III; Knowles, L.G.; Lotter, M.G.; Natarajan, T.K.

    1977-01-01

    Advances in computer technology since the last IAEA symposium on medical radionuclide imaging have now made possible a new approach to the study of physiological processes that promises to improve greatly our perception of body functions and structures. We have developed procedures, called ''compressed time imaging'' (CTI), that display serial images obtained over periods of minutes and hours at framing rates of approximately 16 to 60 per minute. At other times, ''real'' or ''expanded time imaging'' is used, depending on the process under study. Designed initially to study the beating heart, such multidimensional time studies are now being extended to the cerebral and other regional circulations, as well as to other organ systems. The improved imaging methods provide a new approach to space and time in the study of physiology and are supplemented by quantitative analysis of data displayed on the television screen of the computer. (author)
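Compressed time imaging can be thought of as temporal rebinning: many raw frames are averaged into fewer display frames so that a long acquisition replays quickly. A toy sketch (single-pixel frames and a 3:1 compression ratio are assumptions, not the authors' implementation):

```python
def compress_time(frames, bin_size):
    """Average consecutive frames in groups of bin_size, so a long
    acquisition can be replayed at a faster effective framing rate."""
    out = []
    for i in range(0, len(frames) - bin_size + 1, bin_size):
        group = frames[i:i + bin_size]
        n = len(group)
        # element-wise mean over the group (frames are flat pixel lists here)
        out.append([sum(p) / n for p in zip(*group)])
    return out

# Six single-pixel "frames" compressed 3:1 into two display frames
frames = [[0.0], [3.0], [6.0], [2.0], [4.0], [6.0]]
compressed = compress_time(frames, 3)
print(compressed)  # [[3.0], [4.0]]
```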

  11. Computing UV/vis spectra using a combined molecular dynamics and quantum chemistry approach: bis-triazin-pyridine (BTP) ligands studied in solution.

    Science.gov (United States)

    Höfener, Sebastian; Trumm, Michael; Koke, Carsten; Heuser, Johannes; Ekström, Ulf; Skerencak-Frech, Andrej; Schimmelpfennig, Bernd; Panak, Petra J

    2016-03-21

    We report a combined computational and experimental study to investigate the UV/vis spectra of 2,6-bis(5,6-dialkyl-1,2,4-triazin-3-yl)pyridine (BTP) ligands in solution. In order to study molecules in solution using theoretical methods, force-field parameters for the ligand-water interaction are adjusted to ab initio quantum chemical calculations. Based on these parameters, molecular dynamics (MD) simulations are carried out from which snapshots are extracted as input to quantum chemical excitation-energy calculations to obtain UV/vis spectra of BTP ligands in solution using time-dependent density functional theory (TDDFT) employing the Tamm-Dancoff approximation (TDA). The range-separated CAM-B3LYP functional is used to avoid large errors for charge-transfer states occurring in the electronic spectra. In order to study environment effects with theoretical methods, the frozen-density embedding scheme is applied. This computational procedure yields electronic spectra calculated at the (range-separated) DFT level of theory in solution, revealing solvatochromic shifts upon solvation of up to about 0.6 eV. Comparison to experimental data shows a significantly improved agreement compared to vacuum calculations and enables the analysis of relevant excitations for the line shape in solution.
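A common way to turn per-snapshot excitation energies into a solution-phase band shape is to sum broadened lines over the MD snapshots. The sketch below assumes Gaussian broadening with an arbitrary 0.1 eV width and invented excitation energies; it is not the authors' code:

```python
import math

def broadened_spectrum(snapshot_energies, grid, width=0.1):
    """Sum normalized Gaussians centred on each snapshot's excitation
    energy (eV) to approximate the solution-phase band shape."""
    n = len(snapshot_energies)
    spec = []
    for e in grid:
        s = sum(math.exp(-((e - e0) / width) ** 2 / 2) for e0 in snapshot_energies)
        spec.append(s / (n * width * math.sqrt(2 * math.pi)))
    return spec

# Hypothetical excitation energies (eV) from a few MD snapshots
energies = [3.9, 4.0, 4.1, 4.0]
grid = [3.5 + 0.01 * i for i in range(101)]
spec = broadened_spectrum(energies, grid)
peak = grid[max(range(len(grid)), key=lambda i: spec[i])]  # band maximum near 4.0 eV
```

A solvatochromic shift would then show up as a displacement of this band maximum relative to the vacuum calculation.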

  12. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  13. Effectiveness of Computer-Assisted Mathematics Education (CAME) over Academic Achievement: A Meta-Analysis Study

    Science.gov (United States)

    Demir, Seda; Basol, Gülsah

    2014-01-01

    The aim of the current study is to determine the overall effects of Computer-Assisted Mathematics Education (CAME) on academic achievement. After an extensive review of the literature, studies using Turkish samples and observing the effects of Computer-Assisted Education (CAE) on mathematics achievement were examined. As a result of this…

  14. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  15. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  16. Computationally determining the salience of decision points for real-time wayfinding support

    Directory of Open Access Journals (Sweden)

    Makoto Takemiya

    2012-06-01

    Full Text Available This study introduces the concept of computational salience to explain the discriminatory efficacy of decision points, which in turn may have applications to providing real-time assistance to users of navigational aids. This research compared algorithms for calculating the computational salience of decision points and validated the results via three methods: high-salience decision points were used to classify wayfinders; salience scores were used to weight a conditional probabilistic scoring function for real-time wayfinder performance classification; and salience scores were correlated with wayfinding-performance metrics. As an exploratory step to linking computational and cognitive salience, a photograph-recognition experiment was conducted. Results reveal a distinction between algorithms useful for determining computational and cognitive saliences. For computational salience, information about the structural integration of decision points is effective, while information about the probability of decision-point traversal shows promise for determining cognitive salience. Limitations from only using structural information and motivations for future work that include non-structural information are elicited.
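One plausible proxy for the "structural integration" of a decision point is its closeness centrality in the street-network graph. The toy network below and the choice of closeness (rather than the specific algorithms the study compared) are assumptions:

```python
from collections import deque

def closeness(graph, node):
    """Closeness centrality: reciprocal of the mean shortest-path
    distance from `node` to every reachable node (unweighted BFS)."""
    dist = {node: 0}
    q = deque([node])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    others = [d for n, d in dist.items() if n != node]
    return (len(others) / sum(others)) if others else 0.0

# Toy street network: the middle intersection integrates the network best
graph = {
    "A": ["B"], "B": ["A", "C"],
    "C": ["B", "D"], "D": ["C", "E"], "E": ["D"],
}
salience = {n: closeness(graph, n) for n in graph}
best = max(salience, key=salience.get)  # "C"
```

High-salience decision points identified this way could then be used, as in the study, to weight a scoring function for classifying wayfinders in real time.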

  17. A revealed-preference study of behavioural impacts of real-time traffic information

    NARCIS (Netherlands)

    Knockaert, J.S.A.; Tseng, Y.; Verhoef, E.T.

    2013-01-01

    In the present study, we investigate the impact of real-time traffic information on traveller behaviour by using repeated day-to-day revealed-preference (RP) observations from a reward experiment. We estimate a trip scheduling model of morning peak behaviour that allows us to determine the impact of

  18. Muoniated radical states in the group 16 elements: Computational studies

    International Nuclear Information System (INIS)

    Macrae, Roderick M.

    2009-01-01

    Recent experimental studies on positive muon implantation in silicon, selenium, and tellurium have been interpreted on the basis that the primary paramagnetic species observed is XMu (X=S, Se, or Te), the muonium-substituted analog of the appropriate diatomic chalcogen monohydride radical. However, temperature-dependent signal visibility, broadening, and hyperfine shift effects remain puzzling. The interplay of degeneracy, spin-orbit coupling, and vibrational averaging in these species makes them computationally challenging despite their small size. In this work computational studies are carried out on all hydrogen isotopomers of the series OH, SH, SeH, and TeH. Several different methodological approaches are compared, and the effects of wavefunction symmetry, spin-orbit coupling, and zero-point vibrational corrections on the isotropic and anisotropic components of the hyperfine interaction are examined. Additionally, some models of the Mu site in rhombic sulfur are briefly considered.

  19. An Exploratory Study of the Implementation of Computer Technology in an American Islamic Private School

    Science.gov (United States)

    Saleem, Mohammed M.

    2009-01-01

    This exploratory study of the implementation of computer technology in an American Islamic private school leveraged the case study methodology and ethnographic methods informed by symbolic interactionism and the framework of the Muslim Diaspora. The study focused on describing the implementation of computer technology and identifying the…

  20. Experience of computed tomographic myelography and discography in cervical problem

    Energy Technology Data Exchange (ETDEWEB)

    Nakatani, Shigeru; Yamamoto, Masayuki; Uratsuji, Masaaki; Suzuki, Kunio; Matsui, Eigo [Hyogo Prefectural Awaji Hospital, Sumoto, Hyogo (Japan); Kurihara, Akira

    1983-06-01

    CTM (computed tomographic myelography) was performed on 15 cases of cervical lesions, and on 5 of them, CTD (computed tomographic discography) was also performed. CTM revealed the intervertebral state and, in combination with CTD, provided more accurate information. The combined method of CTM and CTD was useful for soft disc herniation.

  1. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schütler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVI

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.
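The classical magnetic spin models mentioned here are typified by the 2D Ising model sampled with the Metropolis algorithm; a compact sketch (J = 1, periodic boundaries, lattice size and temperature chosen only for illustration):

```python
import math
import random

def metropolis_ising(L=8, T=1.0, steps=20000, seed=1):
    """Metropolis Monte Carlo for a 2D Ising ferromagnet (J = 1,
    periodic boundaries); returns |magnetization| per spin."""
    rng = random.Random(seed)
    s = [[1 for _ in range(L)] for _ in range(L)]  # cold (fully ordered) start
    for _ in range(steps):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = s[(i + 1) % L][j] + s[(i - 1) % L][j] + s[i][(j + 1) % L] + s[i][(j - 1) % L]
        dE = 2 * s[i][j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    m = sum(sum(row) for row in s) / (L * L)
    return abs(m)

# Well below the critical temperature (~2.27), the lattice stays strongly ordered
m_low_T = metropolis_ising(T=1.0)
```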

  2. A study of visual and musculoskeletal health disorders among computer professionals in NCR Delhi

    Directory of Open Access Journals (Sweden)

    Talwar Richa

    2009-01-01

    Full Text Available Objective: To study the prevalence of health disorders among computer professionals and their association with working environment conditions. Study design: Cross-sectional. Materials and Methods: A sample of 200 computer professionals from Delhi and the NCR, including software developers, call centre workers, and data entry workers. Result: The prevalence of visual problems in the study group was 76% (152/200), and musculoskeletal problems were reported by 76.5% (153/200). Visual complaints increased gradually with the number of hours spent working on computers daily, and the same relation held for musculoskeletal problems as well. Visual problems were fewer in persons using an antiglare screen and in those with adequate lighting in the room. Musculoskeletal problems were significantly fewer among those using cushioned chairs and a soft keypad. Conclusion: A significant proportion of the computer professionals were found to have health problems, which indicates that the occupational health of people working in the computer field needs to be emphasized as a field of concern in occupational health.

  3. A morphological study of the mandibular molar region using reconstructed helical computed tomographic images

    International Nuclear Information System (INIS)

    Tsuno, Hiroaki; Noguchi, Makoto; Noguchi, Akira; Yoshida, Keiko; Tachinami, Yasuharu

    2010-01-01

    This study investigated the morphological variance in the mandibular molar region using reconstructed helical computed tomographic (CT) images. In addition, we discuss the necessity of CT scanning as part of the preoperative assessment process for dental implantation, by comparing the results with the findings of panoramic radiography. Sixty patients examined using CT as part of the preoperative assessment for dental implantation were analyzed. Reconstructed CT images were used to evaluate the bone quality and cross-sectional bone morphology of the mandibular molar region. The mandibular cortical index (MCI) and X-ray density ratio of this region were assessed using panoramic radiography in order to analyze the correlation between the findings of the CT images and panoramic radiography. CT images showed that there was a decrease in bone quality in cases with high MCI. Cross-sectional CT images revealed that the undercuts on the lingual side in the highly radiolucent areas in the basal portion were more frequent than those in the alveolar portion. This study showed that three-dimensional reconstructed CT images can help to detect variances in mandibular morphology that might be missed by panoramic radiography. In conclusion, it is suggested that CT should be included as an important examination tool before dental implantation. (author)

  4. Implementation of PHENIX trigger algorithms on massively parallel computers

    International Nuclear Information System (INIS)

    Petridis, A.N.; Wohn, F.K.

    1995-01-01

    The event selection requirements of contemporary high energy and nuclear physics experiments are met by the introduction of on-line trigger algorithms which identify potentially interesting events and reduce the data acquisition rate to levels that are manageable by the electronics. Such algorithms, being parallel in nature, can be simulated off-line using massively parallel computers. The PHENIX experiment intends to investigate the possible existence of a new phase of matter called the quark-gluon plasma, theorized to have existed in the very early stages of the evolution of the universe, by studying collisions of heavy nuclei at ultra-relativistic energies. Such interactions can also reveal important information regarding the structure of the nucleus and mandate a thorough investigation of the simpler proton-nucleus collisions at the same energies. The complexity of PHENIX events, and the need to analyze and also simulate them at rates similar to the data collection ones, imposes enormous computation demands. This work is a first effort to implement PHENIX trigger algorithms on parallel computers and to study the feasibility of using such machines to run the complex programs necessary for the simulation of the PHENIX detector response. Fine and coarse grain approaches have been studied and evaluated. Depending on the application, the performance of a massively parallel computer can be much better or much worse than that of a serial workstation. A comparison between single instruction and multiple instruction computers is also made, and possible applications of the single instruction machines to high energy and nuclear physics experiments are outlined. copyright 1995 American Institute of Physics

  5. Educational Computer Use in Leisure Contexts: A Phenomenological Study of Adolescents' Experiences at Internet Cafes

    Science.gov (United States)

    Cilesiz, Sebnem

    2009-01-01

    Computer use is a widespread leisure activity for adolescents. Leisure contexts, such as Internet cafes, constitute specific social environments for computer use and may hold significant educational potential. This article reports a phenomenological study of adolescents' experiences of educational computer use at Internet cafes in Turkey. The…

  6. Computation material science of structural-phase transformation in casting aluminium alloys

    Science.gov (United States)

    Golod, V. M.; Dobosh, L. Yu

    2017-04-01

    Successive stages of computer simulation of the formation of the casting microstructure under non-equilibrium conditions of crystallization of multicomponent aluminum alloys are presented. On the basis of computational thermodynamics and heat transfer during solidification of macroscale shaped castings, the boundary conditions of local heat exchange are specified for mesoscale modeling of the non-equilibrium formation of the solid phase and of the component redistribution between phases during coalescence of secondary dendrite branches. Computer analysis of structural-phase transitions is based on the principle of the additive physico-chemical effect of the alloy components in the process of diffusional-capillary morphological evolution of the dendrite structure, and on local dendrite heterogeneity, whose stochastic nature and extent are revealed by metallographic study and by modeling with the Monte Carlo method. The integrated computational materials science tools for alloy research are focused on, and implemented for, analysis of the multiple-factor system of casting processes and prediction of the casting microstructure.

  7. Defining Effectiveness Using Finite Sets A Study on Computability

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Haeusler, Edward H.; Garcia, Alex

    2016-01-01

    finite sets and uses category theory as its mathematical foundations. The model relies on the fact that every function between finite sets is computable, and that the finite composition of such functions is also computable. Our approach is an alternative to the traditional model-theoretical works which rely on (ZFC) set theory as a mathematical foundation, and it is also novel compared with existing works using category theory to approach computability results. Moreover, we show how to encode Turing machine computations in the model, thus concluding the model expresses...
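The core observation, that functions between finite sets are trivially computable and closed under composition, can be made concrete by representing each function exhaustively as a dictionary; this encoding is illustrative, not the paper's categorical construction:

```python
def compose(g, f):
    """Composition g∘f of functions between finite sets, with both
    functions represented exhaustively as dictionaries. Every such
    function is computable by table lookup, and composition of two
    lookup tables is again a lookup table."""
    return {x: g[f[x]] for x in f}

# f : {0,1,2} -> {"a","b"},  g : {"a","b"} -> {True, False}
f = {0: "a", 1: "b", 2: "a"}
g = {"a": True, "b": False}
h = compose(g, f)
print(h)  # {0: True, 1: False, 2: True}
```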

  8. Longitudinal Study of Factors Impacting the Implementation of Notebook Computer Based CAD Instruction

    Science.gov (United States)

    Goosen, Richard F.

    2009-01-01

    This study provides information for higher education leaders that have or are considering conducting Computer Aided Design (CAD) instruction using student owned notebook computers. Survey data were collected during the first 8 years of a pilot program requiring engineering technology students at a four year public university to acquire a notebook…

  9. Serendipity? Are There Gender Differences in the Adoption of Computers? A Case Study.

    Science.gov (United States)

    Vernon-Gerstenfeld, Susan

    1989-01-01

    Discusses a study about the effect of learning styles of patent examiners on adoption of computers. Subjects' amount of computer use was a function of learning style, age, comfort after training, and gender. Findings indicate that women showed a greater propensity to adopt than men. Discusses implications for further research. (JS)

  10. Costs of cloud computing for a biometry department. A case study.

    Science.gov (United States)

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into ac count, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.

  11. Computing requirements for S.S.C. accelerator design and studies

    International Nuclear Information System (INIS)

    Dragt, A.; Talman, R.; Siemann, R.; Dell, G.F.; Leemann, B.; Leemann, C.; Nauenberg, U.; Peggs, S.; Douglas, D.

    1984-01-01

    We estimate the computational hardware resources that will be required for accelerator physics studies during the design of the Superconducting SuperCollider. It is found that both Class IV and Class VI facilities (1) will be necessary. We describe a user environment for these facilities that is desirable within the context of accelerator studies. An acquisition scenario for these facilities is presented

  12. A computer-based time study system for timber harvesting operations

    Science.gov (United States)

    Jingxin Wang; Joe McNeel; John Baumgras

    2003-01-01

    A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system resides on the MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, data transfer interface, and data storage...

  13. Study of tip loss corrections using CFD rotor computations

    DEFF Research Database (Denmark)

    Shen, Wen Zhong; Zhu, Wei Jun; Sørensen, Jens Nørkær

    2014-01-01

    Tip loss correction is known to play an important role for engineering prediction of wind turbine performance. There are two different types of tip loss corrections: tip corrections on momentum theory and tip corrections on airfoil data. In this paper, we study the latter using detailed CFD computations for wind turbines with sharp tip. Using the technique of determination of angle of attack and the CFD results for a NordTank 500 kW rotor, airfoil data are extracted and a new tip loss function on airfoil data is derived. To validate, BEM computations with the new tip loss function are carried out and compared with CFD results for the NordTank 500 kW turbine and the NREL 5 MW turbine. Comparisons show that BEM with the new tip loss function can predict correctly the loading near the blade tip.
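For reference, the baseline momentum-theory correction that tip loss functions on airfoil data are compared against is Prandtl's factor; a direct transcription (the numeric inputs below are arbitrary, not the paper's rotors):

```python
import math

def prandtl_tip_loss(r, R, B, phi):
    """Classical Prandtl tip loss factor F for a rotor with B blades,
    local radius r, tip radius R and inflow angle phi (radians):
    F = (2/pi) * arccos(exp(-B (R - r) / (2 r sin(phi))))."""
    f = B * (R - r) / (2.0 * r * math.sin(phi))
    return (2.0 / math.pi) * math.acos(math.exp(-f))

# F tends to 1 inboard and drops toward 0 at the blade tip
inboard = prandtl_tip_loss(r=10.0, R=20.0, B=3, phi=0.1)
near_tip = prandtl_tip_loss(r=19.9, R=20.0, B=3, phi=0.1)
```

In a BEM code this factor multiplies the momentum-theory loading; a tip correction on airfoil data instead modifies the lift and drag coefficients directly.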

  14. Comparative study of scintigraphy, ultrasonography and computed tomography in the evaluation of liver tumours

    International Nuclear Information System (INIS)

    Tohyama, Junko; Ishigaki, Takeo; Ishikawa, Tsutomu

    1982-01-01

    A comparative study of scintigraphy, ultrasonography and computed tomography in 67 proven patients with clinically suspected liver tumours was reported. Scintigraphy was superior in sensitivity to ultrasonography and computed tomography. In specificity, however, scintigraphy was inferior to the other two. The diagnostic efficacy of ultrasonography and computed tomography in detecting focal masses of the liver was not greatly different, and simultaneous interpretation of the ultrasonogram and computed tomogram was more helpful than independent interpretation, so they were considered complementary. In conclusion, scintigraphy was thought to be the initial procedure in the diagnostic approach for focal liver masses, with ultrasonography as the second procedure because it poses no radiation hazard. Computed tomography should then follow. (author)
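Sensitivity and specificity, the two figures on which the modalities are ranked here, come straight from a 2x2 table; the counts below are hypothetical, not the study's data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN): fraction of true lesions detected.
    Specificity = TN/(TN+FP): fraction of lesion-free cases correctly cleared."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one imaging modality over 67 patients
sens, spec = sensitivity_specificity(tp=40, fn=5, tn=15, fp=7)
print(round(sens, 3), round(spec, 3))  # 0.889 0.682
```

A modality can thus lead on sensitivity (few missed lesions) while trailing on specificity (many false alarms), exactly the trade-off reported for scintigraphy.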

  15. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
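Ordering jobs by predicted runtime before submission is essentially longest-processing-time-first scheduling: long jobs go out first to the least-loaded workers, shrinking billed idle time. A sketch with invented genome-pair runtimes (the job names and hours are hypothetical):

```python
def schedule_longest_first(job_runtimes, n_workers):
    """Greedy longest-processing-time-first assignment: submit jobs in
    decreasing estimated runtime to the currently least-loaded worker,
    which reduces idle (but still billed) instance hours."""
    loads = [0.0] * n_workers
    assignment = [[] for _ in range(n_workers)]
    for job, t in sorted(job_runtimes.items(), key=lambda kv: -kv[1]):
        w = loads.index(min(loads))  # least-loaded worker so far
        loads[w] += t
        assignment[w].append(job)
    return assignment, max(loads)  # makespan = billed wall-clock hours

# Hypothetical estimated runtimes (hours) for genome-to-genome comparisons
jobs = {"g1-g2": 8.0, "g1-g3": 5.0, "g2-g3": 4.0, "g3-g4": 3.0, "g1-g4": 2.0}
plan, makespan = schedule_longest_first(jobs, n_workers=2)
print(makespan)  # 11.0
```

With 22 total job-hours on 2 workers, the 11.0-hour makespan is perfectly balanced; random submission order can leave one worker idle while the other finishes a long job.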

  16. Persistent Neck and Shoulder Pains among Computer Office Workers: A Longitudinal Study

    Directory of Open Access Journals (Sweden)

    Farideh Sadeghian

    2012-11-01

    Full Text Available Please cite this article as: Sadeghian F, Raei M, Amiri M. Persistent Neck and Shoulder Pains among Computer Office Workers: A Longitudinal Study. Arch Hyg Sci 2012;1(2):33-40. Background & Aims of the Study: In developing countries, with the increasing use of computer systems, millions of computer workers are at high risk of neck and shoulder pains. The aim of this study was to assess the relationships between work-related physical and psychosocial factors and persistent neck and shoulder pains among computer office workers. Materials & Methods: This longitudinal study with a 1-year follow-up was conducted among all eligible computer office workers (n=182) of Shahroud universities (northeastern Iran) in 2009-2010. The "Cultural and Psychosocial Influences on Disability (CUPID)" questionnaire was used to collect data on demographic characteristics; physical, organizational and psychosocial factors at work; and neck and shoulder symptoms. Chi-square tests and logistic regression analysis were used to analyze the data with SPSS version 16. Results: Computer office workers with a mean±SD age of 32.1±6.7 years and mean±SD weekly work hours of 47.4±8.2 participated in this study. At baseline, 39.6% of workers reported neck and shoulder pains. At one-year follow-up, 59.7% of them reported neck pain and 51.3% reported shoulder pain. Significant relationships were found between persistence of neck and shoulder pains and age, gender, and decision latitude at work. Conclusions: Although neck and shoulder pains were equally prevalent in the study group at baseline, after one year of follow-up persistent neck pain was more common than shoulder pain. Age, gender, and decision latitude at work were identified as risk factors for both pains. References: 1. Buckle PW, Devereux JJ. The nature of work-related neck and upper limb musculoskeletal disorders. Appl Ergon 2002;33(3):207–17. 2. Tinubu BMS, Mbada CE, Oyeyemi AL, Fabunmi AA. Work-Related Musculoskeletal Disorders among
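
    The chi-square test of independence used in the analysis can be sketched for a 2x2 table (e.g. persistent pain vs. a binary risk factor). The counts in the example are made up for illustration; the paper's raw tables are not reproduced in the abstract.

```python
# Minimal sketch of Pearson's chi-square statistic for a 2x2 table
# [[a, b], [c, d]] (all counts assumed positive).

def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    # expected count for each cell = row_total * column_total / n
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

    With 1 degree of freedom, a statistic above 3.84 corresponds to p < 0.05; e.g. `chi_square_2x2(30, 10, 20, 40)` gives about 16.67, a clearly significant association in this hypothetical table.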

  17. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    OpenAIRE

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background Today, owing to developing communication technologies, computer games and other audio-visual media, as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents has raised public uncertainty about the possible harmful effects of these games. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. Methods Th...

  18. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  19. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity---the study of what can and can't be feasibly computed---has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrödinger equation. Finally, I'll discuss BosonSampling---a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer---as well as the role of computational complexity in the black hole information puzzle.

  20. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours).   ·         Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics ·         Emphasis on algorithmic advances that will allow re-application in other...

  1. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    International Nuclear Information System (INIS)

    Kim, Jae Duck; Kim, Seung Kug

    1990-01-01

    The purpose of this study was to estimate the diagnostic usefulness of a computer program for common periapical lesions. The author used a domestic personal computer, adapted an application program based on RF (Rapid File) to the purpose of this study, and then entered, as basic data, the information obtained through collection, analysis and classification of the clinical and radiological features of common periapical lesions. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of patients diagnosed or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. Next, the clinical and radiographic features of the 256 cases were entered into the RF program for diagnosis, and the diagnosis produced by the computer was compared with the hidden final diagnosis established by clinical and histopathological examination. The results were as follows: 1. For cysts, diagnosis through the computer program showed rather lower accuracy (80.22%) compared with the accuracy (90.1%) of the radiologists. 2. For granulomas, diagnosis through the computer program showed rather higher accuracy (75.7%) compared with the accuracy (70.3%) of the radiologists. 3. For periapical abscesses, the diagnostic accuracy was 88% for both. 4. The average diagnostic accuracy over the 256 cases through the computer program was rather lower (81.2%) compared with the accuracy (82.8%) of the radiologists. 5. The basic data applied to the radiographic diagnosis of common periapical lesions by computer were estimated to be useful.
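
    The accuracy bookkeeping in the results can be sketched as a case-weighted average over lesion types. The per-class accuracies below are those quoted for the computer program; the correct-case counts are derived from them for illustration, and the small discrepancy with the reported 81.2% average presumably reflects rounding of the per-class figures.

```python
# Sketch: per-lesion accuracy combined into a case-weighted overall
# accuracy over all 256 cases.

def weighted_accuracy(cases):
    """cases: list of (n_cases, accuracy) pairs, one per lesion type."""
    total = sum(n for n, _ in cases)
    correct = sum(n * acc for n, acc in cases)
    return correct / total

# cyst, granuloma, abscess (computer-program accuracies from the abstract)
program = [(91, 0.8022), (74, 0.757), (91, 0.88)]
```

    `weighted_accuracy(program)` gives about 0.817, close to the 81.2% reported for the program overall.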

  2. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Duck; Kim, Seung Kug [Dept. of Oral Radiology, College of Dentistry, Chosun University, Kwangju (Korea, Republic of)

    1990-08-15

    The purpose of this study was to estimate the diagnostic usefulness of a computer program for common periapical lesions. The author used a domestic personal computer, adapted an application program based on RF (Rapid File) to the purpose of this study, and then entered, as basic data, the information obtained through collection, analysis and classification of the clinical and radiological features of common periapical lesions. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of patients diagnosed or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. Next, the clinical and radiographic features of the 256 cases were entered into the RF program for diagnosis, and the diagnosis produced by the computer was compared with the hidden final diagnosis established by clinical and histopathological examination. The results were as follows: 1. For cysts, diagnosis through the computer program showed rather lower accuracy (80.22%) compared with the accuracy (90.1%) of the radiologists. 2. For granulomas, diagnosis through the computer program showed rather higher accuracy (75.7%) compared with the accuracy (70.3%) of the radiologists. 3. For periapical abscesses, the diagnostic accuracy was 88% for both. 4. The average diagnostic accuracy over the 256 cases through the computer program was rather lower (81.2%) compared with the accuracy (82.8%) of the radiologists. 5. The basic data applied to the radiographic diagnosis of common periapical lesions by computer were estimated to be useful.

  3. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Bae, Jun Ho; Park, Joo Hwan

    2010-01-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although recent progress in CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is as yet impossible to reflect the detailed shape of the rod bundle in a numerical computation because of the large number of computational meshes and the memory capacity required. Hence, previous studies conducted numerical computations for smooth channels without considering spacers and bearing pads. However, it is well known that these components are an important factor in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to handle complex geometry such as a fuel rod bundle. Before applying the method to the 37-rod bundle problem, its validity and accuracy are tested by applying it to a simple geometry. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work.

  4. High-Throughput Computing on High-Performance Platforms: A Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Matteo, Turilli [Rutgers University; Angius, Alessio [Rutgers University; Oral, H Sarp [ORNL; De, K [University of Texas at Arlington; Klimentov, A [Brookhaven National Laboratory (BNL); Wells, Jack C. [ORNL; Jha, S [Rutgers University

    2017-10-01

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan, a DOE leadership computing facility, in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next-generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons on how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  5. A computational study of the supersonic coherent jet

    International Nuclear Information System (INIS)

    Jeong, Mi Seon; Kim, Heuy Dong

    2003-01-01

    In the steel-making process of the iron and steel industry, the purity and quality of steel can depend on the amount of CO contained in the molten metal. Recently, the supersonic oxygen jet has been applied to the molten metal in the electric furnace, reducing the CO amount through chemical reactions between the oxygen jet and the molten metal and leading to a better quality of steel. In this application, the supersonic oxygen jet is limited in the distance over which the supersonic velocity is maintained. In order to obtain longer supersonic jet propagation into the molten metal, a supersonic coherent jet has been suggested as one of the alternatives applicable to the electric furnace system. It has a flame around the conventional supersonic jet, so that entrainment of the surrounding gas into the supersonic jet is reduced, leading to longer propagation of the supersonic jet. In this regard, the gasdynamic mechanism by which the combustion surrounding the supersonic jet lengthens the jet core is not yet clarified. The present study investigates the major characteristics of the supersonic coherent jet compared with the conventional supersonic jet. A computational study is carried out to solve the compressible, axisymmetric Navier-Stokes equations. The computational results for the supersonic coherent jet are compared with those for conventional supersonic jets.

  6. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    Science.gov (United States)

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  7. Individual versus Organizational Computer Security and Privacy Concerns in Journalism

    Directory of Open Access Journals (Sweden)

    McGregor Susan E.

    2016-10-01

    Full Text Available A free and open press is a critical piece of the civil-society infrastructure that supports both established and emerging democracies. However, as the professional activities of reporting and publishing are increasingly conducted by digital means, computer security and privacy risks threaten free and independent journalism around the globe. Through interviews with 15 practicing journalists and 14 organizational stakeholders (supervising editors and technologists), we reveal the distinct, and sometimes conflicting, computer security concerns and priorities of different stakeholder groups within journalistic institutions, as well as issues unique to journalism compared with other types of organizations. As these concerns have not been deeply studied by those designing computer security practices or technologies that may benefit journalism, this research offers insight into some of the practical and cultural constraints that can limit the computer security and privacy practices of the journalism community as a whole. Based on these findings, we suggest paths for future research and development that can bridge these gaps through new tools and practices.

  8. Real-time data-intensive computing

    Energy Technology Data Exchange (ETDEWEB)

    Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander; MacDowell, Alastair A.; Padmore, Howard A.; Shapiro, David; Tamura, Nobumichi [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Beattie, Keith; Krishnan, Harinarayan; Patton, Simon J.; Perciano, Talita; Stromsness, Rune; Tull, Craig E.; Ushizima, Daniela [Computational Research Division, Lawrence Berkeley National Laboratory Berkeley CA 94720 (United States); Correa, Joaquin; Deslippe, Jack R. [National Energy Research Scientific Computing Center, Berkeley, CA 94720 (United States); Dart, Eli; Tierney, Brian L. [Energy Sciences Network, Berkeley, CA 94720 (United States); Daurer, Benedikt J.; Maia, Filipe R. N. C. [Uppsala University, Uppsala (Sweden); and others

    2016-07-27

    Today users visit synchrotrons as sources of understanding and discovery—not as sources of just light, and not as sources of data. To achieve this, synchrotron facilities frequently provide not just light but often the entire end station and, increasingly, advanced computational facilities that can reduce terabytes of data into a form that reveals a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a “super-facility”, giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data—despite its high rate and volume—is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.

  9. Studies of electron collisions with polyatomic molecules using distributed-memory parallel computers

    International Nuclear Information System (INIS)

    Winstead, C.; Hipes, P.G.; Lima, M.A.P.; McKoy, V.

    1991-01-01

    Elastic electron scattering cross sections from 5-30 eV are reported for the molecules C2H4, C2H6, C3H8, Si2H6, and GeH4, obtained using an implementation of the Schwinger multichannel method for distributed-memory parallel computer architectures. These results, obtained within the static-exchange approximation, are in generally good agreement with the available experimental data. These calculations demonstrate the potential of highly parallel computation in the study of collisions between low-energy electrons and polyatomic gases. The computational methodology discussed is also directly applicable to the calculation of elastic cross sections at higher levels of approximation (target polarization) and of electronic excitation cross sections

  10. A Non-Linear Digital Computer Model Requiring Short Computation Time for Studies Concerning the Hydrodynamics of the BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, F; Vayssier, G

    1969-05-15

    This non-linear model serves as one of the blocks in a series of codes to study the transient behaviour of BWR or PWR type reactors. This program is intended to be the hydrodynamic part of the BWR core representation or the hydrodynamic part of the PWR heat exchanger secondary side representation. The equations have been prepared for the CSMP digital simulation language. By using the most suitable integration routine available, the ratio of simulation time to real time is about one on an IBM 360/75 digital computer. Use of the slightly different language DSL/40 on an IBM 7044 computer takes about four times longer. The code has been tested against the Eindhoven loop with satisfactory agreement.

  11. A Computational Study on the Relation between Resting Heart Rate and Atrial Fibrillation Hemodynamics under Exercise.

    Directory of Open Access Journals (Sweden)

    Matteo Anselmino

    Full Text Available Clinical data indicating a heart rate (HR) target during rate control therapy for permanent atrial fibrillation (AF) and assessing its eventual relationship with reduced exercise tolerance are lacking. The present study aims at investigating the impact of resting HR on the hemodynamic response to exercise in permanent AF patients by means of a computational cardiovascular model. The AF lumped-parameter model was run to simulate resting (1 Metabolic Equivalent of Task-MET) and various exercise conditions (4 METs: brisk walking; 6 METs: skiing; 8 METs: running), considering different resting HR (70 bpm for the slower resting HR-SHR-simulations, and 100 bpm for the higher resting HR-HHR-simulations). To compare relative variations of cardiovascular variables upon exertion, the variation comparative index (VCI)-the absolute variation between the exercise and the resting values in SHR simulations referred to the absolute variation in HHR simulations-was calculated at each exercise grade (VCI4, VCI6 and VCI8). Pulmonary venous pressure underwent a greater increase in HHR compared to SHR simulations (VCI4 = 0.71, VCI6 = 0.73 and VCI8 = 0.77), while for systemic arterial pressure the opposite is true (VCI4 = 1.15, VCI6 = 1.36, VCI8 = 1.56). The computational findings suggest that a slower, with respect to a higher, resting HR might be preferable in permanent AF patients, since during exercise pulmonary venous pressure undergoes a slighter increase and systemic blood pressure reveals a more appropriate increase.

  12. A Computational Study on the Relation between Resting Heart Rate and Atrial Fibrillation Hemodynamics under Exercise.

    Science.gov (United States)

    Anselmino, Matteo; Scarsoglio, Stefania; Saglietto, Andrea; Gaita, Fiorenzo; Ridolfi, Luca

    2017-01-01

    Clinical data indicating a heart rate (HR) target during rate control therapy for permanent atrial fibrillation (AF) and assessing its eventual relationship with reduced exercise tolerance are lacking. The present study aims at investigating the impact of resting HR on the hemodynamic response to exercise in permanent AF patients by means of a computational cardiovascular model. The AF lumped-parameter model was run to simulate resting (1 Metabolic Equivalent of Task-MET) and various exercise conditions (4 METs: brisk walking; 6 METs: skiing; 8 METs: running), considering different resting HR (70 bpm for the slower resting HR-SHR-simulations, and 100 bpm for the higher resting HR-HHR-simulations). To compare relative variations of cardiovascular variables upon exertion, the variation comparative index (VCI)-the absolute variation between the exercise and the resting values in SHR simulations referred to the absolute variation in HHR simulations-was calculated at each exercise grade (VCI4, VCI6 and VCI8). Pulmonary venous pressure underwent a greater increase in HHR compared to SHR simulations (VCI4 = 0.71, VCI6 = 0.73 and VCI8 = 0.77), while for systemic arterial pressure the opposite is true (VCI4 = 1.15, VCI6 = 1.36, VCI8 = 1.56). The computational findings suggest that a slower, with respect to a higher, resting HR might be preferable in permanent AF patients, since during exercise pulmonary venous pressure undergoes a slighter increase and systemic blood pressure reveals a more appropriate increase.
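
    The variation comparative index defined in the abstract reduces to a simple ratio of exercise-minus-rest variations. A minimal sketch, with illustrative pressure values rather than the study's simulated ones:

```python
# VCI = |exercise - rest| in the slower-resting-HR (SHR) simulation,
# divided by |exercise - rest| in the higher-resting-HR (HHR) simulation.

def vci(rest_shr, exercise_shr, rest_hhr, exercise_hhr):
    return abs(exercise_shr - rest_shr) / abs(exercise_hhr - rest_hhr)
```

    A VCI below 1 means the variable rises less when starting from a slower resting HR (as reported for pulmonary venous pressure), while a VCI above 1 means it rises more (as for systemic arterial pressure).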

  13. Gender and stereotypes in motivation to study computer programming for careers in multimedia

    Science.gov (United States)

    Doubé, Wendy; Lang, Catherine

    2012-03-01

    A multimedia university programme with relatively equal numbers of male and female students in elective programming subjects provided a rare opportunity to investigate female motivation to study and pursue computer programming in a career. The MSLQ was used to survey 85 participants. In common with research into deterrence of females from STEM domains, females displayed significantly lower self-efficacy and expectancy for success. In contrast to research into deterrence of females from STEM domains, both genders placed similar high values on computer programming and shared high extrinsic and intrinsic goal orientation. The authors propose that the stereotype associated with a creative multimedia career could attract female participation in computer programming whereas the stereotype associated with computer science could be a deterrent.

  14. Teaching programming to non-STEM novices: a didactical study of computational thinking and non-STEM computing education

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid

    research approach. Computational thinking plays a significant role in computing education but it is still unclear how it should be interpreted to best serve its purpose. Constructionism and Computational Making seems to be promising frameworks to do this. In regards to specific teaching activities...

  15. Robotics as an integration subject in the computer science university studies. The experience of the University of Almeria

    Directory of Open Access Journals (Sweden)

    Manuela Berenguel Soria

    2012-11-01

    Full Text Available This work presents a global view of the role of robotics in computer science studies, mainly in university degrees. The main motivation for using robotics in these studies involves the following: robotics permits putting many fundamental computer science topics into practice; it is a multidisciplinary area that allows any computer science student to round out their basic knowledge; it facilitates the practice and learning of basic competences of any engineer (for instance, teamwork); and there is a wide market looking for people with robotics knowledge. These ideas are discussed from our own experience at the University of Almeria, acquired through the studies of Computer Science Technical Engineering, Computer Science Engineering, the Computer Science Degree and the Computer Science Postgraduate programme.

  16. Preverbal and verbal counting and computation.

    Science.gov (United States)

    Gallistel, C R; Gelman, R

    1992-08-01

    We describe the preverbal system of counting and arithmetic reasoning revealed by experiments on numerical representations in animals. In this system, numerosities are represented by magnitudes, which are rapidly but inaccurately generated by the Meck and Church (1983) preverbal counting mechanism. We suggest the following. (1) The preverbal counting mechanism is the source of the implicit principles that guide the acquisition of verbal counting. (2) The preverbal system of arithmetic computation provides the framework for the assimilation of the verbal system. (3) Learning to count involves, in part, learning a mapping from the preverbal numerical magnitudes to the verbal and written number symbols and the inverse mappings from these symbols to the preverbal magnitudes. (4) Subitizing is the use of the preverbal counting process and the mapping from the resulting magnitudes to number words in order to generate rapidly the number words for small numerosities. (5) The retrieval of the number facts, which plays a central role in verbal computation, is mediated via the inverse mappings from verbal and written numbers to the preverbal magnitudes and the use of these magnitudes to find the appropriate cells in tabular arrangements of the answers. (6) This model of the fact retrieval process accounts for the salient features of the reaction time differences and error patterns revealed by experiments on mental arithmetic. (7) The application of verbal and written computational algorithms goes on in parallel with, and is to some extent guided by, preverbal computations, both in the child and in the adult.
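
    The accumulator mechanism and the magnitude-to-word mapping described above can be illustrated with a toy simulation. This is a sketch under stated assumptions, not a fitted model: each counted item adds a pulse of noisy unit magnitude, so the resulting magnitude is rapid but inexact, and small magnitudes map to the nearest number word (subitizing). The noise level and word list are illustrative.

```python
import random

def preverbal_count(n, noise=0.15, rng=random):
    """Return a noisy magnitude representing the numerosity n.

    Variability grows with n (scalar variability), as in the
    Meck-and-Church-style accumulator.
    """
    return sum(rng.gauss(1.0, noise) for _ in range(n))

def to_number_word(magnitude, number_words=("one", "two", "three", "four")):
    """Map a magnitude to the nearest small number word (subitizing)."""
    index = min(range(len(number_words)),
                key=lambda i: abs(magnitude - (i + 1)))
    return number_words[index]
```

    Averaged over many trials the magnitude is centered on the true numerosity, but any single trial can miss, which is the inaccuracy the model attributes to preverbal counting.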

  17. Computational Studies of Snake Venom Toxins.

    Science.gov (United States)

    Ojeda, Paola G; Ramírez, David; Alzate-Morales, Jans; Caballero, Julio; Kaas, Quentin; González, Wendy

    2017-12-22

    Most snake venom toxins are proteins, and participate in envenomation through a diverse array of bioactivities, such as bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most of the snake venom toxins are still uncharacterized. Modern bioinformatics tools have been recently developed to mine snake venoms, helping focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets, and the binding mode to these targets. This review gives an overview of current knowledge on the ~2200 sequences, and more than 400 three-dimensional structures, of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics have been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin.

  18. Computational Studies of Snake Venom Toxins

    Directory of Open Access Journals (Sweden)

    Paola G. Ojeda

    2017-12-01

    Full Text Available Most snake venom toxins are proteins, and participate in envenomation through a diverse array of bioactivities, such as bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most of the snake venom toxins are still uncharacterized. Modern bioinformatics tools have been recently developed to mine snake venoms, helping focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets, and the binding mode to these targets. This review gives an overview of current knowledge on the ~2200 sequences, and more than 400 three-dimensional structures, of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics have been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin.

  19. Revealing Soil Structure and Functional Macroporosity along a Clay Gradient Using X-ray Computed Tomography

    DEFF Research Database (Denmark)

    Naveed, Muhammad; Møldrup, Per; Arthur, Emmanuel

    2013-01-01

    The influence of clay content on soil-pore structure development and the relative importance of macroporosity in governing convective fluid flow are two key challenges toward better understanding and quantifying soil ecosystem functions. In this study, soil physical measurements (soil-water retention and air permeability) and x-ray computed tomography (CT) scanning were combined and used at two scales on intact soil columns (100 and 580 cm3). The columns were sampled along a natural clay gradient at six locations (L1, L2, L3, L4, L5 and L6, with 0.11, 0.16, 0.21, 0.32, 0.38 and 0.46 kg kg−1 clay content, respectively) at a field site in Lerbjerg, Denmark. The water-holding capacity of the soils markedly increased with increasing soil clay content, while significantly higher air permeability was observed for the L1 to L3 soils than for the L4 to L6 soils. Higher air permeability values...

  20. Computational study of scattering of a zero-order Bessel beam by large nonspherical homogeneous particles with the multilevel fast multipole algorithm

    Science.gov (United States)

    Yang, Minglin; Wu, Yueqian; Sheng, Xinqing; Ren, Kuan Fang

    2017-12-01

    Computation of the scattering of shaped beams by large nonspherical particles is a challenge in both the optics and electromagnetics domains, since it concerns many research fields. In this paper, we report our new progress in the numerical computation of scattering diagrams. Our algorithm can calculate the scattering of a particle as large as 110 wavelengths, or 700 in size parameter. The particle can be transparent or absorbing, of arbitrary shape, smooth or with a sharp surface, such as Chebyshev particles or ice crystals. To illustrate the capacity of the algorithm, a zero-order Bessel beam is taken as the incident beam, and the scattering of ellipsoidal and Chebyshev particles is taken as an example. Some special phenomena have been revealed and examined. The scattering problem is formulated with the combined tangential formulation and solved iteratively with the aid of the multilevel fast multipole algorithm, which is well parallelized with the message passing interface on a distributed-memory computer platform using a hybrid partitioning strategy. The numerical predictions are compared with the results of a rigorous method for a spherical particle to validate the accuracy of the approach. The scattering diagrams of large ellipsoidal particles with various parameters are examined. The effects of the aspect ratio, the half-cone angle of the incident zero-order Bessel beam, and the off-axis distance on the scattered intensity are studied. Scattering by an asymmetric Chebyshev particle with size parameter larger than 700 is also presented to show the capability of the method for computing scattering by arbitrarily shaped particles.
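
    For reference, the scalar field of an ideal zero-order Bessel beam propagating along z is commonly written as (a textbook form, with α the half-cone angle and k the wavenumber mentioned in the abstract):

```latex
E(\rho, z) = E_0 \, J_0\!\left(k \rho \sin\alpha\right) e^{\,i k z \cos\alpha}
```

    where J0 is the zeroth-order Bessel function of the first kind; the transverse intensity profile is propagation-invariant, which is why such beams are attractive incident fields for scattering studies.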

  1. Study of space-charge effect by computer

    International Nuclear Information System (INIS)

    Sasaki, T.

    1982-01-01

    The space-charge effect in high-density electron beams (beam current approx. 2 μA) focused by a uniform magnetic field is studied computationally. Using an approximation of the averaged space-charge force, a theory of the trajectory displacements of beam electrons is developed. The theory shows that the effect of the averaged space-charge force appears as a stretching of the focal length. The theory is confirmed not only qualitatively but also quantitatively by simulations. Empirical formulas for the trajectory displacement and the energy spread are presented. A comparison between the empirical formulas and some theoretical formulas is made, leading to a severe criticism of the theories of energy spreads
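
    For orientation, the radial space-charge field inside a uniform cylindrical beam of current I, radius a and electron velocity v is the standard textbook result (the paper's averaged-force model may differ in detail):

```latex
E_r(r) = \frac{I \, r}{2 \pi \varepsilon_0 a^{2} v}, \qquad r \le a
```

    The defocusing force grows linearly with r inside the beam, which is consistent with its mean effect being absorbable into an effective (stretched) focal length.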

  2. A computer simulation of Auger electron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Ragheb, M S; Bakr, M H.S. [Dept. Of Accellerators and Ion Sources, Division of Basic Nuclear Sciences, NRC, Atomic Energy Authority, (Egypt)

    1997-12-31

    A simulation study of Auger electron spectroscopy was performed to reveal how far the dependency between the different parameters governing the experimental behavior affects the peaks. The experimental procedure followed in the AC modulation technique was reproduced by means of a computer program. It generates the assumed output Auger electron peaks, exposes them to a retarding AC-modulated field, and collects the resulting modulated signals. The program reproduces the lock-in treatment in order to demodulate the signals, revealing the Auger peaks. It analyzes the spectrum obtained, giving the peak positions and energies. Comparison between the simulation results and the experimental data showed good agreement. The peaks of the spectrum obtained depend upon the amplitude, frequency and resolution of the applied modulation signal. The peak shape is affected by the rise time, the slope and the starting potential of the retarding field. 4 figs.

  3. Case Studies of Liberal Arts Computer Science Programs

    Science.gov (United States)

    Baldwin, D.; Brady, A.; Danyluk, A.; Adams, J.; Lawrence, A.

    2010-01-01

    Many undergraduate liberal arts institutions offer computer science majors. This article illustrates how quality computer science programs can be realized in a wide variety of liberal arts settings by describing and contrasting the actual programs at five liberal arts colleges: Williams College, Kalamazoo College, the State University of New York…

  4. Thermodynamic study of 2-aminothiazole and 2-aminobenzothiazole: Experimental and computational approaches

    International Nuclear Information System (INIS)

    Silva, Ana L.R.; Monte, Manuel J.S.; Morais, Victor M.F.; Ribeiro da Silva, Maria D.M.C.

    2014-01-01

    Highlights: • Combustion of 2-aminothiazole and 2-aminobenzothiazole by rotating bomb calorimetry. • Enthalpies of sublimation of 2-aminothiazole and 2-aminobenzothiazole. • Gaseous enthalpies of formation of 2-aminothiazole and 2-aminobenzothiazole. • Gaseous enthalpies of formation calculated from high-level MO calculations. • Gas-phase enthalpies of formation estimated from G3(MP2)//B3LYP approach. - Abstract: This work reports an experimental and computational thermochemical study of two aminothiazole derivatives, namely 2-aminothiazole and 2-aminobenzothiazole. The standard (p° = 0.1 MPa) molar energies of combustion of these compounds were measured by rotating bomb combustion calorimetry. The standard molar enthalpies of sublimation, at T = 298.15 K, were derived from the temperature dependence of the vapor pressures of these compounds, measured by the Knudsen-effusion technique, and from high-temperature Calvet microcalorimetry. The combination of these experimental results enabled the calculation of the standard molar enthalpies of formation in the gaseous state, at T = 298.15 K, for the compounds studied. The corresponding standard Gibbs free energies of formation in the crystalline and gaseous phases were also derived, allowing the analysis of their stability in these phases. We have also estimated the gas-phase enthalpies of formation from high-level molecular orbital calculations at the G3(MP2)//B3LYP level of theory, the estimates revealing very good agreement with the experimental values. The importance of some stabilizing electronic interactions occurring in the title molecules has been studied and quantitatively evaluated through Natural Bond Orbital (NBO) analysis of the corresponding wavefunctions, and their Nucleus-Independent Chemical Shift (NICS) parameters have been calculated in order to rationalize the effect of electronic delocalization upon stability.

  5. Parallel computing simulation of fluid flow in the unsaturated zone of Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Bodvarsson, G.S.

    2001-01-01

    This paper presents the application of parallel computing techniques to large-scale modeling of fluid flow in the unsaturated zone (UZ) at Yucca Mountain, Nevada. In this study, parallel computing techniques, as implemented in the TOUGH2 code, are applied in large-scale numerical simulations on a distributed-memory parallel computer. The modeling study has been conducted using a three-dimensional numerical model of over one million cells, which incorporates a wide variety of field data for the highly heterogeneous fractured formation at Yucca Mountain. The objective of this study is to analyze the impact of various surface infiltration scenarios (under current and possible future climates) on flow through the UZ system, using various hydrogeological conceptual models with refined grids. The results indicate that the one-million-cell models produce better-resolved results and reveal some flow patterns that cannot be obtained using coarse-grid models.

  6. Computed tomography in hepatic echinococcosis

    International Nuclear Information System (INIS)

    Choliz, J.D.; Olaverri, F.J.L.; Casas, T.F.; Zubieta, S.O.

    1982-01-01

    Computed tomography (CT) was used to evaluate 50 cases of hydatid disease of the liver. The findings were definite in 49 cases and negative in one. Pre- and postcontrast scans were performed. CT can reveal the exact location and extent of the cysts and possible complications. However, the one false-negative occurred in a hydatid cyst located in a fatty liver.

  7. Simulation study on the operating characteristics of the heat pipe for combined evaporative cooling of computer room air-conditioning system

    International Nuclear Information System (INIS)

    Han, Zongwei; Zhang, Yanqing; Meng, Xin; Liu, Qiankun; Li, Weiliang; Han, Yu; Zhang, Yanhong

    2016-01-01

    In order to improve the energy efficiency of air conditioning systems in computer rooms, this paper proposed a new concept of integrating an evaporative cooling air-conditioning system with heat pipes. Based on a computer room in Shenyang, China, a mathematical model was built to perform transient simulations of the new system. The annual dynamic performance of the new system was then compared with a typical conventional computer room air-conditioning system. The results showed that the new integrated air-conditioning system had better energy efficiency, i.e. a 31.31% reduction in energy consumption and a 29.49% increase in COP (coefficient of performance), due to the adoption of the evaporative condenser and the separate-type heat pipe technology. Further study also revealed that the incorporated heat pipes enabled a 36.88% decrease in the operation duration of the vapor compressor and a 53.86% reduction in the number of compressor activations, which could lead to a longer compressor lifespan. The new integrated evaporative cooling air-conditioning system was also tested in different climate regions. The energy saving of the new system was greatly affected by climate: it had the best effect in cold and dry regions like Shenyang, with up to 31.31% energy saving, while in warm and humid regions like Guangzhou, energy savings of up to 13.66% could be achieved. - Highlights: • A novel combined air-conditioning system for computer rooms is constructed. • The performance of the new system and a conventional system is simulated and compared. • The applicability of the system in different climate regions is investigated.

  8. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  9. Wearable computing: Will it make people prosocial?

    Science.gov (United States)

    Nasiopoulos, Eleni; Risko, Evan F; Foulsham, Tom; Kingstone, Alan

    2015-05-01

    We recently reported that people who wear an eye tracker modify their natural looking behaviour in a prosocial manner. This change in looking behaviour represents a potential concern for researchers who wish to use eye trackers to understand the functioning of human attention. On the other hand, it may offer a real boon to manufacturers and consumers of wearable computing (e.g., Google Glass), for if wearable computing causes people to behave in a prosocial manner, then the public's fear that people with wearable computing will invade their privacy is unfounded. Critically, both of these divergent implications are grounded on the assumption that the prosocial behavioural effect of wearing an eye tracker is sustained for a prolonged period of time. Our study reveals that on the very first wearing of an eye tracker, and in less than 10 min, the prosocial effect of an eye tracker is abolished, but by drawing attention back to the eye tracker, the implied presence effect is easily reactivated. This suggests that eye trackers induce a transient social presence effect, which is rendered dormant when attention is shifted away from the source of implied presence. This is good news for researchers who use eye trackers to measure attention and behaviour; and could be bad news for advocates of wearable computing in everyday life. © 2014 The British Psychological Society.

  10. Inhibition of thrombin by functionalized C60 nanoparticles revealed via in vitro assays and in silico studies.

    Science.gov (United States)

    Liu, Yanyan; Fu, Jianjie; Pan, Wenxiao; Xue, Qiao; Liu, Xian; Zhang, Aiqian

    2018-01-01

    Studies of the human toxicity of nanoparticles (NPs) lag far behind the rapid development of engineered functionalized NPs. Fullerene has been widely used as a drug-carrier skeleton due to its reported low risk. However, unlike other kinds of NPs, fullerene-based NPs (C60 NPs) have been found to have an anticoagulation effect, although the potential target is still unknown. In this study, both experimental and computational methods were adopted to gain mechanistic insight into the modulation of thrombin activity by nine kinds of C60 NPs with diverse surface chemistry. In vitro enzyme activity assays showed that all tested surface-modified C60 NPs exhibited thrombin-inhibition ability. Kinetic studies coupled with competitive testing using three known inhibitors indicated that six of the C60 NPs, those of greater hydrophobicity and hydrogen bond (HB) donor acidity or acceptor basicity, acted as competitive inhibitors of thrombin by directly interacting with its active site. A simple quantitative nanostructure-activity relationship model relating the surface substituent properties to the inhibition potential was then established for the six competitive inhibitors. Molecular docking analysis revealed that intermolecular HB interactions were important for the specific binding of C60 NPs to the active-site canyon, while the additional stability provided by the surface groups through van der Waals interactions also played a key role in the thrombin-binding affinity of the NPs. Our results suggest that thrombin is a possible target of surface-functionalized C60 NPs relevant to their anticoagulation effect. Copyright © 2017. Published by Elsevier B.V.
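
    At its core, a quantitative nanostructure-activity relationship of the kind described is a regression from surface-descriptor values to inhibition potency. A minimal one-descriptor least-squares sketch follows; the descriptor and potency values are invented stand-ins, not the paper's data:

```python
# Ordinary least squares for: potency = a * descriptor + b.
# The numbers below are hypothetical illustration data only.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]   # hypothetical hydrophobicity descriptor
ys = [1.1, 2.0, 3.1, 3.9, 5.2, 5.8]   # hypothetical inhibition potency

n = len(xs)
mx = sum(xs) / n                       # mean of descriptor values
my = sum(ys) / n                       # mean of potency values

# Slope from centered cross-products, intercept from the means.
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx
```

    With more descriptors (e.g. HB donor acidity, acceptor basicity), the same idea extends to multiple linear regression; the fitted coefficients then indicate which surface properties drive inhibition.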

  11. Using Robotics and Game Design to Enhance Children's Self-Efficacy, STEM Attitudes, and Computational Thinking Skills

    Science.gov (United States)

    Leonard, Jacqueline; Buss, Alan; Gamboa, Ruben; Mitchell, Monica; Fashola, Olatokunbo S.; Hubert, Tarcia; Almughyirah, Sultan

    2016-01-01

    This paper describes the findings of a pilot study that used robotics and game design to develop middle school students' computational thinking strategies. One hundred and twenty-four students engaged in LEGO® EV3 robotics and created games using Scalable Game Design software. The results of the study revealed students' pre-post self-efficacy…

  12. Computed tomography study of Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Arai, H; Kobayashi, K; Ikeda, Y; Nagao, Y; Ogihara, R; Kosaka, K

    1983-01-01

    Computed tomography (CT) was used to study cerebral atrophy in 18 patients with clinically diagnosed Alzheimer's disease of presenile type and in 14 healthy age-matched subjects as controls. Using the computerized planimetric method, Subarachnoid Space Volume Index and Ventricle Volume Index were calculated as the measure of cortical atrophy and ventricular dilatation respectively. From the results the following conclusions were drawn: 1. The cerebral atrophy in Alzheimer patients could be attributable to the disease processes rather than to physiological aging of the brain. 2. The degree of atrophy increases in parallel with the progress of the clinical stage, and the cortical atrophy is already apparent at an early stage, whereas the ventricular dilatation becomes pronounced at later stages. 3. CT could be one of the most useful clinical tests available for the diagnosis of Alzheimer's disease.

  13. Degeneration in dysplastic hips. A computer tomography study

    DEFF Research Database (Denmark)

    Jacobsen, Steffen; Rømer, Lone; Søballe, Kjeld

    2005-01-01

    BACKGROUND: Hip dysplasia is considered pre-osteoarthritic, causing degeneration in young individuals. OBJECTIVE: To determine the pattern of degenerative change in moderately to severely dysplastic hips in young patients. DESIGN AND PATIENTS: One hundred and ninety-three consecutively referred younger patients with hip pain believed to be caused by hip dysplasia constituted the study cohort. The average age was 35.5 years (range, 15-61 years). They were examined by close-cut transverse pelvic and knee computed tomography (CT) and antero-posterior radiographs. We identified 197 hips...

  14. Computational Study of Stratified Combustion in an Optical Diesel Engine

    KAUST Repository

    Jaasim, Mohammed

    2017-03-28

    Full-cycle simulations of the KAUST optical diesel engine were conducted in order to provide insights into the details of fuel spray, mixing, and combustion characteristics at different start of injection (SOI) conditions. Although optical diagnostics provide valuable information, high-fidelity simulations with matched parametric conditions improve fundamental understanding of the relevant physical and chemical processes by accessing additional observables such as the local mixture distribution, intermediate species concentrations, and detailed chemical reaction rates. The commercial software CONVERGE™ was used as the main simulation tool, with the Reynolds-averaged Navier-Stokes (RANS) turbulence model and the multi-zone (SAGE) combustion model to compute the chemical reaction terms. SOI is varied from late compression ignition (CI) to early partially premixed combustion (PPC) conditions. The simulation results revealed a stronger correlation between fuel injection timing and combustion phasing for late SOI conditions, whereas the combustion phasing starts to decouple from SOI for early SOI cases. The predictions are consistent with the experimental observations in terms of the overall trends in combustion and emission characteristics, while the high-fidelity simulations provided further insights into the effects of the mixture stratifications resulting from different SOI conditions.

  15. A study on measurement of scattered rays in computed tomography

    International Nuclear Information System (INIS)

    Cho, Pyong Kon; Lee, Joon Hyup; Kim, Yoon Sik; Lee, Chang Yeop

    2003-01-01

    Computed tomographic (CT) equipment is essential for radiological diagnosis. With the passage of time and the development of science, CT equipment has been improved repeatedly, and examinations using it are expected to increase. The authors therefore measured the rate of scattered-ray generation in front of the lead glass of the control room of the CT suite and outside the entrance door used by patients, and attempted to find a method for minimizing scattered-ray exposure. From November 2001, twenty-five CT units installed and operated by 13 general hospitals and university hospitals in Seoul were included in this study. The exposure conditions recommended by the manufacturers were used when measuring scattered rays. As objects, the DALI CT Radiation Dose Test Phantom for Head (φ 16 cm Plexiglas) and the phantom for stomach (φ 32 cm Plexiglas) were used. For the measurement of scattered rays, a reader (Radiation Monitor Controller Model 2026) and a G-M survey meter (Radical Corporation, model 20 x 5-1800, Electrometer/Ion Chamber, S/N 21740) were used. The measurement points included the front of the lead glass within the control room, where most of the radiographic personnel's work is carried out, the outside of the entrance door used by patients and their guardians, and a point 100 cm from the isocenter at the time of scanning the object. The working environment of each CT room differed considerably depending on the circumstances of the hospital, and the status of the scattered rays was as follows. 1) From the isocenter of the CT unit to the lead glass of the control room the average distance was 377 cm. At that point the scattered rays showed diverse...

  16. A case study on support for students' thinking through computer-mediated communication.

    Science.gov (United States)

    Sannomiya, M; Kawaguchi, A

    2000-08-01

    This is a case study of support for thinking through computer-mediated communication. Two graduate students were supervised in their research using computer-mediated communication, which was asynchronous and written; the supervisor was not physically present. The students' reports pointed out that there was more planning and editing, and lower interactivity, in this approach relative to face-to-face communication. These attributes were confirmed by their supervisor's report. The students also suggested that face-to-face communication was effective in supporting the production stage of thinking in research, while computer-mediated communication was effective in supporting the examination of thinking. For distance education to be successful, an appropriate combination of communication media must consider students' thinking stages. Finally, transient and permanent effects should be discriminated in computer-mediated communication.

  17. Computational study of duct and pipe flows using the method of pseudocompressibility

    Science.gov (United States)

    Williams, Robert W.

    1991-01-01

    A viscous, three-dimensional, incompressible Navier-Stokes computational fluid dynamics code employing pseudocompressibility is used for the prediction of laminar primary and secondary flows in two 90-degree bends of constant cross section. Under study are a square cross section duct bend with a radius ratio of 2.3 and a round cross section pipe bend with a radius ratio of 2.8. The sensitivity of the predicted primary and secondary flows to inlet boundary conditions, grid resolution, and code convergence is investigated. Contour plots and plots of velocity versus spanwise coordinate, comparing predicted flow components to experimental data, are shown at several streamwise stations before, within, and after the duct and pipe bends. Discussion includes secondary flow physics, the computational method, computational requirements, grid dependence, and convergence rates.
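
    In the pseudocompressibility (artificial compressibility) approach, a pressure pseudo-time derivative is added to the continuity equation so that the incompressible system becomes hyperbolic and can be marched in pseudo-time; a standard statement of the modified continuity equation is:

```latex
\frac{\partial p}{\partial \tau} + \beta \, \nabla \cdot \mathbf{u} = 0
```

    where β is the pseudocompressibility parameter and τ the pseudo-time; at pseudo-steady state the divergence-free condition ∇·u = 0 is recovered. (This is the generic form of the method; the cited code's exact discretization is not given in the abstract.)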

  18. Modeling an Excitable Biosynthetic Tissue with Inherent Variability for Paired Computational-Experimental Studies.

    Directory of Open Access Journals (Sweden)

    Tanmay A Gokhale

    2017-01-01

    Full Text Available To understand how excitable tissues give rise to arrhythmias, it is crucial to understand the electrical dynamics of cells in the context of their environment. Multicellular monolayer cultures have proven useful for investigating arrhythmias and other conduction anomalies, and because of their relatively simple structure, these constructs lend themselves to paired computational studies that often help elucidate mechanisms of the observed behavior. However, tissue cultures of cardiomyocyte monolayers currently require the use of neonatal cells with ionic properties that change rapidly during development and have thus been poorly characterized and modeled to date. Recently, Kirkton and Bursac demonstrated the ability to create biosynthetic excitable tissues from genetically engineered and immortalized HEK293 cells with well-characterized electrical properties and the ability to propagate action potentials. In this study, we developed and validated a computational model of these excitable HEK293 cells (called "Ex293" cells) using existing electrophysiological data and a genetic search algorithm. In order to reproduce not only the mean but also the variability of experimental observations, we examined what sources of variation were required in the computational model. Random cell-to-cell and inter-monolayer variation in both ionic conductances and tissue conductivity was necessary to explain the experimentally observed variability in action potential shape and macroscopic conduction, and the spatial organization of cell-to-cell conductance variation was found not to impact macroscopic behavior; the resulting model accurately reproduces both normal and drug-modified conduction behavior. The development of a computational Ex293 cell and tissue model provides a novel framework to perform paired computational-experimental studies to study normal and abnormal conduction in multidimensional excitable tissue, and the methodology of modeling
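
    Genetic-search parameter fitting of the kind mentioned above can be illustrated with a minimal sketch. Everything here is hypothetical: the stand-in `apd_model` function and target value are invented for illustration and are not the Ex293 electrophysiology model or its data.

```python
import random

# Hypothetical stand-in "model": maps two conductance-like parameters to a
# scalar observable (say, an action-potential duration in ms).
def apd_model(g_na, g_k):
    return 100.0 * g_na / (g_k + 1.0)  # placeholder relationship, not real biophysics

TARGET_APD = 250.0  # invented experimental target

def fitness(individual):
    g_na, g_k = individual
    return -abs(apd_model(g_na, g_k) - TARGET_APD)  # higher is better

def evolve(pop_size=50, generations=100, seed=1):
    rng = random.Random(seed)
    # Random initial population of parameter pairs.
    pop = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            mid = [(x + y) / 2 for x, y in zip(a, b)]  # arithmetic crossover
            child = tuple(max(0.0, g + rng.gauss(0, 0.2)) for g in mid)  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    Real fitting of an ionic model works the same way in outline, but the fitness function compares simulated action-potential traces against recordings rather than a single scalar.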

  19. Attitudes of Jordanian Undergraduate Students towards Using Computer Assisted Language Learning (CALL)

    Directory of Open Access Journals (Sweden)

    Farah Jamal Abed Alrazeq Saeed

    2018-01-01

    Full Text Available The study aimed at investigating the attitudes of Jordanian undergraduate students towards using computer-assisted language learning (CALL) and its effectiveness in the process of learning the English language. In order to fulfill the study's objective, the researchers used a questionnaire to collect data, followed up with semi-structured interviews to investigate the students' beliefs about CALL. Twenty-one Jordanian BA students majoring in English language and literature were selected by simple random sampling. The results revealed positive attitudes towards CALL in facilitating the process of writing assignments and gaining information; making learning enjoyable; and improving the students' creativity, productivity, academic achievement, critical thinking skills, and knowledge of vocabulary, grammar, and culture. Furthermore, the students believed that computers can motivate them to learn the English language and help them communicate and interact with their teachers and colleagues. The researchers recommended conducting further research on the same topic, taking into consideration the variables of age, gender, experience in using computers, and computer skills.

  20. APPLICATIONS OF CLOUD COMPUTING SERVICES IN EDUCATION – CASE STUDY

    Directory of Open Access Journals (Sweden)

    Tomasz Cieplak

    2014-11-01

    Full Text Available Applications of Cloud Computing in enterprises are very wide-ranging. By contrast, educational applications of Cloud Computing in Poland are somewhat limited. On the other hand, young people use Cloud Computing services frequently: the use of Facebook, Google and other services by young people in Poland is almost the same as in Western Europe or the USA. Taking these considerations into account, a few years ago the authors began popularizing and using Cloud Computing educational services in their professional work. This article briefly summarizes the authors' experience with selected and most popular Cloud Computing services.

  1. Non-infectious complications of continuous ambulatory peritoneal dialysis: evaluation with peritoneal computed tomography

    International Nuclear Information System (INIS)

    Camsari, T.; Celik, A.; Ozaksoy, D.; Salman, S.; Cavdar, C.; Sifil, A.

    1998-01-01

    The purpose of the study was to evaluate the non-infectious complications of continuous ambulatory peritoneal dialysis (CAPD) using peritoneal computed tomography (PCT). Twenty symptomatic patients were included in the study. Initially, 2000 ml of dialysate fluid was infused into the peritoneal cavity and standard peritoneal computed tomography (SPCT) serial scans of 10 mm thickness were performed from the mid-thoracic region to the genital organs. Afterwards, 100 ml of non-ionic contrast material containing 300 mg/ml iodine was injected through the catheter and distributed homogeneously in the intra-abdominal dialysate fluid by changing the positions of the patients; after waiting 2-4 h, the CT scan was repeated as peritoneal contrast computed tomography (PCCT). Both SPCT and PCCT revealed pathological findings in 90 % of the patients (n = 18), but PCCT showed additional pathological findings in 60 % (n = 12). We believe that PCT is beneficial for the evaluation of non-infectious complications of CAPD, but PCCT is superior to SPCT in evaluating the non-infectious complications encountered in patients on CAPD treatment. (author)

  2. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.
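
    The logistic-regression analysis of persistence described above can be sketched on synthetic data. The variable names and the simulated relationship below are invented for illustration; the study's actual dataset is not reproduced here.

```python
import math
import random

# Synthetic data: persistence (1 = took another CS course) as a function of a
# single "felt challenged and skilled" engagement score. Invented for illustration.
rng = random.Random(0)
data = []
for _ in range(400):
    engagement = rng.uniform(0, 1)
    p_true = 1 / (1 + math.exp(-(4 * engagement - 2)))  # assumed true log-odds
    data.append((engagement, 1 if rng.random() < p_true else 0))

# Fit logistic regression logit(p) = b0 + b1 * x by batch gradient descent.
b0, b1 = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    g0 = g1 = 0.0
    for x, y in data:
        pred = 1 / (1 + math.exp(-(b0 + b1 * x)))
        g0 += pred - y          # gradient of negative log-likelihood w.r.t. b0
        g1 += (pred - y) * x    # ... and w.r.t. b1
    b0 -= lr * g0 / len(data)
    b1 -= lr * g1 / len(data)

# A positive b1 means higher engagement predicts higher odds of persisting,
# mirroring the direction of the finding reported in the abstract.
```

    In practice one would use a statistics package and include covariates (gender, institution, prior coursework); the pure-Python fit above just makes the estimation step concrete.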

  3. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research. PMID:28487664

  4. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Directory of Open Access Journals (Sweden)

    Carolina Milesi

    2017-04-01

    Full Text Available While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  5. Body dynamics and hydrodynamics of swimming larvae: a computational study

    NARCIS (Netherlands)

    Li, G.; Müller, U.K.; Leeuwen, van J.L.; Liu, H.

    2012-01-01

    To understand the mechanics of fish swimming, we need to know the forces exerted by the fluid and how these forces affect the motion of the fish. To this end, we developed a 3-D computational approach that integrates hydrodynamics and body dynamics. This study quantifies the flow around a swimming

  6. Mechanical unfolding reveals stable 3-helix intermediates in talin and α-catenin.

    Directory of Open Access Journals (Sweden)

    Vasyl V Mykuliak

    2018-04-01

    Full Text Available Mechanical stability is a key feature in the regulation of structural scaffolding proteins and their functions. Despite the abundance of α-helical structures among the human proteome and their undisputed importance in health and disease, the fundamental principles of their behavior under mechanical load are poorly understood. Talin and α-catenin are two key molecules in focal adhesions and adherens junctions, respectively. In this study, we used a combination of atomistic steered molecular dynamics (SMD) simulations, polyprotein engineering, and single-molecule atomic force microscopy (smAFM) to investigate unfolding of these proteins. SMD simulations revealed that talin rod α-helix bundles as well as α-catenin α-helix domains unfold through stable 3-helix intermediates. While the 5-helix bundles were found to be mechanically stable, a second stable conformation corresponding to the 3-helix state was revealed. Mechanically weaker 4-helix bundles easily unfolded into a stable 3-helix conformation. The results of smAFM experiments were in agreement with the findings of the computational simulations. The disulfide clamp mutants, designed to protect the stable state, support the 3-helix intermediate model in both experimental and computational setups. As a result, multiple discrete unfolding intermediate states in the talin and α-catenin unfolding pathway were discovered. Better understanding of the mechanical unfolding mechanism of α-helix proteins is a key step towards comprehensive models describing the mechanoregulation of proteins.

  7. A comparative study of attenuation correction algorithms in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Murase, Kenya; Itoh, Hisao; Mogami, Hiroshi; Ishine, Masashiro; Kawamura, Masashi; Iio, Atsushi; Hamamoto, Ken

    1987-01-01

    A computer-based simulation method was developed to assess the relative effectiveness and availability of various attenuation compensation algorithms in single photon emission computed tomography (SPECT). The effects of nonuniformity in the attenuation coefficient distribution of the body, of errors in determining the body contour, and of statistical noise on reconstruction accuracy, as well as the computation time of the algorithms, were studied. The algorithms were classified into three groups: precorrection, post-correction and iterative correction methods. Furthermore, a hybrid method was devised by combining several methods. This study will be useful for understanding the characteristics, limitations and strengths of the algorithms and for searching for a practical correction method for photon attenuation in SPECT. (orig.)
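
    Post-correction schemes of the kind compared here include the classic first-order Chang method, which divides each reconstructed pixel by its attenuation factor averaged over all projection angles. A minimal sketch for a uniform circular attenuator (the radius, attenuation coefficient, and angle count are invented; this illustrates the idea, not necessarily one of the algorithms this paper compared):

```python
import math

def chang_correction(x, y, radius=10.0, mu=0.15, n_angles=64):
    """First-order Chang correction factor at point (x, y) inside a
    uniform circular attenuator: the reciprocal of the attenuation
    factor exp(-mu * l) averaged over rays to the boundary."""
    total = 0.0
    for k in range(n_angles):
        theta = 2 * math.pi * k / n_angles
        dx, dy = math.cos(theta), math.sin(theta)
        # distance from (x, y) to the circle boundary along direction theta
        b = x * dx + y * dy
        l = -b + math.sqrt(b * b + radius * radius - x * x - y * y)
        total += math.exp(-mu * l)
    return n_angles / total
```

    At the center every ray traverses a full radius of attenuator, so the correction factor is largest there and falls off toward the edge, which is the behavior a post-correction method must reproduce.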

  8. Self-study manual for introduction to computational fluid dynamics

    OpenAIRE

    Nabatov, Andrey

    2017-01-01

    Computational Fluid Dynamics (CFD) is the branch of Fluid Mechanics and Computational Physics that plays a decent role in modern Mechanical Engineering Design process due to such advantages as relatively low cost of simulation comparing with conduction of real experiment, an opportunity to easily correct the design of a prototype prior to manufacturing of the final product and a wide range of application: mixing, acoustics, cooling and aerodynamics. This makes CFD particularly and Computation...

  9. Calorimetric and computational studies for three nitroimidazole isomers

    International Nuclear Information System (INIS)

    Carvalho, Tânia M.T.; Amaral, Luísa M.P.F.; Morais, Victor M.F.; Ribeiro da Silva, Maria D.M.C.

    2017-01-01

    Highlights: • Energy of combustion of 4-nitroimidazole was measured by static bomb calorimetry. • Enthalpy of sublimation of 4-nitroimidazole was determined by Calvet microcalorimetry. • Gas-phase enthalpy of formation of 4-nitroimidazole was derived from experimental measurements. • Gas-phase enthalpies of formation of the nitroimidazole isomers were estimated from G3 calculations. - Abstract: In the present work, a combined experimental and computational thermochemical study of nitroimidazole isomers was carried out. The standard (p° = 0.1 MPa) molar enthalpy of combustion, in the crystalline phase, for 4-nitroimidazole was determined, at the temperature of 298.15 K, using a static bomb combustion calorimeter. Calvet microcalorimetry experiments were performed to measure its standard molar enthalpy of sublimation. The standard molar enthalpy of formation of 4-nitroimidazole, in the gaseous phase, at T = 298.15 K, (116.9 ± 2.9) kJ·mol⁻¹, has been derived from the corresponding standard molar enthalpy of formation in the crystalline phase and the standard molar enthalpy of sublimation. Computational studies for 4-nitroimidazole were performed to complement the experimental work. These were also extended to the 2- and 5-nitroimidazole isomers. The gas-phase enthalpies of formation were estimated from high level ab initio molecular orbital calculations, at the G3 level. The tautomeric equilibrium of 4(5)-nitroimidazole in the gaseous phase was also investigated, and it was concluded that the two tautomers are equally stable.

  10. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genome, and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
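
    One of the simplest genomic-signature signals mentioned above is anomalous GC content: a horizontally acquired island often carries the GC composition of its donor. A hedged sketch of a sliding-window GC scan (window size, step, threshold, and the planted synthetic island are all invented for illustration; real PAI finders combine many more features):

```python
import random

def gc_content(seq):
    """Fraction of G and C bases in a sequence."""
    return sum(1 for base in seq if base in "GC") / len(seq)

def flag_gc_anomalies(genome, window=1000, step=500, z=2.0):
    """Flag window start positions whose GC content deviates from the
    genome-wide window mean by more than z standard deviations."""
    starts = list(range(0, len(genome) - window + 1, step))
    gcs = [gc_content(genome[s:s + window]) for s in starts]
    mean = sum(gcs) / len(gcs)
    sd = (sum((g - mean) ** 2 for g in gcs) / len(gcs)) ** 0.5
    return [s for s, g in zip(starts, gcs) if sd > 0 and abs(g - mean) > z * sd]

def random_seq(length, gc, rng):
    """Random sequence with a target GC fraction."""
    return "".join(
        rng.choice("GC") if rng.random() < gc else rng.choice("AT")
        for _ in range(length)
    )

rng = random.Random(42)
host_a = random_seq(10000, 0.40, rng)   # host backbone, GC ~ 0.40
island = random_seq(2000, 0.70, rng)    # hypothetical acquired region, GC ~ 0.70
host_b = random_seq(10000, 0.40, rng)
genome = host_a + island + host_b       # island occupies positions 10000-11999

hits = flag_gc_anomalies(genome)
```

    On this toy genome the flagged windows cluster over the planted GC-rich region, the same intuition that underlies signature-based PAI prediction.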

  11. Application of computational methods in genetic study of inflammatory bowel disease.

    Science.gov (United States)

    Li, Jin; Wei, Zhi; Hakonarson, Hakon

    2016-01-21

    Genetic factors play an important role in the etiology of inflammatory bowel disease (IBD). The launch of the genome-wide association study (GWAS) represents a landmark in the genetic study of human complex disease. Concurrently, computational methods have undergone rapid development during the past few years, which has led to the identification of numerous disease susceptibility loci. IBD is one of the successful examples of GWAS and related analyses. A total of 163 genetic loci and multiple signaling pathways have been identified to be associated with IBD. Pleiotropic effects were found for many of these loci, and risk prediction models were built based on a broad spectrum of genetic variants. Important gene-gene and gene-environment interactions and key contributions of the gut microbiome are being discovered. Here we review the different types of analyses that have been applied to IBD genetic studies, discuss the computational methods for each type of analysis, and summarize the discoveries made in IBD research with the application of these methods.

  12. A research program in empirical computer science

    Science.gov (United States)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  13. Solvent-driven symmetry of self-assembled nanocrystal superlattices-A computational study

    KAUST Repository

    Kaushik, Ananth P.; Clancy, Paulette

    2012-01-01

    used solvents, toluene and hexane. System sizes in the 400,000-500,000-atom scale followed for nanoseconds are required for this computationally intensive study. The key questions addressed here concern the thermodynamic stability of the superlattice

  14. Thermochemistry of 6-propyl-2-thiouracil: An experimental and computational study

    Energy Technology Data Exchange (ETDEWEB)

    Szterner, Piotr; Galvão, Tiago L.P.; Amaral, Luísa M.P.F.; Ribeiro da Silva, Maria D.M.C., E-mail: mdsilva@fc.up.pt; Ribeiro da Silva, Manuel A.V.

    2014-07-01

    Highlights: • Thermochemistry of 6-propyl-2-thiouracil – experimental and computational study. • Vapor pressure study of 6-propyl-2-thiouracil by the Knudsen effusion technique. • Enthalpies of formation of 6-propyl-2-thiouracil by rotating bomb combustion calorimetry. • Accurate computational calculations (G3 and G4 composite methods) were performed. - Abstract: The standard (p° = 0.1 MPa) molar enthalpy of formation of 6-propyl-2-thiouracil was derived from its standard molar energy of combustion, in oxygen, to yield CO₂ (g), N₂ (g) and H₂SO₄·115H₂O (l), at T = 298.15 K, measured by rotating bomb combustion calorimetry. The vapor pressures as a function of temperature were measured by the Knudsen effusion technique and the standard molar enthalpy of sublimation, Δ_cr^g H_m°, at T = 298.15 K, was derived by the Clausius–Clapeyron equation. These two thermodynamic parameters yielded the standard molar enthalpy of formation, in the gaseous phase, at T = 298.15 K: −(142.5 ± 1.9) kJ·mol⁻¹. This value was compared with estimates obtained from very accurate computational calculations using the G3 and G4 composite methods.
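
    The Knudsen-effusion step turns vapor pressures p(T) into a sublimation enthalpy through the Clausius–Clapeyron relation, ln p = c − ΔH_sub/(RT), so the enthalpy falls out of a straight-line fit of ln p against 1/T. A hedged sketch of that regression (the temperatures and the assumed 120 kJ·mol⁻¹ enthalpy are invented check values, not the paper's data):

```python
import math

R = 8.314  # gas constant, J·mol⁻¹·K⁻¹

def sublimation_enthalpy(temps_K, pressures_Pa):
    """Least-squares slope of ln p versus 1/T; slope = -ΔH_sub / R."""
    x = [1.0 / T for T in temps_K]
    y = [math.log(p) for p in pressures_Pa]
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    slope = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
             / sum((xi - xm) ** 2 for xi in x))
    return -slope * R  # J·mol⁻¹

# synthetic check: pressures generated from an assumed 120 kJ·mol⁻¹ enthalpy
dH_assumed = 120e3
temps = [330.0, 340.0, 350.0, 360.0]
pressures = [math.exp(20.0 - dH_assumed / (R * T)) for T in temps]
```

    Because the synthetic data are exactly log-linear in 1/T, the fit recovers the assumed enthalpy; with real effusion data the residual scatter sets the quoted uncertainty.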

  15. Saudi high school students' attitudes and barriers toward the use of computer technologies in learning English.

    Science.gov (United States)

    Sabti, Ahmed Abdulateef; Chaichan, Rasha Sami

    2014-01-01

    This study examines the attitudes of Saudi Arabian high school students toward the use of computer technologies in learning English. The study also discusses the possible barriers that affect and limit the actual usage of computers. A quantitative approach was applied in this research, which involved 30 Saudi Arabian students of a high school in Kuala Lumpur, Malaysia. The respondents comprised 15 males and 15 females with ages between 16 and 18 years. Two instruments, namely, the Scale of Attitude toward Computer Technologies (SACT) and Barriers affecting Students' Attitudes and Use (BSAU), were used to collect data. The Technology Acceptance Model (TAM) of Davis (1989) was utilized. The analysis of the study revealed gender differences in attitudes toward the use of computer technologies in learning English. Female students showed more positive attitudes toward the use of computer technologies in learning English than males. Both male and female participants demonstrated a high and positive perception of the Usefulness and perceived Ease of Use of computer technologies in learning English. Three barriers that affected and limited the use of computer technologies in learning English were identified by the participants: skill, equipment, and motivation. Among these barriers, skill had the highest effect, whereas motivation showed the least effect.

  16. Evidence for phosphorus bonding in phosphorus trichloride-methanol adduct: a matrix isolation infrared and ab initio computational study.

    Science.gov (United States)

    Joshi, Prasad Ramesh; Ramanathan, N; Sundararajan, K; Sankaran, K

    2015-04-09

    The weak interaction between PCl3 and CH3OH was investigated using matrix isolation infrared spectroscopy and ab initio computations. In a nitrogen matrix at low temperature, the noncovalent adduct was generated and characterized using Fourier transform infrared spectroscopy. Computations were performed at the B3LYP/6-311++G(d,p), B3LYP/aug-cc-pVDZ, and MP2/6-311++G(d,p) levels of theory to optimize the possible geometries of PCl3-CH3OH adducts. Computations revealed two minima on the potential energy surface, of which the global minimum is stabilized by a noncovalent P···O interaction, known as pnictogen bonding (phosphorus bonding or P-bonding). The local minimum corresponded to a cyclic adduct, stabilized by conventional hydrogen bonding (Cl···H-O and Cl···H-C interactions). Experimentally, the 1:1 P-bonded PCl3-CH3OH adduct was identified in a nitrogen matrix, where shifts in the P-Cl modes of the PCl3 submolecule and the O-C and O-H modes of the CH3OH submolecule were observed. The observed vibrational frequencies of the P-bonded adduct in a nitrogen matrix agreed well with the computed frequencies. Furthermore, computations also predicted that the P-bonded adduct is stronger than the H-bonded adduct by ∼1.56 kcal/mol. Atoms-in-molecules and natural bond orbital analyses were performed to understand the nature of the interactions and the effect of charge transfer on the stability of the adducts.

  17. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  18. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  19. Dealing with media distractions: An observational study of computer-based multitasking among children and adults in the Netherlands

    NARCIS (Netherlands)

    Baumgartner, S.E.; Sumter, S.R.

    2017-01-01

    The aim of this observational study was to investigate differences in computer-based multitasking among children and adults. Moreover, the study investigated how attention problems are related to computer-based multitasking and how these individual differences interact with age. Computer-based

  20. FISS: a computer program for reactor systems studies

    International Nuclear Information System (INIS)

    Tamm, H.; Sherman, G.R.; Wright, J.H.; Nieman, R.E.

    1979-08-01

    FISS is a computer code for use in investigating alternative fuel cycle strategies for Canadian and world nuclear programs. The code performs a system simulation accounting for dynamic effects of growing nuclear systems. Facilities in the model include storage for irradiated fuel, mines, plants for enrichment, fuel fabrication, fuel reprocessing and heavy water, and reactors. FISS is particularly useful for comparing various reactor strategies and studying sensitivities of resource consumption, capital investment and energy costs with changes in fuel cycle parameters, reactor parameters and financial variables. (author)

  1. A Case Study of Educational Computer Game Design by Middle School Students

    Science.gov (United States)

    An, Yun-Jo

    2016-01-01

    Only a limited number of research studies have investigated how students design educational computer games and its impact on student learning. In addition, most studies on educational game design by students were conducted in the areas of mathematics and science. Using the qualitative case study approach, this study explored how seventh graders…

  2. INFORMATION TECHNOLOGY USERS´ ABILITIES: A CASE STUDY ON COMPUTING LEARNING IN AN UNDERGRADUATE COURSE

    Directory of Open Access Journals (Sweden)

    Valéria Maria Martins Judice

    2006-11-01

    Full Text Available A literature review shows that minimum ability levels in the use of Information Technology (IT) resources are currently essential for administrators and professionals overall. As the Internet becomes more pervasive, new milestones for economic competition and company survival are being created. Individual IT abilities must therefore be continuously reformulated so that they are used adequately and creatively, and new information sources and tools are actively generated rather than passively adopted. To evaluate the evolution of IT ability acquisition in Brazil, Business & Administration students from a university were investigated. By means of a questionnaire and in-depth interviews, data were collected on students' perceptions of their acquired abilities and of the importance of IT competencies. The views of computing science teachers and a course coordinator were also assessed. The empirical results revealed that students' IT abilities were concentrated on basic computing functions. The integration of IT learning into classroom practices was deemed poor compared with the importance attributed to it. Students signaled attitudes of self-sufficiency or knowledge which, when tested, were not actually confirmed. Low learning results were observed for IT conceptual knowledge, indicating students' impatience with learning without interaction, as in long-text readings or teacher-centered classes. Strong student resistance to electronic commerce was evidenced and associated with perceived risks in IT evolution.

  3. New accountant job market reform by computer algorithm: an experimental study

    Directory of Open Access Journals (Sweden)

    Hirose Yoshitaka

    2017-01-01

    Full Text Available The purpose of this study is to examine the matching of new accountants with accounting firms in Japan. A notable feature of the present study is that it brings a computer algorithm to the job-hiring task. Job recruitment activities for new accountants in Japan are one-time, short-term struggles. Accordingly, many have searched for new rules to replace the current ones of the process. Job recruitment activities for new accountants in Japan change every year. This study proposes modifying these job recruitment activities by combining computer and human efforts. Furthermore, the study formulates the job recruitment activities using a model and conducting experiments. As a result, the Deferred Acceptance (DA) algorithm achieves a high truth-telling percentage, a high stable-matching percentage, and greater efficiency compared with the previous approach. This suggests the potential of the Deferred Acceptance algorithm as a replacement for current approaches. In terms of accuracy and stability, the DA algorithm is superior to the current methods and should be adopted.
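
    The Deferred Acceptance mechanism studied here is the classic Gale–Shapley procedure: candidates propose down their preference lists, and each firm tentatively holds the best proposal seen so far. A minimal candidate-proposing sketch (the candidate and firm names are invented for illustration):

```python
def deferred_acceptance(proposer_prefs, reviewer_prefs):
    """Proposer-optimal Gale-Shapley deferred acceptance.

    proposer_prefs / reviewer_prefs map each side to an ordered
    preference list over the other side (most preferred first).
    """
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)          # proposers not yet held by anyone
    next_choice = {p: 0 for p in proposer_prefs}
    held = {}                            # reviewer -> tentatively held proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in held:
            held[r] = p                  # first proposal: hold it
        elif rank[r][p] < rank[r][held[r]]:
            free.append(held[r])         # reviewer trades up, old match is freed
            held[r] = p
        else:
            free.append(p)               # rejected; will propose further down
    return {p: r for r, p in held.items()}

# invented example: three new accountants and three firms
candidate_prefs = {"a": ["X", "Y", "Z"], "b": ["X", "Z", "Y"], "c": ["Y", "X", "Z"]}
firm_prefs = {"X": ["b", "a", "c"], "Y": ["a", "c", "b"], "Z": ["c", "b", "a"]}
matching = deferred_acceptance(candidate_prefs, firm_prefs)
```

    The resulting matching is stable: no candidate and firm both prefer each other to their assigned partners, which is the property behind the high stable-matching percentage reported above.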

  4. Use of high-speed cinematography and computer generated gait diagrams for the study of equine hindlimb kinematics.

    Science.gov (United States)

    Kobluk, C N; Schnurr, D; Horney, F D; Sumner-Smith, G; Willoughby, R A; Dekleer, V; Hearn, T C

    1989-01-01

    High-speed cinematography with computer aided analysis was used to study equine hindlimb kinematics. Eight horses were filmed at the trot or the pace. Filming was done from the side (lateral) and the back (caudal). Parameters measured from the lateral filming included the heights of the tuber coxae and tailhead, protraction and retraction of the hoof and angular changes of the tarsus and stifle. Abduction and adduction of the limb and tarsal height changes were measured from the caudal filming. The maximum and minimum values plus the standard deviations and coefficients of variation are presented in tabular form. Three gait diagrams were constructed to represent stifle angle versus tarsal angle, metatarsophalangeal height versus protraction-retraction (fetlock height diagram) and tuber coxae and tailhead height versus stride (pelvic height diagram). Application of the technique to the group of horses revealed good repeatability of the gait diagrams within a limb and the diagrams appeared to be sensitive indicators of left/right asymmetries.
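
    The angular measurements above (tarsus and stifle) reduce to computing the angle at a joint from three digitized marker coordinates per film frame. A hedged sketch of that geometry step (the marker coordinates are invented; this is the standard two-segment angle computation, not the authors' software):

```python
import math

def joint_angle(proximal, joint, distal):
    """Angle in degrees at `joint` between the segments joint->proximal
    and joint->distal, from 2-D digitized marker coordinates."""
    v1 = (proximal[0] - joint[0], proximal[1] - joint[1])
    v2 = (distal[0] - joint[0], distal[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_a = dot / (math.hypot(*v1) * math.hypot(*v2))
    # clamp guards against floating-point values just outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

# invented markers for one frame: hip, stifle, hock (arbitrary film units)
stifle_angle = joint_angle((2.0, 9.0), (3.0, 6.0), (2.5, 3.0))
```

    Plotting this angle frame by frame across a stride yields exactly the kind of angle-versus-angle gait diagram described in the abstract.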

  5. Computer Simulations Reveal Multiple Functions for Aromatic Residues in Cellulase Enzymes (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2012-07-01

    NREL researchers use high-performance computing to demonstrate fundamental roles of aromatic residues in cellulase enzyme tunnels. National Renewable Energy Laboratory (NREL) computer simulations of a key industrial enzyme, the Trichoderma reesei Family 6 cellulase (Cel6A), predict that aromatic residues near the enzyme's active site and at the entrance and exit tunnel perform different functions in substrate binding and catalysis, depending on their location in the enzyme. These results suggest that nature employs aromatic-carbohydrate interactions with a wide variety of binding affinities for diverse functions. Outcomes also suggest that protein engineering strategies in which mutations are made around the binding sites may require tailoring specific to the enzyme family. Cellulase enzymes ubiquitously exhibit tunnels or clefts lined with aromatic residues for processing carbohydrate polymers to monomers, but the molecular-level role of these aromatic residues remains unknown. In silico mutation of the aromatic residues near the catalytic site of Cel6A has little impact on the binding affinity, but simulation suggests that these residues play a major role in the glucopyranose ring distortion necessary for cleaving glycosidic bonds to produce fermentable sugars. Removal of aromatic residues at the entrance and exit of the cellulase tunnel, however, dramatically impacts the binding affinity. This suggests that these residues play a role in acquiring cellulose chains from the cellulose crystal and stabilizing the reaction product, respectively. These results illustrate that the role of aromatic-carbohydrate interactions varies dramatically depending on the position in the enzyme tunnel. As aromatic-carbohydrate interactions are present in all carbohydrate-active enzymes, the results have implications for understanding protein structure-function relationships in carbohydrate metabolism and recognition, carbon turnover in nature, and protein engineering

  6. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Park, Joo Hwan

    2010-09-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although recent progress in CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is still impossible for numerical computations to resolve the detailed shape of the rod bundle due to limitations in computing mesh and memory capacity. Hence, previous studies conducted numerical computations for smooth channels without considering spacers and bearing pads. However, it is well known that these components are an important factor in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to solve complex geometries such as a fuel rod bundle. Before applying the method to the 37-rod bundle problem, its validity and accuracy are tested on a simple geometry. The split channel method has been proposed with the aim of computing the fully shaped CANDU fuel channel with detailed components. Its validity was tested by applying the method to the single channel problem. The average temperatures have similar values for the two methods considered, while the local temperature shows a slight difference due to the effect of conduction heat transfer in the solid region of a rod. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work.

  7. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  8. Comparison of Computational Algorithms for the Classification of Liver Cancer using SELDI Mass Spectrometry: A Case Study

    Directory of Open Access Journals (Sweden)

    Robert J Hickey

    2007-01-01

Full Text Available Introduction: As an alternative to DNA microarrays, mass spectrometry-based analysis of proteomic patterns has shown great potential in cancer diagnosis. The ultimate application of this technique in clinical settings relies on the advancement of the technology itself and the maturity of the computational tools used to analyze the data. A number of computational algorithms constructed on different principles are available for the classification of disease status based on proteomic patterns. Nevertheless, few studies have addressed the differences in the performance of these approaches. In this report, we describe a comparative case study on the classification accuracy of hepatocellular carcinoma based on the serum proteomic pattern generated from a Surface Enhanced Laser Desorption/Ionization (SELDI) mass spectrometer. Methods: Nine supervised classification algorithms were implemented in R software and compared for classification accuracy. Results: We found that the support vector machine with a radial function is preferable as a tool for classification of hepatocellular carcinoma using features in SELDI mass spectra. Among the rest of the methods, random forest and prediction analysis of microarrays have better performance. A permutation-based technique reveals that the support vector machine with a radial function seems intrinsically superior in learning from the training data, since it has a lower prediction error than the others when there is essentially no differential signal. On the other hand, the performance of random forest and prediction analysis of microarrays relies on their capability of capturing signals with substantial differentiation between groups. Conclusions: Our finding is similar to a previous study in which classification methods based on Matrix Assisted Laser Desorption/Ionization (MALDI) mass spectrometry were compared for the prediction accuracy of ovarian cancer. The support vector machine, random forest and prediction
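The permutation-based technique mentioned above can be illustrated with a small self-contained sketch (not the paper's R code): shuffling the training labels destroys the class signal, and the classifier's error on the permuted data estimates its baseline prediction error when no differential signal exists. A simple nearest-centroid classifier stands in for the SVM, and all data are synthetic.

```python
import random

def nearest_centroid_error(train_X, train_y, test_X, test_y):
    """Train a nearest-centroid classifier and return its test error rate."""
    centroids = {}
    for label in set(train_y):
        rows = [x for x, lab in zip(train_X, train_y) if lab == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    errors = 0
    for x, lab in zip(test_X, test_y):
        pred = min(centroids,
                   key=lambda c: sum((a - b) ** 2 for a, b in zip(x, centroids[c])))
        if pred != lab:
            errors += 1
    return errors / len(test_y)

random.seed(0)
# Synthetic "spectra": class 1 has a small mean shift in every feature.
def make(shift, n):
    return [[random.gauss(shift, 1.0) for _ in range(20)] for _ in range(n)]

train_X = make(0.0, 40) + make(0.8, 40)
train_y = [0] * 40 + [1] * 40
test_X = make(0.0, 40) + make(0.8, 40)
test_y = [0] * 40 + [1] * 40

err_signal = nearest_centroid_error(train_X, train_y, test_X, test_y)

# Permute the training labels to destroy the class signal: the resulting
# error estimates the baseline when there is no differential signal.
perm_y = train_y[:]
random.shuffle(perm_y)
err_null = nearest_centroid_error(train_X, perm_y, test_X, test_y)
print(err_signal, err_null)  # err_null should hover near 0.5
```

A classifier whose error under permuted labels stays low relative to chance is, in the paper's sense, a stronger learner of the training data itself rather than of the group signal.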

  9. ‘Shift’ ‘n ‘control’: The computer as a third interactant in Spanish-language medical consultations

    Science.gov (United States)

    Goble, Ryan; Vickers, Caroline H

    2015-01-01

    The purpose of this paper is to examine the role of the computer in medical consultations in which English–Spanish bilingual medical providers interact with Spanish-monolingual patients. Following previous studies showing that the presence of the computer in consultations detracts from direct provider–patient communication, we pay specific attention to how the use of the computer in Spanish-language medical consultations can complement or adversely affect the co-construction of the patient's health narrative. The data for the present study consist of 36 Spanish-language medical consultations in Southern California. Applying a conversation-analytical approach to the health narratives in the corpus, we argue that the computer is essentially a third interactant to which medical providers orient through lowered volume, minimal responses, bureaucratic side talk, and, most importantly, code-switching to English – all of which strip the patients of control over the co-construction of their health narrative with their medical provider. Because the patient has access neither to the computational task nor to the language, we posit that this exacerbates the already existing adverse effects of the computer on provider–patient interaction.

  10. Combining on-chip synthesis of a focused combinatorial library with computational target prediction reveals imidazopyridine GPCR ligands.

    Science.gov (United States)

    Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert

    2014-01-07

    Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Attitudes towards Computer and Computer Self-Efficacy as Predictors of Preservice Mathematics Teachers' Computer Anxiety

    Science.gov (United States)

    Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.

    2017-01-01

    The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  12. Labels, Cognomes and Cyclic Computation: An Ethological Perspective

    Directory of Open Access Journals (Sweden)

    Elliot eMurphy

    2015-06-01

    Full Text Available For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that the operation Label, not Merge, constitutes the evolutionary novelty which distinguishes human language from non-human computational systems; a proposal lending weight to a Weak Continuity Hypothesis and leading to the formation of what is termed Computational Ethology. Some directions for future ethological research are suggested.

  13. Labels, cognomes, and cyclic computation: an ethological perspective.

    Science.gov (United States)

    Murphy, Elliot

    2015-01-01

    For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that the operation Label, not Merge, constitutes the evolutionary novelty which distinguishes human language from non-human computational systems; a proposal lending weight to a Weak Continuity Hypothesis and leading to the formation of what is termed Computational Ethology. Some directions for future ethological research are suggested.

  14. Cloud Computing: A study of cloud architecture and its patterns

    OpenAIRE

    Mandeep Handa,; Shriya Sharma

    2015-01-01

    Cloud computing is a general term for anything that involves delivering hosted services over the Internet. It represents a paradigm shift comparable to the shift from mainframe to client–server computing in the early 1980s. Cloud computing can be defined as accessing third-party software and services on the web and paying per usage. It facilitates scalability and virtualized resources over the Internet as a service, providing a cost-effective and scalable solution to customers. Cloud computing has...

  15. Computer-aided diagnosis of contrast-enhanced spectral mammography: A feasibility study.

    Science.gov (United States)

    Patel, Bhavika K; Ranjbar, Sara; Wu, Teresa; Pockaj, Barbara A; Li, Jing; Zhang, Nan; Lobbes, Mark; Zhang, Bin; Mitchell, J Ross

    2018-01-01

    To evaluate whether the use of a computer-aided diagnosis-contrast-enhanced spectral mammography (CAD-CESM) tool can further increase the diagnostic performance of CESM compared with that of experienced radiologists. This IRB-approved retrospective study analyzed 50 lesions described on CESM from August 2014 to December 2015. Histopathologic analyses, used as the criterion standard, revealed 24 benign and 26 malignant lesions. An expert breast radiologist manually outlined lesion boundaries on the different views. A set of morphologic and textural features were then extracted from the low-energy and recombined images. Machine-learning algorithms with feature selection were used along with statistical analysis to reduce, select, and combine features. Selected features were then used to construct a predictive model using a support vector machine (SVM) classification method in a leave-one-out-cross-validation approach. The classification performance was compared against the diagnostic predictions of 2 breast radiologists with access to the same CESM cases. Based on the SVM classification, CAD-CESM correctly identified 45 of 50 lesions in the cohort, resulting in an overall accuracy of 90%. The detection rate for the malignant group was 88% (3 false-negative cases) and 92% for the benign group (2 false-positive cases). Compared with the model, radiologist 1 had an overall accuracy of 78% and a detection rate of 92% (2 false-negative cases) for the malignant group and 62% (10 false-positive cases) for the benign group. Radiologist 2 had an overall accuracy of 86% and a detection rate of 100% for the malignant group and 71% (8 false-positive cases) for the benign group. The results of our feasibility study suggest that a CAD-CESM tool can provide complementary information to radiologists, mainly by reducing the number of false-positive findings. Copyright © 2017 Elsevier B.V. All rights reserved.
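The leave-one-out cross-validation loop described above can be sketched as follows. This is a minimal illustration, not the study's code: a 1-nearest-neighbour classifier stands in for the SVM, and the feature vectors are hypothetical.

```python
def loocv_accuracy(features, labels, classify):
    """Leave-one-out cross-validation: hold each lesion out once,
    train on the rest, and tally correct predictions."""
    correct = 0
    for i in range(len(features)):
        train_X = features[:i] + features[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        if classify(train_X, train_y, features[i]) == labels[i]:
            correct += 1
    return correct / len(features)

def one_nearest_neighbour(train_X, train_y, x):
    """Stand-in for the SVM: predict the label of the closest training case."""
    dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in train_X]
    return train_y[dists.index(min(dists))]

# Toy morphologic/textural feature vectors (hypothetical numbers).
features = [[0.10, 0.20], [0.20, 0.10], [0.15, 0.25],
            [0.90, 0.80], [0.80, 0.90], [0.85, 0.75]]
labels = ["benign"] * 3 + ["malignant"] * 3
acc = loocv_accuracy(features, labels, one_nearest_neighbour)
print(acc)  # → 1.0 on this well-separated toy data
```

Leave-one-out is attractive for small cohorts like the 50 lesions here because every case serves as a test case exactly once without sacrificing training data.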

  16. Nanostructured interfaces for enhancing mechanical properties of composites: Computational micromechanical studies

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon

    2015-01-01

    Computational micromechanical studies of the effects of nanostructuring and nanoengineering of interfaces, phase boundaries, and grain boundaries on the mechanical properties and strength of materials are reviewed, along with the potential of interface nanostructuring to enhance material properties.

  17. Basicities of Strong Bases in Water: A Computational Study

    OpenAIRE

    Kaupmees, Karl; Trummal, Aleksander; Leito, Ivo

    2014-01-01

    Aqueous pKa values of strong organic bases – DBU, TBD, MTBD, different phosphazene bases, etc – were computed with CPCM, SMD and COSMO-RS approaches. Explicit solvent molecules were not used. Direct computations and computations with reference pKa values were used. The latter were of two types: (1) reliable experimental aqueous pKa value of a reference base with structure similar to the investigated base or (2) reliable experimental pKa value in acetonitrile of the investigated base itself. ...

  18. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for presenting modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science call for additional research on the…

  19. Numerical cosmology: Revealing the universe using computers

    International Nuclear Information System (INIS)

    Centrella, J.; Matzner, R.A.; Tolman, B.W.

    1986-01-01

    In this paper the authors present two research projects which study the evolution of different periods in the history of the universe using numerical simulations. The first investigates the synthesis of light elements in an inhomogeneous early universe dominated by shocks and non-linear gravitational waves. The second follows the evolution of large scale structures during the later history of the universe and calculates their effect on the 3K background radiation. Their simulations are carried out using modern supercomputers and make heavy use of multidimensional color graphics, including film to elucidate the results. Both projects provide the authors the opportunity to do experiments in cosmology and assess their results against fundamental cosmological observations

  20. Home-Based Computer Gaming in Vestibular Rehabilitation of Gaze and Balance Impairment.

    Science.gov (United States)

    Szturm, Tony; Reimer, Karen M; Hochman, Jordan

    2015-06-01

    Disease or damage of the vestibular sense organs causes a range of distressing symptoms and functional problems, including loss of balance, gaze instability, disorientation, and dizziness. A novel computer-based rehabilitation system with a therapeutic gaming application has been developed. This method allows different gaze and head-movement exercises to be coupled to a wide range of inexpensive, commercial computer games. It can be used while standing, so graded balance demands (e.g., standing on a sponge pad) can be incorporated into the program. A case series pre- and postintervention study was conducted of nine adults diagnosed with peripheral vestibular dysfunction who received a 12-week home rehabilitation program. The feasibility and usability of the home computer-based therapeutic program were established. Study findings revealed that using head rotation to interact with computer games, when coupled to demanding balance conditions, resulted in significant improvements in standing balance, dynamic visual acuity, gaze control, and walking performance. Perception of dizziness as measured by the Dizziness Handicap Inventory also decreased significantly. These preliminary findings provide support that a low-cost home game-based exercise program is well suited to train standing balance and gaze control (with active and passive head motion).

  1. Combined computational and biochemical study reveals the importance of electrostatic interactions between the "pH sensor" and the cation binding site of the sodium/proton antiporter NhaA of Escherichia coli.

    Science.gov (United States)

    Olkhova, Elena; Kozachkov, Lena; Padan, Etana; Michel, Hartmut

    2009-08-15

    Sodium/proton antiporters are essential enzymes that catalyze the exchange of sodium ions for protons across biological membranes. The crystal structure of NhaA has provided a basis to explore the mechanism of ion exchange and its unique regulation by pH. Here, the mechanism of the pH activation of the antiporter is investigated through functional and computational studies of several variants with mutations in the ion-binding site (D163, D164). The most significant difference found computationally between the wild-type antiporter and the active-site variants D163E and D164N is the low pK(a) value of Glu78, which makes them insensitive to pH. Although in the variant D163N the pK(a) of Glu78 is comparable to the physiological one, this variant cannot demonstrate the long-range electrostatic effect of Glu78 on the pH-dependent structural reorganization of transmembrane helix X and, hence, is proposed to be inactive. In marked contrast, variant D164E remains sensitive to pH and can be activated by an alkaline pH shift. Remarkably, as expected computationally and discovered here biochemically, D164E is viable and active in Na(+)/H(+) exchange, albeit with an increased apparent K(M). Our results unravel the unique electrostatic network of NhaA that connects the coupled clusters of the "pH sensor" with the binding site, which is crucial for pH activation of NhaA. 2009 Wiley-Liss, Inc.
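The claim that a lowered pK(a) of Glu78 renders a variant insensitive to pH follows from the Henderson-Hasselbalch relation: a residue whose pK(a) lies far below the activation range stays essentially fully deprotonated across it. A sketch with illustrative pK(a) values (not taken from the paper):

```python
def deprotonated_fraction(pKa, pH):
    """Henderson-Hasselbalch: fraction of an acidic residue in its
    deprotonated (charged) form at a given pH."""
    return 1.0 / (1.0 + 10 ** (pKa - pH))

# Illustrative values only: a residue with a pKa inside the pH 6-9
# activation range responds strongly; a very low pKa barely responds.
for pKa in (4.0, 7.5):
    f6 = deprotonated_fraction(pKa, 6.0)
    f9 = deprotonated_fraction(pKa, 9.0)
    print(f"pKa {pKa}: charged fraction {f6:.3f} at pH 6 -> {f9:.3f} at pH 9")
```

A residue with pKa 4.0 is already over 99% charged at pH 6, so shifting to alkaline pH changes its protonation state almost not at all; a residue with pKa near the activation range switches from mostly protonated to mostly deprotonated.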

  2. Connectivity to computers and the Internet among patients with schizophrenia spectrum disorders: a cross-sectional study.

    Science.gov (United States)

    Välimäki, Maritta; Kuosmanen, Lauri; Hätönen, Heli; Koivunen, Marita; Pitkänen, Anneli; Athanasopoulou, Christina; Anttila, Minna

    2017-01-01

    Information and communication technologies have been developed for a variety of health care applications and user groups in the field of health care. This study examined connectivity to computers and the Internet among patients with schizophrenia spectrum disorders (SSDs). A cross-sectional survey design was used to study 311 adults with SSDs from the inpatient units of two psychiatric hospitals in Finland. The data collection lasted for 20 months and was done through patients' medical records and a self-reported, structured questionnaire. Data analysis included descriptive statistics. In total, 297 patients were included in this study (response rate =96%). More than half of them (n=156; 55%) had a computer and less than half of them (n=127; 44%) had the Internet at home. Of those who generally had access to computers and the Internet, more than one-fourth (n=85; 29%) used computers daily, and >30% (n=96; 33%) never accessed the Internet. In total, approximately one-fourth of them (n=134; 25%) learned to use computers, and less than one-third of them (n=143; 31%) were known to use the Internet by themselves. Older people (aged 45-65 years) and those with fewer years of education (primary school) tended not to use the computers and the Internet at all ( P computers and the Internet, and they mainly used the Internet to seek information. Social, occupational, and psychological functioning (which were evaluated with the Global Assessment of Functioning) were not associated with access to and frequency of computer and Internet use. The results support the use of computers and the Internet as part of clinical work in mental health care.

  3. Replacement of traditional lectures with computer-based tutorials: a case study

    Directory of Open Access Journals (Sweden)

    Derek Lavelle

    1996-12-01

    Full Text Available This paper reports on a pilot project with a group of 60 second-year undergraduates studying the use of standard forms of contract in the construction industry. The project entailed the replacement of two of a series of nine scheduled lectures with a computer-based tutorial. The two main aims of the project were to test the viability of converting existing lecture material into computer-based material on an in-house production basis, and to obtain feedback from the student cohort on their behavioural response to the change in media. The effect on student performance was not measured at this stage of development.

  4. Introduction: 'History of computing'. Historiography of computers and computer use in the Netherlands

    Directory of Open Access Journals (Sweden)

    Adrienne van den Boogaard

    2008-06-01

    Full Text Available Along with the international trends in the history of computing, Dutch contributions over the past twenty years moved away from a focus on machinery to the broader scope of the use of computers, the appropriation of computing technologies in various traditions, labour relations and professionalisation issues, and, lately, software. It is only natural that an emerging field like computer science sets out to write its genealogy and canonise the important steps in its intellectual endeavour. It is fair to say that a historiography diverging from such "home" interest started in 1987 with the work of Eda Kranakis – then active in The Netherlands – commissioned by the national bureau for technology assessment, and Gerard Alberts, who turned a commemorative volume of the Mathematical Center into a history of the same institute. History of computing in The Netherlands made a major leap in the spring of 1994, when Dirk de Wit, Jan van den Ende and Ellen van Oost defended their dissertations on the roads towards adoption of computing technology in banking, in science and engineering, and on the gender aspect in computing. Here, history of computing had already moved from machines to the use of computers. The three authors joined Gerard Alberts and Onno de Wit in preparing a volume on the rise of IT in The Netherlands, the sequel of which is now in preparation by a team led by Adrienne van den Bogaard. Dutch research reflected the international attention to professionalisation issues (Ensmenger, Haigh) very early on, in the dissertation by Ruud van Dael, Something to do with computers (2001), revealing how occupations dealing with computers typically escape the pattern of closure by professionalisation expected by the, thus outdated, sociology of professions. History of computing not only takes use and users into consideration but finally, as one may say, confronts the technological side of putting the machine to use – software – head on. The groundbreaking works

  5. Evolution of Computed Tomography Findings in Secondary Aortoenteric Fistula

    International Nuclear Information System (INIS)

    Bas, Ahmet; Simsek, Osman; Kandemirli, Sedat Giray; Rafiee, Babak; Gulsen, Fatih; Numan, Furuzan

    2015-01-01

    Aortoenteric fistula is a rare but significant clinical entity associated with high morbidity and mortality if left untreated. Clinical presentation and imaging findings may be subtle, and prompt diagnosis can be difficult. Herein, we present a patient who initially presented with abdominal pain; computed tomography showed an aortic aneurysm compressing the duodenum without any air bubbles. One month later, the patient presented with gastrointestinal bleeding, and computed tomography revealed air bubbles within the aneurysm. With a diagnosis of aortoenteric fistula, endovascular aneurysm repair was carried out. This case uniquely presents the computed tomography findings in the progression of an aneurysm to an aortoenteric fistula.

  6. Design of large scale applications of secure multiparty computation : secure linear programming

    NARCIS (Netherlands)

    Hoogh, de S.J.A.

    2012-01-01

    Secure multiparty computation is a basic concept of growing interest in modern cryptography. It allows a set of mutually distrusting parties to perform a computation on their private information in such a way that as little as possible is revealed about each private input. The early results of

  7. Computational study of depth completion consistent with human bi-stable perception for ambiguous figures.

    Science.gov (United States)

    Mitsukura, Eiichi; Satoh, Shunji

    2018-03-01

    We propose a computational model that is consistent with human perception of depth in "ambiguous regions," in which no binocular disparity exists. Results obtained from our model reveal a new characteristic of depth perception. Random dot stereograms (RDS) are often used as examples because they provide sufficient disparity for depth calculation. A simple question confronts us: "How can we estimate the depth of a no-texture image region, such as one on white paper?" In such ambiguous regions, mathematical solutions related to binocular disparities are not unique or are indefinite. We examine a mathematical description of depth completion that is consistent with human perception of depth for ambiguous regions. Using computer simulation, we demonstrate that the resultant depth maps qualitatively reproduce two kinds of human depth perception. The resultant depth maps produced using our model depend on the initial depth in the ambiguous region. Considering this dependence from psychological viewpoints, we conjecture that humans perceive completed surfaces that are affected by prior stimuli corresponding to the initial condition of depth. We conducted psychological experiments to verify the model prediction. An ambiguous stimulus was presented after a prior stimulus removed ambiguity. An inter-stimulus interval (ISI) was inserted between the prior stimulus and the post-stimulus. Results show that the correlation of perception between the prior stimulus and post-stimulus depends on the ISI duration. Correlation is positive, negative, and nearly zero in the respective cases of short (0-200 ms), medium (200-400 ms), and long ISI (>400 ms). Furthermore, building on our model, we propose a computational account that can explain this dependence. Copyright © 2017 Elsevier Ltd. All rights reserved.
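The dependence of the completed surface on the initial depth can be illustrated with a minimal 1-D relaxation sketch. This is an assumption-laden toy, not the authors' model: the ambiguous interior is filled by iteratively averaging neighbours, and stopping after finitely many iterations leaves the result biased toward its initial value.

```python
def complete_depth(boundary_left, boundary_right, init, width, iterations):
    """Fill an ambiguous 1-D region by repeated neighbour averaging.
    The interior starts at `init` (playing the role of the prior-stimulus
    depth); a finite number of iterations leaves it biased toward `init`."""
    depth = [boundary_left] + [init] * width + [boundary_right]
    for _ in range(iterations):
        new = depth[:]
        for i in range(1, len(depth) - 1):
            new[i] = 0.5 * (depth[i - 1] + depth[i + 1])
        depth = new
    return depth

# Identical boundary data, different initial depths -> different surfaces.
near = complete_depth(0.0, 0.0, init=1.0, width=9, iterations=5)
far = complete_depth(0.0, 0.0, init=-1.0, width=9, iterations=5)
print(near[5], far[5])  # the centre still reflects the initial condition
```

With unlimited iterations both runs would converge to the boundary-determined solution; cutting the relaxation short is what preserves the prior-stimulus bias, loosely analogous to the ISI-dependent effects reported in the abstract.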

  8. Possibilities of computed bronchophonography in the diagnosis of external respiratory dysfunction in patients with cystic fibrosis

    Directory of Open Access Journals (Sweden)

    E. B. Pavlinova

    2016-01-01

    Full Text Available The degree of respiratory organ injury in cystic fibrosis determines the prognosis of the disease. Objective: to evaluate external respiratory function in children with cystic fibrosis. The study enrolled 48 children followed up at the Omsk Cystic Fibrosis Center. The control group consisted of 42 non-smoking children with no history of respiratory disease. External respiratory function was evaluated using computed bronchophonography; spirography was additionally carried out in children over 6 years of age. Computed bronchophonography revealed obstructive respiratory failure in all children with severe cystic fibrosis. Chronic respiratory tract infection with Pseudomonas aeruginosa and bronchiectasis were associated with higher values of the acoustic work of breathing at frequencies over 5000 Hz. A moderate negative correlation was established between the acoustic work of breathing in the high-frequency range and the forced expiratory volume in 1 second (FEV1, % predicted). Conclusion: computed bronchophonography can reveal obstructive external respiratory dysfunction in children under 6 years of age.
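The reported moderate negative correlation between high-frequency acoustic work of breathing and forced expiratory volume can be computed as a Pearson coefficient. A minimal sketch with hypothetical paired observations (acoustic work rising as FEV1 falls):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired observations: high-frequency acoustic work of
# breathing rising while FEV1 (% predicted) falls.
work = [1.2, 2.5, 3.1, 4.0, 5.2, 6.0]
fev1 = [95, 88, 80, 72, 65, 58]
r = pearson_r(work, fev1)
print(round(r, 3))  # strongly negative for this toy data
```

A value of r near -1 indicates that higher acoustic work accompanies lower lung function; the study reports a moderate rather than strong association.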

  9. Large Scale Computing and Storage Requirements for High Energy Physics

    International Nuclear Information System (INIS)

    Gerber, Richard A.; Wasserman, Harvey

    2010-01-01

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five-year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes

  10. Computer Science Lesson Study: Building Computing Skills among Elementary School Teachers

    Science.gov (United States)

    Newman, Thomas R.

    2017-01-01

    The lack of diversity in the technology workforce in the United States has proven to be a stubborn problem, resisting even the most well-funded reform efforts. With the absence of computer science education in the mainstream K-12 curriculum, only a narrow band of students in public schools go on to careers in technology. The problem persists…

  11. Mobile computing with special reference to readability task under the impact of vibration, colour combination and gender.

    Science.gov (United States)

    Mallick, Zulquernain; Siddiquee, Arshad Noor; Haleem, Abid

    2008-12-01

    The last 20 years have seen tremendous growth in the field of computing, with special reference to mobile computing, yet ergonomic issues pertaining to this theme remain unexplored. With special reference to readability in mobile computing, an experimental study was conducted to examine the effect of gender on human performance under the impact of vibration in a human-computer interaction environment. Fourteen subjects (7 males and 7 females) participated in the study. Three independent variables, namely gender, level of vibration, and screen text/background colour, were selected for the experimental investigation, while the dependent variable was the number of characters read per minute. The data collected were analyzed statistically through an experimental design for repeated measures. Results indicated that gender as an organismic variable, the level of vibration, and screen text/background colour revealed statistically significant differences. However, the second-order interaction was found to be statistically non-significant. These findings are discussed in light of previous studies undertaken on the topic.

  12. Conformational Dynamics of apo-GlnBP Revealed by Experimental and Computational Analysis

    KAUST Repository

    Feng, Yitao

    2016-10-13

    The glutamine binding protein (GlnBP) binds l-glutamine and cooperates with its cognate transporters during glutamine uptake. Crystal structure analysis has revealed an open and a closed conformation for apo- and holo-GlnBP, respectively. However, the detailed conformational dynamics have remained unclear. Herein, we combined NMR spectroscopy, MD simulations, and single-molecule FRET techniques to decipher the conformational dynamics of apo-GlnBP. The NMR residual dipolar couplings of apo-GlnBP were in good agreement with a MD-derived structure ensemble consisting of four metastable states. The open and closed conformations are the two major states. This four-state model was further validated by smFRET experiments and suggests the conformational selection mechanism in ligand recognition of GlnBP. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim

  13. Conformational Dynamics of apo-GlnBP Revealed by Experimental and Computational Analysis

    KAUST Repository

    Feng, Yitao; Zhang, Lu; Wu, Shaowen; Liu, Zhijun; Gao, Xin; Zhang, Xu; Liu, Maili; Liu, Jianwei; Huang, Xuhui; Wang, Wenning

    2016-01-01

    The glutamine binding protein (GlnBP) binds l-glutamine and cooperates with its cognate transporters during glutamine uptake. Crystal structure analysis has revealed an open and a closed conformation for apo- and holo-GlnBP, respectively. However, the detailed conformational dynamics have remained unclear. Herein, we combined NMR spectroscopy, MD simulations, and single-molecule FRET techniques to decipher the conformational dynamics of apo-GlnBP. The NMR residual dipolar couplings of apo-GlnBP were in good agreement with a MD-derived structure ensemble consisting of four metastable states. The open and closed conformations are the two major states. This four-state model was further validated by smFRET experiments and suggests the conformational selection mechanism in ligand recognition of GlnBP. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim

  14. Evolutionary Meta-Analysis of Association Studies Reveals Ancient Constraints Affecting Disease Marker Discovery

    Science.gov (United States)

    Dudley, Joel T.; Chen, Rong; Sanderford, Maxwell; Butte, Atul J.; Kumar, Sudhir

    2012-01-01

    Genome-wide disease association studies contrast genetic variation between disease cohorts and healthy populations to discover single nucleotide polymorphisms (SNPs) and other genetic markers revealing the underlying genetic architectures of human diseases. Despite scores of efforts over the past decade, many reproducible genetic variants that explain substantial proportions of the heritable risk of common human diseases remain undiscovered. We have conducted a multispecies genomic analysis of 5,831 putative human risk variants for more than 230 disease phenotypes reported in 2,021 studies. We find that the current approaches show a propensity for discovering disease-associated SNPs (dSNPs) at conserved genomic positions, because the effect size (odds ratio) and allelic P value of genetic association of an SNP relate strongly to the evolutionary conservation of its genomic position. We propose a new measure for ranking SNPs that integrates evolutionary conservation scores and the P value (E-rank). Using published data from a large case-control study, we demonstrate that the E-rank method prioritizes SNPs with a greater likelihood of bona fide and reproducible genetic disease associations, many of which may explain greater proportions of genetic variance. Therefore, long-term evolutionary histories of genomic positions offer key practical utility in reassessing data from existing disease association studies, and in the design and analysis of future studies aimed at revealing the genetic basis of common human diseases. PMID:22389448
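The abstract does not give the E-rank formula, so the following is only a schematic of the idea: weight the association strength, -log10(P), by the conservation of the SNP's genomic position and rank by the combined score. All SNP identifiers and numbers are hypothetical.

```python
import math

def e_rank(snps):
    """Schematic combined score: -log10(P) weighted by the evolutionary
    conservation of the SNP's position. (Illustrative only; the published
    E-rank formula may differ.)"""
    scored = [(name, -math.log10(p) * conservation)
              for name, p, conservation in snps]
    return sorted(scored, key=lambda t: t[1], reverse=True)

# (SNP id, association P value, conservation in [0, 1]) - all hypothetical.
snps = [
    ("rsA", 1e-6, 0.2),  # strong P value at a poorly conserved position
    ("rsB", 1e-4, 0.9),  # weaker P value at a highly conserved position
    ("rsC", 1e-3, 0.1),
]
ranked = e_rank(snps)
for name, score in ranked:
    print(name, round(score, 2))  # rsB outranks rsA despite its larger P value
```

The point of any conservation-integrated ranking is visible even in this toy: a SNP at a highly conserved position can outrank one with a smaller P value at an unconserved position.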

  15. The distribution of cerebral muscarinic acetylcholine receptors in vivo in patients with dementia. A controlled study with 123IQNB and single photon emission computed tomography

    International Nuclear Information System (INIS)

    Weinberger, D.R.; Gibson, R.; Coppola, R.; Jones, D.W.; Molchan, S.; Sunderland, T.; Berman, K.F.; Reba, R.C.

    1991-01-01

    A high-affinity muscarinic receptor antagonist, 123IQNB (3-quinuclidinyl-4-iodobenzilate labeled with iodine 123), was used with single photon emission computed tomography to image muscarinic acetylcholine receptors in 14 patients with dementia and in 11 healthy controls. High-resolution single photon emission computed tomographic scanning was performed 21 hours after the intravenous administration of approximately 5 mCi of IQNB. In normal subjects, the images of retained ligand showed a consistent regional pattern that correlated with postmortem studies of the relative distribution of muscarinic receptors in the normal human brain, having high radioactivity counts in the basal ganglia, occipital cortex, and insular cortex, low counts in the thalamus, and virtually no counts in the cerebellum. Eight of 12 patients with a clinical diagnosis of Alzheimer's disease had obvious focal cortical defects in either the frontal or posterior temporal cortex. Both patients with a clinical diagnosis of Pick's disease had obvious frontal and anterior temporal defects. A region-of-interest statistical analysis of relative regional activity revealed a significant bilateral reduction in the posterior temporal cortex of the patients with Alzheimer's disease compared with controls. This study demonstrates the practicability of acetylcholine receptor imaging with 123IQNB and single photon emission computed tomography. The data suggest that focal abnormalities in muscarinic binding in vivo may characterize some patients with Alzheimer's disease and Pick's disease, but further studies are needed to address questions about partial volume artifacts and receptor quantification.

  16. The influence of computer games on the development and degradation of society

    Directory of Open Access Journals (Sweden)

    Golikov A. M.

    2018-05-01

    Full Text Available The article examines the influence of gaming-industry products on both the individual and society. To ground the analysis, the origin of the phenomenon of computer games is outlined, and the growth in the number of players is then traced. After identifying the most popular game themes, the article advances the thesis that computer games are harmful for a number of reasons, and argues for the relevance of studying the harm arising from gaming-industry products. Having also reviewed the positive effects of games on the individual, it acknowledges that this influence has its good sides. In conclusion, the article raises the question of the true impact of gaming-industry products on society, as well as of the consequences of this influence.

  17. Evaluation of the setup margins for cone beam computed tomography–guided cranial radiosurgery: A phantom study

    Energy Technology Data Exchange (ETDEWEB)

    Calvo Ortega, Juan Francisco, E-mail: jfcdrr@yahoo.es [Department of Radiation Oncology, Hospital Quirón, Barcelona (Spain); Wunderink, Wouter [Department of Radiotherapy, Erasmus MC Cancer Institute, University Medical Center Rotterdam, Rotterdam (Netherlands); Delgado, David; Moragues, Sandra; Pozo, Miquel; Casals, Joan [Department of Radiation Oncology, Hospital Quirón, Barcelona (Spain)

    2016-10-01

    The aim of this study is to evaluate the setup margins from the clinical target volume (CTV) to the planning target volume (PTV) for cranial stereotactic radiosurgery (SRS) treatments guided by cone beam computed tomography (CBCT). We designed an end-to-end (E2E) test using a skull phantom with an embedded 6-mm tungsten ball (target). A noncoplanar plan was computed (E2E plan) to irradiate the target. CBCT-guided positioning of the skull phantom on the linac was performed. Megavoltage portal images were acquired after 15 independent deliveries of the E2E plan. The 2-dimensional (2D) displacement vector between the centers of the square field and the ball target on each portal image was used to quantify the isocenter accuracy. Geometrical margins in each patient direction (left-right, LR; anterior-posterior, AP; superior-inferior, SI) were calculated. Dosimetric validation of the margins was performed in 5 real SRS cases: 3-dimensional (3D) isocenter deviations were mimicked, and changes in CTV dose coverage and organs-at-risk (OARs) dosage were analyzed. CTV-PTV margins of 1.1 mm in the LR direction and 0.7 mm in the AP and SI directions were derived from the E2E tests. The dosimetric analysis revealed that a 1-mm uniform margin was sufficient to ensure CTV dose coverage without compromising the OAR dose tolerances. The effect of isocenter uncertainty has been estimated to be 1 mm in our CBCT-guided SRS approach.
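    The E2E analysis described above can be sketched numerically. In this hypothetical Python fragment, the per-delivery field-to-target offsets, the `mean + 2*SD` margin recipe, and the helper names are all illustrative assumptions; the paper's actual margin derivation is not reproduced in the abstract.

```python
import math
import statistics

# Sketch: reduce each portal image's 2D field-to-target displacement to a
# vector magnitude, then take mean + 2*SD of the observed offsets as a
# margin estimate. Recipe and numbers are assumptions for illustration.

def displacement_magnitudes(offsets_mm):
    """offsets_mm: list of (dx, dy) displacements in mm."""
    return [math.hypot(dx, dy) for dx, dy in offsets_mm]

def margin(values_mm):
    # Population SD, since all deliveries of the phantom test are used.
    return statistics.mean(values_mm) + 2 * statistics.pstdev(values_mm)

# Made-up offsets for 5 of the 15 deliveries (mm).
offsets = [(0.3, 0.1), (0.4, 0.2), (0.2, 0.3), (0.5, 0.1), (0.3, 0.2)]
mags = displacement_magnitudes(offsets)
print(round(margin(mags), 2))  # sub-millimetre margin for these sample data
```

In practice the study derived direction-specific (LR, AP, SI) margins rather than a single magnitude-based value, and clinical margin recipes also separate systematic from random error components.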

  18. Regional research exploitation of the LHC: a case study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for a research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yardstick. Other input parameters were: assumptions about the distribution of researchers at the institutions involved, an analysis model, and two different functional structures of the computing resources.

  19. Studying Computer Science in a Multidisciplinary Degree Programme: Freshman Students' Orientation, Knowledge, and Background

    Science.gov (United States)

    Kautz, Karlheinz; Kofoed, Uffe

    2004-01-01

    Teachers at universities are facing an increasing disparity in students' prior IT knowledge and, at the same time, experience a growing disengagement of the students with regard to involvement in study activities. As computer science teachers in a joint programme in computer science and business administration, we made a number of similar…

  20. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    Science.gov (United States)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  1. An esthetics rehabilitation with computer-aided design/computer-aided manufacturing technology.

    Science.gov (United States)

    Mazaro, Josá Vitor Quinelli; de Mello, Caroline Cantieri; Zavanelli, Adriana Cristina; Santiago, Joel Ferreira; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza

    2014-07-01

    This paper describes a case of rehabilitation involving implant-supported and tooth-supported prostheses fabricated with a Computer-Aided Design/Computer-Aided Manufacturing (CAD-CAM) system, using zirconia as the framework material. CAD-CAM technology has developed considerably over the last few years, becoming a reality in dental practice. Among the widely used systems are those based on zirconia, which demonstrate important physical and mechanical properties of high strength, adequate fracture toughness, biocompatibility and esthetics, and are indicated for unitary prosthetic restorations and posterior and anterior frameworks. All the modeling was performed using the CAD-CAM system, and the prostheses were cemented using the resin cement best suited to each situation. The rehabilitation of the maxillary arch using zirconia frameworks demonstrated satisfactory esthetic and functional results after a 12-month follow-up and revealed no biological or technical complications. This article shows the importance of using CAD/CAM technology in the manufacture of dental and implant-supported prostheses.

  2. Dissociated dislocations in Ni: a computational study

    International Nuclear Information System (INIS)

    Szelestey, P.; Patriarca, M.; Kaski, K.

    2005-01-01

    A systematic computational study of the behavior of a (1/2) dissociated screw dislocation in fcc nickel is presented, in which atomic interactions are described through an embedded-atom potential. A suitable external stress is applied to the system, both for modifying the equilibrium separation distance d and for moving the dislocation complex. The structure of the dislocation and its changes during motion are studied in the framework of the two-dimensional Peierls model for different values of the ratio d/a', where a' is the period of the Peierls potential. The distance between the edge and screw components of the partials, as well as their widths, undergoes a modulation with period a' as the dislocation moves, and the amplitudes of these oscillations are shown to depend on d/a'. The stress profile acting on the dislocation complex is analyzed, and the effective Peierls stress is estimated for different values of d/a'.

  3. Computer narratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    The relations and interactions between literature and computer games are examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game study can therefore help us understand basic concepts of artistic communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  4. Interactive Rhythm Learning System by Combining Tablet Computers and Robots

    Directory of Open Access Journals (Sweden)

    Chien-Hsing Chou

    2017-03-01

    Full Text Available This study proposes a percussion learning device that combines tablet computers and robots. This device comprises two systems: a rhythm teaching system, in which users can compose and practice rhythms by using a tablet computer, and a robot performance system. First, teachers compose the rhythm training contents on the tablet computer. Then, the learners practice these percussion exercises by using the tablet computer and a small drum set. The teaching system provides a new and user-friendly score editing interface for composing a rhythm exercise. It also provides a rhythm rating function to facilitate percussion training for children and improve the stability of rhythmic beating. To encourage children to practice percussion exercises, a robotic performance system is used to interact with the children; this system can perform percussion exercises for students to listen to and then help them practice the exercises. This interaction enhances children’s interest and motivation to learn and practice rhythm exercises. The results of an experimental course and field trials reveal that the proposed system not only increases students’ interest and efficiency in learning but also helps them understand musical rhythms through interaction and by composing simple rhythms.

  5. Computing camera heading: A study

    Science.gov (United States)

    Zhang, John Jiaxiang

    2000-08-01

    An accurate estimate of the motion of a camera is a crucial first step for the 3D reconstruction of sites, objects, and buildings from video. Solutions to the camera heading problem can be readily applied to many areas, such as robotic navigation, surgical operation, video special effects, multimedia, and lately even internet commerce. From image sequences of a real world scene, the problem is to calculate the directions of the camera translations. The presence of rotations makes this problem very hard, because rotations and translations can have similar effects on the images and are thus hard to tell apart. However, the visual angles between the projection rays of point pairs are unaffected by rotations, and their changes over time contain sufficient information to determine the direction of camera translation. We developed a new formulation of the visual angle disparity approach, first introduced by Tomasi, to the camera heading problem. Our new derivation makes theoretical analysis possible. Most notably, a theorem is obtained that locates all possible singularities of the residual function for the underlying optimization problem. This allows all computation trouble spots to be identified beforehand, and reliable, accurate computational optimization methods to be designed. A bootstrap-jackknife resampling method simultaneously reduces complexity and tolerates outliers well. Experiments with image sequences show accurate results when compared with the true camera motion as measured with mechanical devices.
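    The key invariance this approach rests on, that a camera rotation leaves the visual angle between two projection rays unchanged, is easy to verify numerically. The short Python check below uses arbitrary rays and an arbitrary rotation about the z-axis; it illustrates only the geometric fact, not the authors' algorithm.

```python
import math

# Numerical check: rotating both projection rays by the same rotation
# preserves the visual angle between them (rotations preserve dot products
# and norms). Rays and rotation angle are arbitrary illustrative values.

def angle(u, v):
    """Angle (rad) between 3D vectors u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.acos(dot / (nu * nv))

def rot_z(p, theta):
    """Rotate point/vector p about the z-axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    x, y, z = p
    return (c * x - s * y, s * x + c * y, z)

u, v = (1.0, 0.2, 3.0), (-0.5, 1.0, 2.5)
theta = 0.7  # arbitrary camera rotation (rad)
before = angle(u, v)
after = angle(rot_z(u, theta), rot_z(v, theta))
assert abs(before - after) < 1e-12  # rotation leaves the visual angle intact
```

A translation of the camera, by contrast, changes these angles, which is exactly why their evolution over time carries the heading information.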

  6. Learning by Computer Simulation Does Not Lead to Better Test Performance on Advanced Cardiac Life Support Than Textbook Study.

    Science.gov (United States)

    Kim, Jong Hoon; Kim, Won Oak; Min, Kyeong Tae; Yang, Jong Yoon; Nam, Yong Taek

    2002-01-01

    For the effective acquisition and practical application of rapidly increasing amounts of information, computer-based learning has already been introduced in medical education. However, there have been few studies that compare this innovative method to traditional learning methods in studying advanced cardiac life support (ACLS). Senior medical students were randomized to either computer simulation or textbook study. Each group studied ACLS for 150 minutes. Tests were done one week before, immediately after, and one week after the study period. Testing consisted of 20 questions, all formulated in such a way that there was a single best answer. Each student also completed a questionnaire designed to assess computer skills as well as satisfaction with and benefit from the study materials. Test scores improved after both textbook study and computer simulation study, but the improvement in scores was significantly higher for the textbook group only immediately after the study. There was no significant difference between groups in their computer skill and satisfaction with the study materials. The textbook group reported greater benefit from the study materials than did the computer simulation group. Studying ACLS with a hard-copy textbook may be more effective than computer simulation for the acquisition of simple information during a brief period. However, the difference in effectiveness is likely transient.

  7. USING RESEARCH METHODS IN HUMAN COMPUTER INTERACTION TO DESIGN TECHNOLOGY FOR RESILIENCE

    OpenAIRE

    Lopes, Arminda Guerra

    2016-01-01

    ABSTRACT Research in human computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, ...

  8. The Cognitive Predictors of Computational Skill with Whole versus Rational Numbers: An Exploratory Study

    Science.gov (United States)

    Seethaler, Pamela M.; Fuchs, Lynn S.; Star, Jon R.; Bryant, Joan

    2011-01-01

    The purpose of the present study was to explore the 3rd-grade cognitive predictors of 5th-grade computational skill with rational numbers and how those are similar to and different from the cognitive predictors of whole-number computational skill. Students (n=688) were assessed on incoming whole-number calculation skill, language, nonverbal…

  9. Azadirachtin(A) distinctively modulates subdomain 2 of actin - novel mechanism to induce depolymerization revealed by molecular dynamics study.

    Science.gov (United States)

    Pravin Kumar, R; Roopa, L; Sudheer Mohammed, M M; Kulkarni, Naveen

    2016-12-01

    Azadirachtin(A) (AZA), a potential insecticide from neem, binds to actin and induces depolymerization in Drosophila. AZA binds to the same pocket as Latrunculin A (LAT), but LAT inhibits actin polymerization by stiffening the actin structure and affecting the ADP-ATP exchange. The mechanism by which AZA induces actin depolymerization is not clearly understood. Therefore, different computational experiments were conducted to delineate the precise mechanism of AZA-induced actin depolymerization. Molecular dynamics studies showed that AZA strongly interacted with subdomain 2 and destabilized the interactions between subdomain 2 of one actin and subdomains 1 and 4 of the adjacent actin, causing the separation of actin subunits. The separation was observed between subdomain 3 of subunit n and subdomain 4 of subunit n + 2. However, the specific triggering point for the separation of the subunits was the destabilization of direct interactions between subdomain 2 of subunit n (Arg39, Val45, Gly46 and Arg62) and subdomain 4 of subunit n + 2 (Asp286, Ile287, Asp288, Ile289, Asp244 and Lys291). These results reveal a unique mechanism of an actin filament modulator that induces depolymerization. This mechanism of AZA can be used to design similar molecules against mammalian actins for cancer therapy.

  10. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  11. Computational Investigation of Amine–Oxygen Exciplex Formation

    Science.gov (United States)

    Haupert, Levi M.; Simpson, Garth J.; Slipchenko, Lyudmila V.

    2012-01-01

    It has been suggested that fluorescence from amine-containing dendrimer compounds could be the result of a charge transfer between amine groups and molecular oxygen [Chu, C.-C.; Imae, T. Macromol. Rapid Commun. 2009, 30, 89.]. In this paper we employ equation-of-motion coupled cluster computational methods to study the electronic structure of an ammonia–oxygen model complex to examine this possibility. The results reveal several bound electronic states with charge transfer character with emission energies generally consistent with previous observations. However, further work involving confinement, solvent, and amine structure effects will be necessary for more rigorous examination of the charge transfer fluorescence hypothesis. PMID:21812447

  12. Stroke patients' utilisation of extrinsic feedback from computer-based technology in the home: a multiple case study realistic evaluation.

    Science.gov (United States)

    Parker, Jack; Mawson, Susan; Mountain, Gail; Nasr, Nasrin; Zheng, Huiru

    2014-06-05

    Evidence indicates that post-stroke rehabilitation improves function, independence and quality of life. A key aspect of rehabilitation is the provision of appropriate information and feedback to the learner. Advances in information and communications technology (ICT) have allowed for the development of various systems to complement stroke rehabilitation that could be used in the home setting. These systems may increase the provision of rehabilitation a stroke survivor receives and carries out, as well as providing a learning platform that facilitates long-term self-managed rehabilitation and behaviour change. This paper describes the application of an innovative evaluative methodology to explore the utilisation of feedback for post-stroke upper-limb rehabilitation in the home. Using the principles of realistic evaluation, this study aimed to test and refine intervention theories by exploring the complex interactions of contexts, mechanisms and outcomes that arise from technology deployment in the home. Methods included focus groups followed by multi-method case studies (n = 5) before, during and after the use of computer-based equipment. Data were analysed in relation to the context-mechanism-outcome hypotheses case by case. This was followed by a synthesis of the findings to answer the question, 'what works for whom and in what circumstances and respects?' Data analysis reveals that to achieve desired outcomes through the use of ICT, key elements of computer feedback, such as accuracy, measurability, rewarding feedback, adaptability, and knowledge of results feedback, are required to trigger the theory-driven mechanisms underpinning the intervention. In addition, the pre-existing context and the personal and environmental contexts, such as previous experience of service delivery, personal goals, trust in the technology, and social circumstances may also enable or constrain the underpinning theory-driven mechanisms.
Findings suggest that the theory-driven mechanisms

  13. Instrumentation, computer software and experimental techniques used in low-frequency internal friction studies at WNRE

    International Nuclear Information System (INIS)

    Sprugmann, K.W.; Ritchie, I.G.

    1980-04-01

    A detailed and comprehensive account of the equipment, computer programs and experimental methods developed at the Whiteshell Nuclear Research Establishment for the study of low-frequency internal friction is presented. Part I describes the mechanical apparatus, electronic instrumentation and computer software, while Part II describes in detail the laboratory techniques and the various types of experiments performed, together with data reduction and analysis. Experimental procedures for the study of internal friction as a function of temperature, strain amplitude or time are described. Computer control of these experiments using the free-decay technique is outlined. In addition, a pendulum constant-amplitude drive system is described. (auth)

  14. Computed tomography of intussusception in adult

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Ahn, Byeong Yeob; Cha, Soon Joo; Seol, Hae Young; Chung, Kyoo Byung; Suh, Won Hyuck

    1984-01-01

    Intussusception is rare in adults and is usually caused by organic lesions, although there is a significant number of so-called idiopathic cases. The diagnosis of intussusception has traditionally been made by plain abdominal radiography, barium enema and small bowel series, but recently ultrasound and computed tomography have also contributed to the diagnosis. Computed tomography is not the primary means of evaluating a gastrointestinal tract abnormality, but it provides valuable information in evaluating disorders affecting the hollow viscera of the alimentary tract. The computed tomography image of intussusception demonstrates a whirl-like pattern of bowel loops separated by a fatty stripe corresponding to the intestinal walls. Abdominal ultrasonography was used as the initial diagnostic test in 2 of the 4 cases, which presented with an abdominal mass of unknown cause; it revealed a typical pattern composed of a round or oval mass with central dense echoes and peripheral poor echoes. We report 4 cases of intussusception in adults examined by computed tomography and/or ultrasound. All cases were correlated with barium enema examination and/or surgical reports.

  15. Preservice Teacher Sense-Making as They Learn to Teach Reading as Seen through Computer-Mediated Discourse

    Science.gov (United States)

    Stefanski, Angela J.; Leitze, Amy; Fife-Demski, Veronica M.

    2018-01-01

    This collective case study used methods of discourse analysis to consider what computer-mediated collaboration might reveal about preservice teachers' sense-making in a field-based practicum as they learn to teach reading to children identified as struggling readers. Researchers agree that field-based experiences coupled with time for reflection…

  16. Evidence-based ergonomics education: Promoting risk factor awareness among office computer workers.

    Science.gov (United States)

    Mani, Karthik; Provident, Ingrid; Eckel, Emily

    2016-01-01

    Work-related musculoskeletal disorders (WMSDs) related to computer work have become a serious public health concern. The literature reveals a positive association between computer use and WMSDs. The purpose of this evidence-based pilot project was to provide a series of evidence-based educational sessions on ergonomics to office computer workers to enhance their awareness of the risk factors for WMSDs. Seventeen office computer workers who work for the National Board for Certification in Occupational Therapy volunteered for this project. Each participant completed a baseline and post-intervention ergonomics questionnaire and attended six educational sessions. The Rapid Office Strain Assessment and an ergonomics questionnaire were used for data collection. The post-intervention data revealed that 89% of participants were able to identify a greater number of risk factors and answer more questions correctly in the knowledge tests of the ergonomics questionnaire. Pre- and post-intervention comparisons showed changes in the work posture and behaviors (taking rest breaks, participating in exercise, adjusting the workstation) of participants. The findings have implications for injury prevention in office settings and suggest that ergonomics education may yield positive knowledge and behavioral changes among computer workers.

  17. Nurses' computer literacy and attitudes towards the use of computers in health care.

    Science.gov (United States)

    Gürdaş Topkaya, Sati; Kaya, Nurten

    2015-05-01

    This descriptive and cross-sectional study was designed to address nurses' computer literacy and attitudes towards the use of computers in health care and to determine the correlation between these two variables. This study was conducted with the participation of 688 nurses who worked at two university-affiliated hospitals. These nurses were chosen using a stratified random sampling method. The data were collected using the Multicomponent Assessment of Computer Literacy and the Pretest for Attitudes Towards Computers in Healthcare Assessment Scale v. 2. The nurses, in general, had positive attitudes towards computers, and their computer literacy was good. Computer literacy in general had significant positive correlations with individual elements of computer competency and with attitudes towards computers. If the computer is to be an effective and beneficial part of the health-care system, it is necessary to help nurses improve their computer competency. © 2014 Wiley Publishing Asia Pty Ltd.

  18. Hafnium-Based Contrast Agents for X-ray Computed Tomography.

    Science.gov (United States)

    Berger, Markus; Bauser, Marcus; Frenzel, Thomas; Hilger, Christoph Stephan; Jost, Gregor; Lauria, Silvia; Morgenstern, Bernd; Neis, Christian; Pietsch, Hubertus; Sülzle, Detlev; Hegetschweiler, Kaspar

    2017-05-15

    Heavy-metal-based contrast agents (CAs) offer enhanced X-ray absorption for X-ray computed tomography (CT) compared to the currently used iodinated CAs. We report the discovery of new lanthanide and hafnium azainositol complexes and their optimization with respect to high water solubility and stability. Our efforts culminated in the synthesis of BAY-576, an uncharged hafnium complex with 3:2 stoichiometry and broken complex symmetry. The superior properties of this asymmetrically substituted hafnium CA were demonstrated by a CT angiography study in rabbits that revealed excellent signal contrast enhancement.

  19. Measuring the impact of different brands of computer systems on the clinical consultation: a pilot study

    Directory of Open Access Journals (Sweden)

    Charlotte Refsum

    2008-07-01

    Conclusion This methodological development improves the reliability of our method for measuring the impact of different computer systems on the GP consultation. UAR added more objectivity to the observation of doctor-computer interactions. If larger studies were to reproduce the differences between computer systems demonstrated in this pilot, it might be possible to make objective comparisons between systems.

  20. Speed test results and hardware/software study of computational speed problem, appendix D

    Science.gov (United States)

    1984-01-01

    The HP9845C is a desktop computer which was tested and evaluated for processing speed. A study was made to determine the availability and approximate cost of computers and/or hardware accessories necessary to meet the 20 ms sample period speed requirement. Additional requirements were that the control algorithm could be programmed in a high-level language and that the machine have sufficient storage to hold the data from a complete experiment.

  1. Computed tomography in children: multicenter cohort study design for the evaluation of cancer risk

    International Nuclear Information System (INIS)

    Krille, L.; Jahnen, A.; Mildenberger, P.; Schneider, K.; Weisser, G.; Zeeb, H.; Blettner, M.

    2011-01-01

    Exposure to ionizing radiation is a known risk factor for cancer, and cancer risk is highest after exposure in childhood. Computed tomography is the major contributor to average individual radiation exposure. Until now, the association has been addressed only through statistical modeling. We present the first feasible study design on childhood cancer risk after exposure to computed tomography.

  2. Study on Production Management in Programming of Computer Numerical Control Machines

    Directory of Open Access Journals (Sweden)

    Gheorghe Popovici

    2014-12-01

    Full Text Available The paper presents the results of a study regarding the need for technology in programming for machine tools with computer-aided command. Engineering is the science of making skilled things. That is why, in the "factory of the future", programming engineering will have to realise part processing on MU-CNCs (Computer Numerical Control Machines) in the optimum economic variant. There is no "recipe" when it comes to technologies. In order to select the correct variant from among several technical variants, 10 technological requirements are put forward for the engineer to take into account in MU-CNC programming. It is the first argued synthesis of the need for technological knowledge in MU-CNC programming.

  3. Gravity and magma induced spreading of Mount Etna volcano revealed by satellite radar interferometry

    Science.gov (United States)

    Lundgren, P.; Casu, F.; Manzo, M.; Pepe, A.; Berardino, P.; Sansosti, E.; Lanari, R.

    2004-01-01

    Mount Etna underwent a cycle of eruptive activity over the past ten years. Here we compute ground displacement maps and deformation time series from more than 400 radar interferograms to reveal Mount Etna's average and time varying surface deformation from 1992 to 2001.
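
The deformation-time-series step described here can be illustrated with a small-baseline (SBAS-style) least-squares inversion: each interferogram constrains the displacement difference between two acquisition dates, and stacking many interferograms gives an overdetermined linear system. The sketch below uses synthetic one-pixel data and is illustrative only, not the authors' processing chain:

```python
import numpy as np

# Toy SBAS-style inversion: each interferogram measures the displacement
# difference between two acquisition dates; many interferograms form a
# linear system A @ d = phi, solved per pixel by least squares.
rng = np.random.default_rng(0)
n_dates = 6
true_disp = np.cumsum(rng.normal(0.5, 0.2, n_dates))  # cumulative motion (mm)
true_disp -= true_disp[0]                             # reference to first date

# All date pairs as interferograms, plus small phase noise.
pairs = [(i, j) for i in range(n_dates) for j in range(i + 1, n_dates)]
A = np.zeros((len(pairs), n_dates))
for k, (i, j) in enumerate(pairs):
    A[k, i], A[k, j] = -1.0, 1.0
phi = A @ true_disp + rng.normal(0, 0.05, len(pairs))

# Fix the first date to zero (removes the rank deficiency), then invert.
d_hat, *_ = np.linalg.lstsq(A[:, 1:], phi, rcond=None)
recovered = np.concatenate([[0.0], d_hat])
print(np.max(np.abs(recovered - true_disp)))  # residual near the noise level
```

Real InSAR processing adds phase unwrapping, atmospheric corrections, and network selection, but the core time-series inversion is this least-squares step applied pixel by pixel.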

  4. Second-Language Composition Instruction, Computers and First-Language Pedagogy: A Descriptive Survey.

    Science.gov (United States)

    Harvey, T. Edward

    1987-01-01

    A national survey of full-time instructional faculty (N=208) at universities, 2-year colleges, and high schools regarding attitudes toward using computers in second-language composition instruction revealed that Apple and IBM-PC computers predominated, that lack of foreign character support was a major frustration, and mixed opinions about real…

  5. Computer science teacher professional development in the United States: a review of studies published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-10-01

    While there has been remarkable interest in making computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer science courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher professional development. In this study, the main goal was to systematically review the studies regarding computer science professional development to understand the scope, context, and effectiveness of these programs in the past decade (2004-2014). Based on 21 journal articles and conference proceedings, this study explored: (1) type of professional development organization and source of funding, (2) professional development structure and participants, (3) goal of professional development and type of evaluation used, (4) specific computer science concepts and training tools used, and (5) their effectiveness in improving teacher practice and student learning.

  6. Hispanic Women Overcoming Deterrents to Computer Science: A Phenomenological Study

    Science.gov (United States)

    Herling, Lourdes

    2011-01-01

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the…

  7. Biocatalysis of azidolysis of epoxides: Computational evidences on ...

    Indian Academy of Sciences (India)

    Active site model and crystal structure data reveal that the Tyr145 and Ser132 form weak hydrogen bonds with ... been computed using PCM model with water as solvent. (ε = 78.39). ... of p-nitro styrene oxide in HheC pocket (PDB ID: 1ZMT).

  8. Study of Material Flow of End-of-Life Computer Equipment (e-wastes ...

    African Journals Online (AJOL)

    In this study, a material flow model for the analysis of e-waste generation from computer equipment in Kaduna and Abuja in Nigeria has been developed and compared with that of Lagos which has been studied earlier. Data used to develop the models are the sales data from major distributors of electronics in the study ...

  9. User’s Emotions and Usability Study of a Brain-Computer Interface Applied to People with Cerebral Palsy

    Directory of Open Access Journals (Sweden)

    Alejandro Rafael García Ramírez

    2018-02-01

    Full Text Available People with motor and communication disorders face serious challenges in interacting with computers. To enhance this functionality, new human-computer interfaces are being studied. In this work, a brain-computer interface based on the Emotiv Epoc is used to analyze human-computer interactions in cases of cerebral palsy. The Phrase-Composer software was developed to interact with the brain-computer interface. A system usability evaluation was carried out with the participation of three specialists from the Fundação Catarinense de Educação Especial (FCEE) and four volunteers with cerebral palsy. Even though the System Usability Scale (SUS) score was acceptable, several challenges remain. Raw electroencephalography (EEG) data were also analyzed in order to assess the users' emotions during their interaction with the communication device. This study provides new evidence about human-computer interaction for individuals with cerebral palsy.

  10. Conditions for Ubiquitous Computing: What Can Be Learned from a Longitudinal Study

    Science.gov (United States)

    Lei, Jing

    2010-01-01

    Based on survey data and interview data collected over four academic years, this longitudinal study examined how a ubiquitous computing project evolved along with the changes in teachers, students, the human infrastructure, and technology infrastructure in the school. This study also investigated what conditions were necessary for successful…

  11. The soft computing-based approach to investigate allergic diseases: a systematic review.

    Science.gov (United States)

    Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano

    2017-01-01

    Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques such as artificial neural networks, support vector machines, Bayesian networks and fuzzy logic to investigate their performance in the field of allergic diseases. The review was conducted following PRISMA guidelines and the protocol was registered within the PROSPERO database (CRD42016038894). The research was performed on PubMed and ScienceDirect, covering the period from September 1, 1990 through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performance. We observed promising results, with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in cases of lack of data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.
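
Of the techniques the review surveys, fuzzy logic is the easiest to sketch compactly. The toy example below scores a hypothetical allergy risk from two invented inputs using triangular membership functions and a two-rule max-min inference; all variable names, thresholds, and rules are illustrative, not taken from the reviewed studies:

```python
# Minimal fuzzy-logic sketch (pure Python): map a hypothetical symptom
# score (0-10) and IgE level to an allergy-risk grade via triangular
# membership functions and a max-min rule base.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def risk(symptom, ige):
    low_s, high_s = tri(symptom, -1, 0, 5), tri(symptom, 4, 10, 11)
    low_i, high_i = tri(ige, -1, 0, 150), tri(ige, 100, 300, 301)
    # Rules: high symptom AND high IgE -> high risk; low AND low -> low risk.
    high_rule = min(high_s, high_i)
    low_rule = min(low_s, low_i)
    # Defuzzify as the weighted share of the "high" rule (0 = low, 1 = high).
    total = high_rule + low_rule
    return 0.5 if total == 0 else high_rule / total

print(risk(8, 250))  # closer to 1 (high risk)
print(risk(1, 30))   # closer to 0 (low risk)
```

Real systems in the reviewed literature use many more inputs and rules, but the inference pattern (fuzzify, fire rules, defuzzify) is the same.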

  12. A Descriptive Study towards Green Computing Practice Application for Data Centers in IT Based Industries

    Directory of Open Access Journals (Sweden)

    Anthony Jnr. Bokolo

    2018-01-01

    Full Text Available The progressive upsurge in demand for processing and computing power has led to a corresponding upsurge in data center carbon emissions, cost, unethical waste management, depletion of natural resources, and high energy utilization. This raises the issue of sustainability in the data centers of Information Technology (IT) based industries. Green computing practice can be applied to support sustainability, as IT-based industries use data centers to provide services to staff, practitioners, and end users. It is well known, however, that enterprise servers consume large quantities of energy and incur further expenditure on cooling, and it is difficult to meet the demands for accuracy and efficiency in data centers while also encouraging greener practice and cost reduction. This study therefore focuses on the application of Green computing practices in the data centers that house servers, and presents Green computing life cycle strategies and best practices for better data center management in IT-based industries. Data were collected through a questionnaire from 133 respondents in industries that operate in-house data centers. The analysed data were used to verify the Green computing life cycle strategies presented in this study. The findings show that each of the life cycle strategies is significant in helping IT-based industries apply Green computing practices in their data centers. This study should be of interest to knowledge and data management practitioners, environmental managers, and academics deploying Green data centers in their organizations.

  13. [Severe pulmonary embolism revealed by status epilepticus].

    Science.gov (United States)

    Allou, N; Coolen-Allou, N; Delmas, B; Cordier, C; Allyn, J

    2016-12-01

    High-risk pulmonary embolism (PE) is associated with a high mortality rate (>50%). In some cases, the diagnosis of PE remains a challenge, with atypical presentations such as in this case report of a PE revealed by status epilepticus. We report the case of a 40-year-old man without prior disease, hospitalized in the ICU for status epilepticus. All paraclinical examinations at admission showed no significant abnormalities (laboratory tests, cardiologic and neurological investigations). On day 1, he presented a sudden circulatory collapse, and echocardiography showed a right intra-auricular thrombus. He was treated with thrombolysis and arteriovenous extracorporeal membrane oxygenation. After stabilization, computed tomography showed severe bilateral PE. He developed multi-organ failure and died 4 days after admission. Pulmonary embolism revealed by status epilepticus has rarely been reported and is associated with poor prognosis. Physicians should be aware of, and consider, the possibility of PE in patients with status epilepticus who have no history or risk factors of seizure and normal neurological investigations. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  14. Influence of using a single facial vein as outflow in full-face transplantation: A three-dimensional computed tomographic study.

    Science.gov (United States)

    Rodriguez-Lorenzo, Andres; Audolfsson, Thorir; Wong, Corrine; Cheng, Angela; Arbique, Gary; Nowinski, Daniel; Rozen, Shai

    2015-10-01

    The aim of this study was to evaluate the contribution of a single unilateral facial vein to the venous outflow of a total-face allograft using three-dimensional computed tomographic imaging techniques, to further elucidate the mechanisms of venous complications following total-face transplant. Full-face soft-tissue flaps were harvested from fresh adult human cadavers. A single facial vein was identified and injected distally to the submandibular gland with a radiopaque contrast (barium sulfate/gelatin mixture) in every specimen. Following vascular injections, three-dimensional computed tomographic venographies of the faces were performed. Images were viewed using TeraRecon software (TeraRecon, Inc., San Mateo, CA, USA), allowing analysis of the venous anatomy and perfusion in different facial subunits by observing radiopaque filling venous patterns. Three-dimensional computed tomographic venographies demonstrated a venous network with different degrees of perfusion in subunits of the face in relation to the facial vein injection side: 100% of ipsilateral and contralateral forehead units, 100% of ipsilateral and 75% of contralateral periorbital units, 100% of ipsilateral and 25% of contralateral cheek units, 100% of ipsilateral and 75% of contralateral nose units, 100% of ipsilateral and 75% of contralateral upper lip units, 100% of ipsilateral and 25% of contralateral lower lip units, and 50% of ipsilateral and 25% of contralateral chin units. Venographies of the full-face grafts revealed better perfusion in the ipsilateral hemifaces from the facial vein in comparison with the contralateral hemifaces. Reduced perfusion was observed mostly in the contralateral cheek unit and contralateral lower face, including the lower lip and chin units. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  15. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

    Background: Previous studies have shown that there is a complex relationship between students' computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex relations by identifying different patterns of students' school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from …, raising important questions about differences in contexts. Keywords: ICILS, Computer use, Latent class analysis (LCA), Computer and information literacy.

  16. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some…

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  18. X-Ray Computed Tomography Reveals the Response of Root System Architecture to Soil Texture

    Science.gov (United States)

    Rogers, Eric D.; Monaenkova, Daria; Mijar, Medhavinee; Goldman, Daniel I.

    2016-01-01

    Root system architecture (RSA) impacts plant fitness and crop yield by facilitating efficient nutrient and water uptake from the soil. A better understanding of the effects of soil on RSA could improve crop productivity by matching roots to their soil environment. We used x-ray computed tomography to perform a detailed three-dimensional quantification of changes in rice (Oryza sativa) RSA in response to the physical properties of a granular substrate. We characterized the RSA of eight rice cultivars in five different growth substrates and determined that RSA is the result of interactions between genotype and growth environment. We identified cultivar-specific changes in RSA in response to changing growth substrate texture. The cultivar Azucena exhibited low RSA plasticity in all growth substrates, whereas cultivar Bala root depth was a function of soil hardness. Our imaging techniques provide a framework to study RSA in different growth environments, the results of which can be used to improve root traits with agronomic potential. PMID:27208237

  19. Reproducing a Prospective Clinical Study as a Computational Retrospective Study in MIMIC-II.

    Science.gov (United States)

    Kury, Fabrício S P; Huser, Vojtech; Cimino, James J

    2015-01-01

    In this paper we sought to reproduce, as a computational retrospective study in an EHR database (MIMIC-II), a recent large prospective clinical study: the 2013 publication by the Japanese Association for Acute Medicine (JAAM) on disseminated intravascular coagulation in the journal Critical Care (PMID: 23787004). We designed in SQL and Java a set of electronic phenotypes that reproduced the study's data sampling, and used R to perform the same statistical inference procedures. All produced source code is available online at https://github.com/fabkury/paamia2015. Our program identified 2,257 eligible patients in MIMIC-II, and the results remarkably agreed with the prospective study. A minority of the needed data elements were not found in MIMIC-II, and statistically significant inferences were possible in the majority of cases.
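
The core idea of an "electronic phenotype" is a reproducible query that selects the study cohort from the database. The sketch below illustrates the pattern with Python's built-in sqlite3 and an invented table; the actual study used SQL and Java against the real MIMIC-II schema, and the table, columns, and eligibility criteria here are hypothetical:

```python
import sqlite3

# Hypothetical EHR-style table and cohort query, illustrating the
# electronic-phenotype pattern (not the MIMIC-II schema).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE admissions (
    patient_id INTEGER, age INTEGER, icu_stay_hours REAL, platelet_min INTEGER)""")
con.executemany(
    "INSERT INTO admissions VALUES (?, ?, ?, ?)",
    [(1, 54, 72.0, 90), (2, 17, 48.0, 110), (3, 63, 20.0, 80), (4, 41, 96.0, 200)],
)

# Eligibility: adults with at least 24 h in the ICU (criteria are illustrative).
eligible = [row[0] for row in con.execute(
    "SELECT patient_id FROM admissions WHERE age >= 18 AND icu_stay_hours >= 24"
)]
print(eligible)  # [1, 4]
```

Because the whole cohort definition lives in one query, it can be versioned and re-run exactly, which is what makes the retrospective reproduction auditable.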

  20. Using Robotics and Game Design to Enhance Children's Self-Efficacy, STEM Attitudes, and Computational Thinking Skills

    Science.gov (United States)

    Leonard, Jacqueline; Buss, Alan; Gamboa, Ruben; Mitchell, Monica; Fashola, Olatokunbo S.; Hubert, Tarcia; Almughyirah, Sultan

    2016-12-01

    This paper describes the findings of a pilot study that used robotics and game design to develop middle school students' computational thinking strategies. One hundred and twenty-four students engaged in LEGO® EV3 robotics and created games using Scalable Game Design software. The results of the study revealed students' pre-post self-efficacy scores on the construct of computer use declined significantly, while the constructs of videogaming and computer gaming remained unchanged. When these constructs were analyzed by type of learning environment, self-efficacy on videogaming increased significantly in the combined robotics/gaming environment compared with the gaming-only context. Student attitudes toward STEM, however, did not change significantly as a result of the study. Finally, children's computational thinking (CT) strategies varied by method of instruction as students who participated in holistic game development (i.e., Project First) had higher CT ratings. This study contributes to the STEM education literature on the use of robotics and game design to influence self-efficacy in technology and CT, while informing the research team about the adaptations needed to ensure project fidelity during the remaining years of the study.

  1. The Cognitive Predictors of Computational Skill with Whole versus Rational Numbers: An Exploratory Study.

    Science.gov (United States)

    Seethaler, Pamela M; Fuchs, Lynn S; Star, Jon R; Bryant, Joan

    2011-10-01

    The purpose of the present study was to explore the 3rd-grade cognitive predictors of 5th-grade computational skill with rational numbers and how those are similar to and different from the cognitive predictors of whole-number computational skill. Students (n = 688) were assessed on incoming whole-number calculation skill, language, nonverbal reasoning, concept formation, processing speed, and working memory in the fall of 3rd grade. Students were followed longitudinally and assessed on calculation skill with whole numbers and with rational numbers in the spring of 5th grade. The unique predictors of skill with whole-number computation were incoming whole-number calculation skill, nonverbal reasoning, concept formation, and working memory (numerical executive control). In addition to these cognitive abilities, language emerged as a unique predictor of rational-number computational skill.

  2. Computer Games and Their Impact on Creativity of Primary Level Students in Tehran

    Directory of Open Access Journals (Sweden)

    Tahereh Mokhtari

    2016-09-01

    Full Text Available Creativity is about being sensitive to dilemmas, losses, problems, and existing errors, and about making and examining propositions on such issues, which finally leads to innovative findings. Games seem to be important in this process, since they can improve individuals' creativity. This research therefore addresses the question of whether computer games affect the creativity of primary school students. Students of 3 main districts of the Tehran municipality were studied. Based on the ministry's available data, 51,740 students were studying in these three districts; 381 students were randomly selected as the research sample. Findings revealed that all computer games, i.e. puzzle, intellectual, and enigma games, affect the creativity of primary school students to different extents.

  3. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have been of long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  4. Plastic deformation of crystals: analytical and computer simulation studies of dislocation glide

    International Nuclear Information System (INIS)

    Altintas, S.

    1978-05-01

    The plastic deformation of crystals is usually accomplished through the motion of dislocations. The glide of a dislocation is impelled by the applied stress and opposed by microstructural defects such as point defects, voids, precipitates and other dislocations. The planar glide of a dislocation through randomly distributed obstacles is considered. The objective of the present research work is to calculate the critical resolved shear stress (CRSS) for athermal glide and the velocity of the dislocation at finite temperature as a function of the applied stress and the nature and strength of the obstacles. Dislocation glide through mixtures of obstacles has been studied analytically and by computer simulation. Arrays containing two kinds of obstacles as well as square distribution of obstacle strengths are considered. The critical resolved shear stress for an array containing obstacles with a given distribution of strengths is calculated using the sum of the quadratic mean of the stresses for the individual obstacles and is found to be in good agreement with the computer simulation data. Computer simulations of dislocation glide through randomly distributed obstacles containing up to 10⁶ obstacles show that the CRSS decreases as the size of the array increases and approaches a limiting value. Histograms of forces and of segment lengths are obtained and compared with theoretical predictions. Effects of array shape and boundary conditions on the dislocation glide are also studied. Analytical and computer simulation results are compared with experimental results obtained on precipitation-, irradiation-, forest-, and impurity cluster-hardening systems and are found to be in good agreement.
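
The mixing rule the abstract describes, combining the CRSS values of individual obstacle populations by a quadratic mean (root-sum-square), can be stated in a few lines. The numbers below are arbitrary illustrative stress values, not the study's data:

```python
import math

# Root-sum-square superposition: estimate the CRSS of an array containing
# a mixture of obstacle types from the CRSS each population would produce
# on its own (the quadratic-mean rule described in the abstract).
def crss_mixture(component_stresses):
    return math.sqrt(sum(tau * tau for tau in component_stresses))

# Two obstacle populations, in arbitrary stress units (illustrative values).
tau_weak, tau_strong = 3.0, 4.0
print(crss_mixture([tau_weak, tau_strong]))  # 5.0
```

Note that the combined CRSS (5.0) is less than the simple sum (7.0): strong and weak obstacles do not superpose linearly, which is why the quadratic rule fits the simulations better.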

  5. Plastic deformation of crystals: analytical and computer simulation studies of dislocation glide

    Energy Technology Data Exchange (ETDEWEB)

    Altintas, S.

    1978-05-01

    The plastic deformation of crystals is usually accomplished through the motion of dislocations. The glide of a dislocation is impelled by the applied stress and opposed by microstructural defects such as point defects, voids, precipitates and other dislocations. The planar glide of a dislocation through randomly distributed obstacles is considered. The objective of the present research work is to calculate the critical resolved shear stress (CRSS) for athermal glide and the velocity of the dislocation at finite temperature as a function of the applied stress and the nature and strength of the obstacles. Dislocation glide through mixtures of obstacles has been studied analytically and by computer simulation. Arrays containing two kinds of obstacles as well as square distribution of obstacle strengths are considered. The critical resolved shear stress for an array containing obstacles with a given distribution of strengths is calculated using the sum of the quadratic mean of the stresses for the individual obstacles and is found to be in good agreement with the computer simulation data. Computer simulations of dislocation glide through randomly distributed obstacles containing up to 10⁶ obstacles show that the CRSS decreases as the size of the array increases and approaches a limiting value. Histograms of forces and of segment lengths are obtained and compared with theoretical predictions. Effects of array shape and boundary conditions on the dislocation glide are also studied. Analytical and computer simulation results are compared with experimental results obtained on precipitation-, irradiation-, forest-, and impurity cluster-hardening systems and are found to be in good agreement.

  6. Computer use and stress, sleep disturbances, and symptoms of depression among young adults--a prospective cohort study.

    Science.gov (United States)

    Thomée, Sara; Härenstam, Annika; Hagberg, Mats

    2012-10-22

    We have previously studied prospective associations between computer use and mental health symptoms in a selected young adult population. The purpose of this study was to investigate if high computer use is a prospective risk factor for developing mental health symptoms in a population-based sample of young adults. The study group was a cohort of young adults (n = 4163), 20-24 years old, who responded to a questionnaire at baseline and 1-year follow-up. Exposure variables included time spent on computer use (CU) in general, email/chat use, computer gaming, CU without breaks, and CU at night causing lost sleep. Mental health outcomes included perceived stress, sleep disturbances, symptoms of depression, and reduced performance due to stress, depressed mood, or tiredness. Prevalence ratios (PRs) were calculated for prospective associations between exposure variables at baseline and mental health outcomes (new cases) at 1-year follow-up for the men and women separately. Both high and medium computer use, compared to low computer use at baseline, were associated with sleep disturbances in the men at follow-up. High email/chat use was negatively associated with perceived stress, but positively associated with reported sleep disturbances for the men. For the women, high email/chat use was (positively) associated with several mental health outcomes, while medium computer gaming was associated with symptoms of depression, and CU without breaks with most mental health outcomes. CU causing lost sleep was associated with mental health outcomes for both men and women. Time spent on general computer use was prospectively associated with sleep disturbances and reduced performance for the men. For the women, using the computer without breaks was a risk factor for several mental health outcomes. Some associations were enhanced in interaction with mobile phone use. Using the computer at night and consequently losing sleep was associated with most mental health outcomes for both men
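
The prevalence ratio (PR) reported in this cohort study is simply the prevalence of new symptoms among the exposed divided by the prevalence among the unexposed. A minimal sketch, with invented counts for illustration (not the study's data):

```python
# Prevalence ratio: prevalence of the outcome in the exposed group
# divided by prevalence in the unexposed group.
def prevalence_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    p_exposed = exposed_cases / exposed_total
    p_unexposed = unexposed_cases / unexposed_total
    return p_exposed / p_unexposed

# Illustrative: 30/200 high-computer-use participants vs 20/400 low-use
# participants developed sleep disturbances at follow-up.
pr = prevalence_ratio(30, 200, 20, 400)
print(round(pr, 2))  # 3.0 -> three times the prevalence in the exposed group
```

A PR above 1 indicates the exposed group develops the outcome more often; confidence intervals (not shown here) determine whether the excess is statistically significant.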

  7. Investigating the Status of Tablet Computers and E-Books Use of Open Education Faculty Students: A Case Study

    Science.gov (United States)

    Koçak, Ömer; Yildirim, Önder; Kursun, Engin; Yildirim, Gürkan

    2016-01-01

    The increase in tablet computer and e-book use raises the question of how users have benefited from these technologies. In this sense, the present study investigated the status of students' tablet computer and e-book use and the reasons why students prefer to use, or not to use, tablet computers and e-books. Students' study habits while…

  8. Toward accountable land use mapping: Using geocomputation to improve classification accuracy and reveal uncertainty

    NARCIS (Netherlands)

    Beekhuizen, J.; Clarke, K.C.

    2010-01-01

    The classification of satellite imagery into land use/cover maps is a major challenge in the field of remote sensing. This research aimed at improving the classification accuracy while also revealing uncertain areas by employing a geocomputational approach. We computed numerous land use maps by
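    The ensemble idea in this abstract, computing numerous land use maps and flagging where they disagree, can be sketched as follows. This is a hypothetical illustration, not the authors' method: the function name and the per-pixel majority-agreement measure are assumptions chosen for this example.

```python
import numpy as np

def classification_agreement(maps):
    """Given a stack of label maps (n_runs, H, W), return the fraction
    of runs agreeing with the per-pixel majority label.
    Low agreement marks uncertain areas of the classification."""
    maps = np.asarray(maps)
    n_runs, h, w = maps.shape
    agreement = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # Count votes for each class label at this pixel
            _, counts = np.unique(maps[:, i, j], return_counts=True)
            agreement[i, j] = counts.max() / n_runs
    return agreement

# Three hypothetical 2x2 classification runs
maps = [
    [[1, 2], [3, 1]],
    [[1, 2], [2, 1]],
    [[1, 1], [2, 1]],
]
agree = classification_agreement(maps)
print(agree[0, 0])  # 1.0 -- all runs assign class 1 to this pixel
```

    Pixels where the runs split (here, agreement 2/3) would be the "uncertain areas" the abstract refers to.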

  9. Computational chemistry reviews of current trends v.4

    CERN Document Server

    1999-01-01

    This volume presents a balanced blend of methodological and applied contributions. It supplements the first three volumes of the series well, revealing results of current research in computational chemistry. It also reviews the topographical features of several molecular scalar fields. A brief discussion of topographical concepts is followed by examples of their application to several branches of chemistry. The size of a basis set applied in a calculation determines the amount of computer resources necessary for a particular task. The details of a common strategy - the ab initio model potential

  10. A novel quantum solution to secure two-party distance computation

    Science.gov (United States)

    Peng, Zhen-wan; Shi, Run-hua; Wang, Pan-hong; Zhang, Shun

    2018-06-01

    Secure Two-Party Distance Computation is an important primitive of Secure Multiparty Computational Geometry: it involves two parties, each holding a private point, who want to jointly compute the distance between their points without revealing anything about their respective private inputs. Secure Two-Party Distance Computation has important potential applications in settings with high security requirements, such as privacy-preserving Determination of Spatial Location-Relation, Determination of Polygons Similarity, and so on. In this paper, we present a quantum protocol for Secure Two-Party Distance Computation using QKD-based Quantum Private Query. The security of the protocol rests on the physical principles of quantum mechanics rather than on computational difficulty assumptions, and it can therefore ensure higher security than the related classical protocols.
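    For orientation, the quantity the protocol protects is the ordinary Euclidean distance between the two parties' private points. The sketch below shows only that underlying, non-secure computation; in the actual quantum protocol neither party learns the other's point, and the variable names are illustrative assumptions.

```python
import math

def euclidean_distance(p, q):
    """Distance between two 2-D points p and q."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

alice_point = (0.0, 0.0)  # Alice's private input
bob_point = (3.0, 4.0)    # Bob's private input
print(euclidean_distance(alice_point, bob_point))  # 5.0
```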

  11. What does patient feedback reveal about the NHS? A mixed methods study of comments posted to the NHS Choices online service

    Science.gov (United States)

    Brookes, Gavin; Baker, Paul

    2017-01-01

    Objective To examine the key themes of positive and negative feedback in patients’ online feedback on NHS (National Health Service) services in England and to understand the specific issues within these themes and how they drive positive and negative evaluation. Design Computer-assisted quantitative and qualitative studies of 228 113 comments (28 971 142 words) of online feedback posted to the NHS Choices website. Comments containing the most frequent positive and negative evaluative words are qualitatively examined to determine the key drivers of positive and negative feedback. Participants Contributors posting comments about the NHS between March 2013 and September 2015. Results Overall, NHS services were evaluated positively approximately three times more often than negatively. The four key areas of focus were: treatment, communication, interpersonal skills and system/organisation. Treatment exhibited the highest proportion of positive evaluative comments (87%), followed by communication (77%), interpersonal skills (44%) and, finally, system/organisation (41%). Qualitative analysis revealed that reference to staff interpersonal skills featured prominently, even in comments relating to treatment and system/organisational issues. Positive feedback was elicited in cases of staff being caring, compassionate and knowing patients’ names, while rudeness, apathy and not listening were frequent drivers of negative feedback. Conclusions Although technical competence constitutes an undoubtedly fundamental aspect of healthcare provision, staff members were much more likely to be evaluated both positively and negatively according to their interpersonal skills. Therefore, the findings reported in this study highlight the salience of such ‘soft’ skills to patients and emphasise the need for these to be focused upon and developed in staff training programmes, as well as ensuring that decisions around NHS funding do not result in demotivated and rushed staff. The

  12. Cloud Computing (SaaS) Adoption as a Strategic Technology: Results of an Empirical Study

    Directory of Open Access Journals (Sweden)

    Pedro R. Palos-Sanchez

    2017-01-01

    Full Text Available The present study empirically analyzes the factors that determine the adoption of the cloud computing (SaaS) model in firms where this strategy is considered strategic for executing their activity. A research model has been developed to evaluate the factors that influence the intention to use cloud computing, combining the variables found in the technology acceptance model (TAM) with external variables such as top management support, training, communication, organization size, and technological complexity. Data compiled from 150 companies in Andalusia (Spain) are used to test the formulated hypotheses. The results of this study reflect which critical factors should be considered and how they are interrelated. They also show the organizational demands that must be considered by companies wishing to implement a real management model adapted to the digital economy, especially those related to cloud computing.

  13. Studies on the zeros of Bessel functions and methods for their computation

    Science.gov (United States)

    Kerimov, M. K.

    2014-09-01

    The zeros of Bessel functions play an important role in computational mathematics, mathematical physics, and other areas of natural sciences. Studies addressing these zeros (their properties, computational methods) can be found in various sources. This paper offers a detailed overview of the results concerning the real zeros of the Bessel functions of the first and second kinds and general cylinder functions. The author intends to publish several overviews on this subject. In this first publication, works dealing with real zeros are analyzed. Primary emphasis is placed on classical results, which are still important. Some of the most recent publications are also discussed.
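    As a minimal, self-contained illustration of computing a real zero of a Bessel function of the first kind (pure Python, no special-function library; the series truncation and bracketing interval are assumptions chosen for this example), the first positive zero of J_0 can be located by evaluating the power series and bisecting:

```python
import math

def bessel_j0(x, terms=40):
    """J_0(x) via its power series; accurate for moderate |x|."""
    total = 0.0
    for k in range(terms):
        total += (-1) ** k * (x / 2) ** (2 * k) / math.factorial(k) ** 2
    return total

def first_zero(lo=2.0, hi=3.0, tol=1e-12):
    """Bisect J_0 on [lo, hi], an interval where it changes sign."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bessel_j0(lo) * bessel_j0(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(round(first_zero(), 6))  # 2.404826
```

    Production code would instead use a dedicated routine such as SciPy's `scipy.special.jn_zeros`; the series-plus-bisection approach above is only reliable for small arguments.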

  14. IC3 Internet and Computing Core Certification Global Standard 4 study guide

    CERN Document Server

    Rusen, Ciprian Adrian

    2015-01-01

    Hands-on IC3 prep, with expert instruction and loads of tools IC3: Internet and Computing Core Certification Global Standard 4 Study Guide is the ideal all-in-one resource for those preparing to take the exam for the internationally-recognized IT computing fundamentals credential. Designed to help candidates pinpoint weak areas while there's still time to brush up, this book provides one hundred percent coverage of the exam objectives for all three modules of the IC3-GS4 exam. Readers will find clear, concise information, hands-on examples, and self-paced exercises that demonstrate how to per

  15. Indications for computed tomography (CT) diagnostics in proximal humeral fractures: a comparative study of plain radiography and computed tomography

    OpenAIRE

    Weise Kuno; Pereira Philippe L; Dietz Klaus; Eingartner Christoph; Schmal Hagen; Südkamp Norbert P; Rolauffs Bernd; Bahrs Christian; Lingenfelter Erich; Helwig Peter

    2009-01-01

    Abstract Background Precise indications for computed tomography (CT) in proximal humeral fractures are not established. The purpose of this study was a comparison of conventional radiographic views with different CT reconstructions with 2D and 3D imaging to establish indications for additional CT diagnostics depending on the fractured parts. Methods In a prospective diagnostic study in two level 1 trauma centers, 44 patients with proximal humeral fractures were diagnosed with conventional X...

  16. Improving Communicative Competence through Synchronous Communication in Computer-Supported Collaborative Learning Environments: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Xi Huang

    2018-01-01

    Full Text Available Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC: real-time communication between human beings via computers, in forms of text, audio and video such as live chat and chatrooms) as socially-oriented meaning construction. This review begins by considering the adoption of social interactionist views to identify key paradigms and supportive principles of computer-supported collaborative learning. A special focus on two components of communicative competence is then presented to explore interactional variables in synchronous computer-mediated communication, along with a review of research. There follows a discussion of a synthesis of interactional variables in negotiated interaction and co-construction of knowledge from psycholinguistic and social cohesion perspectives. This review reveals both the possibilities and the disparities of language socialization in promoting intersubjective learning and diversifying the salient use of interactively creative language in computer-supported collaborative learning environments, in the service of communicative competence.

  17. Computational study on effects of rib height and thickness on heat ...

    Indian Academy of Sciences (India)

    A computational study was carried out of heat transfer augmentation in a three-dimensional square channel fitted with different types of ribs. The standard k–ε model and its two variants (RNG and realizable) were used for turbulence modeling. The predictions were compared with available experimental ...

  18. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and performance characteristics of such a hypothetical system can be studied as a model, with predicted input, output, system, and environmental characteristics, using the identified objectives of computing. Such a model could be used on any platform, with any type of computing system, and for application automation, without modifications in the form of structure, hardware, and software coding by an exte...

  19. The Victorian State Computer Education Committee’s Seeding Pair In-Service Program: Two Case Studies

    OpenAIRE

    Keane, William

    2014-01-01

    Following the introduction of microcomputers into schools in the late 1970s, National Policy was developed which focused on the use of computers in non-computing subjects. The Victorian strategy for the implementation of the National Computers in Education Program was the development of a week-long in-service course which aimed to develop seeding pairs of teachers who would act as change agents when they returned to school. This chapter looks back at the case studies o...

  20. High Performance Computing and Storage Requirements for Biological and Environmental Research Target 2017

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Wasserman, Harvey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)

    2013-05-01

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,500 users working on some 650 projects that involve nearly 600 codes in a wide variety of scientific disciplines. In addition to large-scale computing and storage resources, NERSC provides support and expertise that help scientists make efficient use of its systems. Beyond achieving its goal of characterizing BER computing and storage needs, the latest review also revealed several key requirements.