WorldWideScience

Sample records for subject terms computer

  1. Perceptual Computing Aiding People in Making Subjective Judgments

    CERN Document Server

    Mendel, Jerry

    2010-01-01

    Explains for the first time how "computing with words" can aid in making subjective judgments. Lotfi Zadeh, the father of fuzzy logic, coined the phrase "computing with words" (CWW) to describe a methodology in which the objects of computation are words and propositions drawn from a natural language. Perceptual Computing explains how to implement CWW to aid in the important area of making subjective judgments, using a methodology that leads to an interactive device—a "Perceptual Computer"—that propagates random and linguistic uncertainties into the subjective judgment.

  2. New Computer Terms in Bloggers’ Language

    Directory of Open Access Journals (Sweden)

    Vilija Celiešienė

    2012-06-01

    Full Text Available The article presents an analysis of new words in computer terminology that make their way into blogs and analyzes how the official neologisms and computer terms, especially the equivalents of barbarisms, are employed in everyday use. The article also discusses the ways of including the new computer terms in texts. Blogs on information technology topics are the objects of the research. The analysis of these blogs allowed highlighting certain trends in the use of new computer terms. It was observed that the authors of the blogs could freely choose their writing style and were not bound by the standards of literary language; thus, their language was full of non-standard vocabulary. However, a degree of self-control regarding the language used could still be noticed. An interest in novelties of computer terminology and a tendency to accept some of the suggested new Lithuanian and loaned computer terms were noticed. When using the new words, the bloggers frequently employed specific graphical elements and/or comments. The graphical elements were often chosen by bloggers to express doubt regarding the suitability of a suggested loanword. When attempting to explain the meaning of a new word to the readers, the bloggers tended to post comments about the new computer terms.

  3. Computer task performance by subjects with Duchenne muscular dystrophy.

    Science.gov (United States)

    Malheiros, Silvia Regina Pinheiro; da Silva, Talita Dias; Favero, Francis Meire; de Abreu, Luiz Carlos; Fregni, Felipe; Ribeiro, Denise Cardoso; de Mello Monteiro, Carlos Bandeira

    2016-01-01

    Two specific objectives were established to quantify computer task performance among people with Duchenne muscular dystrophy (DMD). First, we compared simple computational task performance between subjects with DMD and age-matched typically developing (TD) subjects. Second, we examined correlations between the ability of subjects with DMD to learn the computational task and their motor functionality, age, and initial task performance. The study included 84 individuals (42 with DMD, mean age of 18±5.5 years, and 42 age-matched controls). They executed a computer maze task; all participants performed the acquisition (20 attempts) and retention (five attempts) phases, repeating the same maze. A different maze was used to verify transfer performance (five attempts). The Motor Function Measure Scale was applied, and the results were compared with maze task performance. In the acquisition phase, a significant decrease was found in movement time (MT) between the first and last acquisition block, but only for the DMD group. For the DMD group, MT during transfer was shorter than during the first acquisition block, indicating improvement from the first acquisition block to transfer. In addition, the TD group showed shorter MT than the DMD group across the study. DMD participants improved their performance after practicing a computational task; however, the difference in MT was present in all attempts among DMD and control subjects. Computational task improvement was positively influenced by the initial performance of individuals with DMD. In turn, the initial performance was influenced by their distal functionality but not their age or overall functionality.

  4. Sequenced subjective accents for brain-computer interfaces

    Science.gov (United States)

    Vlek, R. J.; Schaefer, R. S.; Gielen, C. C. A. M.; Farquhar, J. D. R.; Desain, P.

    2011-06-01

    Subjective accenting is a cognitive process in which identical auditory pulses at an isochronous rate turn into the percept of an accenting pattern. This process can be voluntarily controlled, making it a candidate for communication from human user to machine in a brain-computer interface (BCI) system. In this study we investigated whether subjective accenting is a feasible paradigm for BCI and how its time-structured nature can be exploited for optimal decoding from non-invasive EEG data. Ten subjects perceived and imagined different metric patterns (two-, three- and four-beat) superimposed on a steady metronome. With an offline classification paradigm, we classified imagined accented from non-accented beats on a single trial (0.5 s) level with an average accuracy of 60.4% over all subjects. We show that decoding of imagined accents is also possible with a classifier trained on perception data. Cyclic patterns of accents and non-accents were successfully decoded with a sequence classification algorithm. Classification performances were compared by means of bit rate. Performance in the best scenario translates into an average bit rate of 4.4 bits min-1 over subjects, which makes subjective accenting a promising paradigm for an online auditory BCI.
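
    The bit-rate comparison mentioned above is commonly made with the Wolpaw information-transfer-rate formula; the abstract does not say which definition the authors used, so the following Python sketch only illustrates that standard calculation, and the numbers plugged in at the end are placeholders rather than the study's data.

    import math

    def wolpaw_bitrate(accuracy, n_classes, trials_per_min):
        """Bits per minute according to the standard Wolpaw formula.

        accuracy       -- single-trial classification accuracy P (0..1)
        n_classes      -- number of possible selections N
        trials_per_min -- classified trials produced per minute
        """
        p, n = accuracy, n_classes
        if p >= 1.0:
            bits_per_trial = math.log2(n)
        elif p <= 0.0:
            bits_per_trial = 0.0
        else:
            bits_per_trial = (math.log2(n)
                              + p * math.log2(p)
                              + (1 - p) * math.log2((1 - p) / (n - 1)))
        return bits_per_trial * trials_per_min

    # Placeholder values: 60.4% binary accuracy on 0.5 s trials (120 trials/min)
    print(wolpaw_bitrate(0.604, 2, 120.0))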

  5. A computational environment for long-term multi-feature and multi-algorithm seizure prediction.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Costa, R P; Valderrama, M; Feldwisch-Drentrup, H; Nikolopoulos, S; Le Van Quyen, M; Schelter, B; Dourado, A

    2010-01-01

    The daily life of epilepsy patients is constrained by the possibility of occurrence of seizures. Until now, seizures cannot be predicted with sufficient sensitivity and specificity. Most seizure prediction studies have focused on a small number of patients and frequently assume unrealistic hypotheses. This paper adopts the view that for an appropriate development of reliable predictors one should consider long-term recordings and several features and algorithms integrated in one software tool. A computational environment, based on Matlab®, is presented, aiming to be an innovative tool for seizure prediction. It results from the need for a powerful and flexible tool for long-term EEG/ECG analysis by multiple features and algorithms. After being extracted, features can be subjected to several reduction and selection methods, and then used for prediction. The predictions can be conducted based on optimized thresholds or by applying computational intelligence methods. One important aspect is the integrated evaluation of the seizure prediction characteristic of the developed predictors.

  6. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  7. The Observation of Bahasa Indonesia Official Computer Terms Implementation in Scientific Publication

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Lydia, M. S.; Muthaqin, M. I.

    2018-03-01

    The government of the Republic of Indonesia issued a regulation to substitute the foreign-language computer terms used earlier with official computer terms in Bahasa Indonesia. This regulation was stipulated in Presidential Decree No. 2 of 2001 concerning the introduction of official computer terms in Bahasa Indonesia (known as Senarai Padanan Istilah/SPI). After sixteen years, the people of Indonesia, particularly academics, should have implemented the official computer terms in their official publications. This observation was conducted to discover the implementation of official computer term usage in scientific publications written in Bahasa Indonesia. The data sources used in this observation are publications by academics, particularly in the computer science field. The method used in the observation is divided into four stages. The first stage is metadata harvesting by using the Open Archive Initiative - Protocol for Metadata Harvesting (OAI-PMH). Second, converting the harvested documents (in pdf format) to plain text. The third stage is text preprocessing as the preparation for string matching. The final stage is searching for the official computer terms, based on 629 SPI terms, by using the Boyer-Moore algorithm. We observed that there are 240,781 foreign computer terms in 1,156 scientific publications from six universities. This result shows that foreign computer terms are still widely used by academics.
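
    The final matching stage lends itself to a small illustration. The sketch below is not the authors' code: it uses the Boyer-Moore-Horspool simplification of the Boyer-Moore algorithm named above, and the term list and sample sentence are invented stand-ins for the 629 SPI entries and the harvested texts.

    def horspool_find(text, pattern):
        """Index of the first occurrence of pattern in text (-1 if absent),
        using the Boyer-Moore-Horspool simplification of Boyer-Moore."""
        m, n = len(pattern), len(text)
        if m == 0:
            return 0
        if m > n:
            return -1
        # bad-character table: how far to slide the pattern on a mismatch
        shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
        i = m - 1                      # index of the rightmost aligned character
        while i < n:
            k = 0
            while k < m and text[i - k] == pattern[m - 1 - k]:
                k += 1
            if k == m:
                return i - m + 1
            i += shift.get(text[i], m)
        return -1

    # Hypothetical mini term list and sentence, only for illustration
    foreign_terms = ["download", "upload", "mouse"]
    sentence = "pengguna dapat download berkas lalu upload kembali".lower()
    print({t: horspool_find(sentence, t) >= 0 for t in foreign_terms})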

  8. Evaluation of a subject-specific, torque-driven computer simulation model of one-handed tennis backhand groundstrokes.

    Science.gov (United States)

    Kentel, Behzat B; King, Mark A; Mitchell, Sean R

    2011-11-01

    A torque-driven, subject-specific 3-D computer simulation model of the impact phase of one-handed tennis backhand strokes was evaluated by comparing performance and simulation results. Backhand strokes of an elite subject were recorded on an artificial tennis court. Over the 50-ms period after impact, good agreement was found with an overall RMS difference of 3.3° between matching simulation and performance in terms of joint and racket angles. Consistent with previous experimental research, the evaluation process showed that grip tightness and ball impact location are important factors that affect postimpact racket and arm kinematics. Associated with these factors, the model can be used for a better understanding of the eccentric contraction of the wrist extensors during one-handed backhand ground strokes, a hypothesized mechanism of tennis elbow.

  9. Exploration of Web Users' Search Interests through Automatic Subject Categorization of Query Terms.

    Science.gov (United States)

    Pu, Hsiao-tieh; Yang, Chyan; Chuang, Shui-Lung

    2001-01-01

    Proposes a mechanism that carefully integrates human and machine efforts to explore Web users' search interests. The approach consists of a four-step process: extraction of core terms; construction of subject taxonomy; automatic subject categorization of query terms; and observation of users' search interests. Research findings are proved valuable…

  10. Asymptotic optimality and efficient computation of the leave-subject-out cross-validation

    KAUST Repository

    Xu, Ganggang

    2012-12-01

    Although the leave-subject-out cross-validation (CV) has been widely used in practice for tuning parameter selection for various nonparametric and semiparametric models of longitudinal data, its theoretical property is unknown and solving the associated optimization problem is computationally expensive, especially when there are multiple tuning parameters. In this paper, by focusing on the penalized spline method, we show that the leave-subject-out CV is optimal in the sense that it is asymptotically equivalent to the empirical squared error loss function minimization. An efficient Newton-type algorithm is developed to compute the penalty parameters that optimize the CV criterion. Simulated and real data are used to demonstrate the effectiveness of the leave-subject-out CV in selecting both the penalty parameters and the working correlation matrix. © 2012 Institute of Mathematical Statistics.
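
    As an illustration of the leave-subject-out idea (not the paper's penalized-spline implementation), the sketch below uses scikit-learn's LeaveOneGroupOut splitter to pick a ridge penalty for toy longitudinal data grouped by subject; the data, basis and penalty grid are all invented.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)
    n_subjects, n_obs = 20, 10                      # toy longitudinal data
    subjects = np.repeat(np.arange(n_subjects), n_obs)
    x = rng.uniform(0, 1, size=subjects.size)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, size=x.size)
    X = np.vander(x, 6)                             # crude stand-in for a spline basis

    logo = LeaveOneGroupOut()                       # leave-subject-out folds
    best = None
    for alpha in [0.01, 0.1, 1.0, 10.0]:            # candidate penalty parameters
        mse = -cross_val_score(Ridge(alpha=alpha), X, y, groups=subjects,
                               cv=logo, scoring="neg_mean_squared_error").mean()
        if best is None or mse < best[1]:
            best = (alpha, mse)
    print("selected penalty and CV error:", best)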

  11. Asymptotic optimality and efficient computation of the leave-subject-out cross-validation

    KAUST Repository

    Xu, Ganggang; Huang, Jianhua Z.

    2012-01-01

    Although the leave-subject-out cross-validation (CV) has been widely used in practice for tuning parameter selection for various nonparametric and semiparametric models of longitudinal data, its theoretical property is unknown and solving the associated optimization problem is computationally expensive, especially when there are multiple tuning parameters. In this paper, by focusing on the penalized spline method, we show that the leave-subject-out CV is optimal in the sense that it is asymptotically equivalent to the empirical squared error loss function minimization. An efficient Newton-type algorithm is developed to compute the penalty parameters that optimize the CV criterion. Simulated and real data are used to demonstrate the effectiveness of the leave-subject-out CV in selecting both the penalty parameters and the working correlation matrix. © 2012 Institute of Mathematical Statistics.

  12. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104)

    International Nuclear Information System (INIS)

    Kress, T.S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource, along with the results of the BMI-2104 study by BCL and the QUEST study by SNL, to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  13. 37 CFR 1.710 - Patents subject to extension of the patent term.

    Science.gov (United States)

    2010-07-01

    ... of Patent Term Extension of Patent Term Due to Regulatory Review § 1.710 Patents subject to extension... primarily manufactured using recombinant DNA, recombinant RNA, hybridoma technology, or other processes...

  14. Experiences Using an Open Source Software Library to Teach Computer Vision Subjects

    Science.gov (United States)

    Cazorla, Miguel; Viejo, Diego

    2015-01-01

    Machine vision is an important subject in computer science and engineering degrees. For laboratory experimentation, it is desirable to have a complete and easy-to-use tool. In this work we present a Java library, oriented to teaching computer vision. We have designed and built the library from scratch with emphasis on readability and…

  15. Short-term effects of playing computer games on attention.

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-05-01

    The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. One hundred and one children were recruited for the study (aged between 9 and 12 years). All participants played a motor-racing game on the computer for 1 hour. The TBAG form of the Stroop task was administered to all participants twice, before playing and immediately after playing the game. Participants with improved posttest scores, compared to their pretest scores, used the computer on average 0.67 +/- 1.1 hr/day, whereas average daily computer use was 1.6 +/- 1.4 hr/day and 1.3 +/- 0.9 hr/day for participants with worse or unaltered scores, respectively. According to the regression model, male gender, younger age, duration of daily computer use, and ADHD inattention type were found to be independent risk factors for worsened posttest scores. Time spent playing computer games can exert a short-term effect on attention as measured by the Stroop test.

  16. A subject-independent pattern-based Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Andreas Markus Ray

    2015-10-01

    Full Text Available While earlier Brain-Computer Interface (BCI) studies have mostly focused on modulating specific brain regions or signals, new developments in pattern classification of brain states are enabling real-time decoding and modulation of an entire functional network. The present study proposes a new method for real-time pattern classification and neurofeedback of brain states from electroencephalographic (EEG) signals. It involves the creation of a fused classification model based on the method of Common Spatial Patterns (CSPs) from data of several healthy individuals. The subject-independent model is then used to classify EEG data in real-time and provide feedback to new individuals. In a series of offline experiments involving training and testing of the classifier with individual data from 27 healthy subjects, a mean classification accuracy of 75.30% was achieved, demonstrating that the classification system at hand can reliably decode two types of imagery used in our experiments, i.e. happy emotional imagery and motor imagery. In a subsequent experiment it is shown that the classifier can be used to provide neurofeedback to new subjects, and that these subjects learn to match their brain pattern to that of the fused classification model in a few days of neurofeedback training. This finding can have important implications for future studies on neurofeedback and its clinical applications on neuropsychiatric disorders.

  17. Subject-specific computer simulation model for determining elbow loading in one-handed tennis backhand groundstrokes.

    Science.gov (United States)

    King, Mark A; Glynn, Jonathan A; Mitchell, Sean R

    2011-11-01

    A subject-specific angle-driven computer model of a tennis player, combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts, was developed to determine the effect of ball-racket impacts on loading at the elbow for one-handed backhand groundstrokes. Matching subject-specific computer simulations of a typical topspin/slice one-handed backhand groundstroke performed by an elite tennis player were carried out, and the root mean square difference between performance and matching simulations of elbow loading for the topspin and slice one-handed backhand groundstrokes was relatively small. In this study, the relatively small differences in elbow loading may be due to comparable angle-time histories at the wrist and elbow joints, with the major kinematic differences occurring at the shoulder. Using a subject-specific angle-driven computer model combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts allows peak internal loading, net impulse, and shock due to ball-racket impact to be calculated, which would not otherwise be possible without impractical invasive techniques. This study provides a basis for further investigation of the factors that may increase elbow loading during tennis strokes.

  18. A study on shear behavior of R/C beams subjected to long-term heating

    International Nuclear Information System (INIS)

    Maruta, M.; Yamazaki, M.

    1993-01-01

    In nuclear power plants, many structural members are subjected to long term heating. There are few experimental data available on the behavior especially in shear of reinforced concrete (R/C) members subjected to long term heating. This paper describes a study aimed at experimentally determining the shear behavior of R/C members in nuclear power plant facilities following sustained heating at high temperatures

  19. Long-Term Outcome of Amyotrophic Lateral Sclerosis in Korean Subjects.

    Science.gov (United States)

    Suh, Mi Ri; Choi, Won Ah; Choi, Young-Chul; Lee, Jang Woo; Hong, Jung Hwa; Park, Jihyun; Kang, Seong-Woong

    2017-12-01

    To report the latest long-term outcome of amyotrophic lateral sclerosis (ALS) and to analyze the predictors of prognosis. Subjects who were diagnosed with ALS between January 2005 and December 2009 at a single institute were followed up until death or up to December 2014. Data regarding age, sex, date of onset, date of diagnosis, presence of bulbar symptoms on onset, date of initiation of non-invasive ventilation (NIV), and the date of tracheostomy were collected. Survival was assessed using Kaplan-Meier curves and multivariate analyses of the risk of death were performed using the Cox proportional hazards model. Among 212 suspicious subjects, definite ALS was diagnosed in 182 subjects. The survival rate at 3 and 5 years from onset was 61.5% and 40.1%, respectively, and the survival rate at 3 and 5 years post-diagnosis was 49.5% and 24.2%, respectively. Further, 134 patients (134/182, 73.6%) were initiated on NIV, and among them, 90 patients (90/182, 49.5%) underwent tracheostomy. Male gender and onset age of ≥65 years were independent predictors of adverse survival. The analysis of long term survival in ALS showed excellent outcomes considering the overall poor prognosis of this disease.

  20. The difference in subjective and objective complexity in the visual short-term memory

    DEFF Research Database (Denmark)

    Dall, Jonas Olsen; Sørensen, Thomas Alrik

    Several studies discuss the influence of complexity on visual short-term memory; some have demonstrated that short-term memory is surprisingly stable regardless of content (e.g. Luck & Vogel, 1997), whereas others have shown that memory can be influenced by the complexity of the stimulus (e.g. Alvarez & Cavanagh, 2004). But the term complexity is often not clearly defined. Sørensen (2008; see also Dall, Katsumi, & Sørensen, 2016) suggested that complexity can be related to two different types: objective and subjective complexity. This distinction is supported by a number of studies on the influence of … characters. On the contrary, expertise or word frequency may reflect what could be termed subjective complexity, as this relates directly to the individual mental categories established. This study will be able to uncover more details on how we should define the complexity of objects to be encoded into short-term memory.

  1. Computer Science Education in Secondary Schools--The Introduction of a New Compulsory Subject

    Science.gov (United States)

    Hubwieser, Peter

    2012-01-01

    In 2004 the German state of Bavaria introduced a new compulsory subject of computer science (CS) in its grammar schools ("Gymnasium"). The subject is based on a comprehensive teaching concept that was developed by the author and his colleagues during the years 1995-2000. It comprises mandatory courses in grades 6/7 for all students of…

  2. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  3. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

    Full Text Available For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as efficient design for a possible range of values for the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.

  4. Short-Term Effects of Playing Computer Games on Attention

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-01-01

    Objective: The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. Method: One hundred one children are recruited for the study (aged between 9 and 12 years). All participants played a motor-racing game on the computer for 1 hour.…

  5. Computer-based lesson planning for the subject of Islamic Culture History

    Directory of Open Access Journals (Sweden)

    Guntur Cahyono

    2017-08-01

    Full Text Available Abstract This study aims to describe the planning, implementation, evaluation, constraints and advantages of computer utilization in the History of Islamic Culture subject at MAN Salatiga. This research is a field study, aimed at studying intensively the background, current state, and environmental interaction of a social unit. Viewed from the type of data collected, this research falls into the category of qualitative research, a research procedure that produces descriptive data in the form of written or spoken words from people, or their observable behavior. The subjects of this study are teachers and students. The data collection techniques used are observation, interviews, and documentation. Data analysis uses data reduction, data presentation, and conclusion drawing. The results of this study indicate that planning is done by classroom teachers by making lesson plans based on the 2013 Curriculum guideline. Teachers have written the utilization of computers into the lesson plan components. Computers are utilized in the classroom as a medium and in the computer laboratory as both media and learning resources. Computers are used as a substitute for books due to the limited availability of the latter. Computer use is also practiced as habituation in preparation for the computer-based national examination (UNBK). Keywords: computer utilization, learning, history of Islamic culture subject

  6. Algorithmic psychometrics and the scalable subject.

    Science.gov (United States)

    Stark, Luke

    2018-04-01

    Recent public controversies, ranging from the 2014 Facebook 'emotional contagion' study to psychographic data profiling by Cambridge Analytica in the 2016 American presidential election, Brexit referendum and elsewhere, signal watershed moments in which the intersecting trajectories of psychology and computer science have become matters of public concern. The entangled history of these two fields grounds the application of applied psychological techniques to digital technologies, and an investment in applying calculability to human subjectivity. Today, a quantifiable psychological subject position has been translated, via 'big data' sets and algorithmic analysis, into a model subject amenable to classification through digital media platforms. I term this position the 'scalable subject', arguing it has been shaped and made legible by algorithmic psychometrics - a broad set of affordances in digital platforms shaped by psychology and the behavioral sciences. In describing the contours of this 'scalable subject', this paper highlights the urgent need for renewed attention from STS scholars on the psy sciences, and on a computational politics attentive to psychology, emotional expression, and sociality via digital media.

  7. Subject-based feature extraction by using fisher WPD-CSP in brain-computer interfaces.

    Science.gov (United States)

    Yang, Banghua; Li, Huarong; Wang, Qian; Zhang, Yunyuan

    2016-06-01

    Feature extraction of electroencephalogram (EEG) signals plays a vital role in brain-computer interfaces (BCIs). In recent years, common spatial pattern (CSP) has been proven to be an effective feature extraction method. However, the traditional CSP has the disadvantages of requiring many input channels and lacking frequency information. In order to remedy these defects, wavelet packet decomposition (WPD) and CSP are combined to extract effective features. But the WPD-CSP method gives little consideration to extracting features fitted to the specific subject. So a subject-based feature extraction method using Fisher WPD-CSP is proposed in this paper. The idea of the proposed method is to adapt Fisher WPD-CSP to each subject separately. It mainly includes the following six steps: (1) original EEG signals from all channels are decomposed into a series of sub-bands using WPD; (2) average power values of the obtained sub-bands are computed; (3) the sub-bands with larger values of Fisher distance according to average power are selected for that particular subject; (4) each selected sub-band is reconstructed and regarded as a new EEG channel; (5) all new EEG channels are used as input to the CSP and a six-dimensional feature vector is obtained by the CSP, thus forming the subject-based feature extraction model; (6) the probabilistic neural network (PNN) is used as the classifier and the classification accuracy is obtained. Data from six subjects were processed by the subject-based Fisher WPD-CSP, the non-subject-based Fisher WPD-CSP and WPD-CSP, respectively. Compared with non-subject-based Fisher WPD-CSP and WPD-CSP, the results show that the proposed method yields better performance (sensitivity: 88.7±0.9%, and specificity: 91±1%) and the classification accuracy from subject-based Fisher WPD-CSP is increased by 6-12% and 14%, respectively. The proposed subject-based Fisher WPD-CSP method can not only remedy disadvantages of CSP by WPD but also discriminate
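
    A rough sketch of the WPD-plus-CSP chain (roughly steps 1, 2 and 5 above) is given below. It is not the authors' code: the EEG data are random placeholders, the Fisher-distance sub-band selection and the PNN classifier are omitted, and CSP is applied directly to the toy channels.

    import numpy as np
    import pywt
    from scipy.linalg import eigh

    def wpd_band_powers(trial, wavelet="db4", level=3):
        """Decompose each channel of one trial (n_channels, n_samples) into
        wavelet-packet sub-bands and return the sub-band average powers."""
        powers = []
        for ch in trial:
            wp = pywt.WaveletPacket(ch, wavelet=wavelet, maxlevel=level)
            powers.extend(np.mean(node.data ** 2)
                          for node in wp.get_level(level, order="freq"))
        return np.array(powers)

    def csp_filters(class_a, class_b, n_pairs=3):
        """Classic CSP: generalized eigenvectors of the two class covariances.
        class_a/class_b: arrays of shape (n_trials, n_channels, n_samples)."""
        cov = lambda trials: np.mean(
            [t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
        Ca, Cb = cov(class_a), cov(class_b)
        vals, vecs = eigh(Ca, Ca + Cb)            # joint diagonalisation
        order = np.argsort(vals)
        pick = np.r_[order[:n_pairs], order[-n_pairs:]]
        return vecs[:, pick].T                    # (2*n_pairs, n_channels)

    def log_variance_features(trial, W):
        z = W @ trial                             # spatially filtered signals
        var = np.var(z, axis=1)
        return np.log(var / var.sum())            # standard CSP feature vector

    # Toy usage with random "EEG": two classes, 8 channels, 512 samples
    rng = np.random.default_rng(1)
    a = rng.normal(size=(30, 8, 512))
    b = rng.normal(size=(30, 8, 512))
    W = csp_filters(a, b)
    print(wpd_band_powers(a[0]).shape, log_variance_features(a[0], W).shape)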

  8. Enhanced inter-subject brain computer interface with associative sensorimotor oscillations.

    Science.gov (United States)

    Saha, Simanto; Ahmed, Khawza I; Mostafa, Raqibul; Khandoker, Ahsan H; Hadjileontiadis, Leontios

    2017-02-01

    Electroencephalography (EEG) captures electrophysiological signatures of cortical events from the scalp with high-dimensional electrode montages. Usually, excessive sources produce outliers and potentially affect the actual event-related sources. Besides, EEG manifests inherent inter-subject variability of the brain dynamics, at the resting state and/or during the performance of tasks, probably caused by the instantaneous fluctuation of psychophysiological states. A wavelet coherence (WC) analysis for optimally selecting associative inter-subject channels is proposed here and is used to boost the performance of motor imagery (MI)-based inter-subject brain computer interfaces (BCIs). The underlying hypothesis is that optimally associative inter-subject channels can reduce the effects of outliers and, thus, eliminate dissimilar cortical patterns. The proposed approach has been tested on dataset IVa from BCI competition III, including EEG data acquired from five healthy subjects who were given visual cues to perform 280 trials of MI for the right hand and right foot. Experimental results have shown increased classification accuracy (81.79%) using the WC-based selected 16 channels compared to the accuracy (56.79%) achieved using all the available 118 channels. The associative channels lie mostly around the sensorimotor regions of the brain, in line with previous literature describing spatial brain dynamics during sensorimotor oscillations. Apparently, the proposed approach paves the way for optimised EEG channel selection that could further boost the efficiency and real-time performance of BCI systems.
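
    The channel-scoring idea can be illustrated with ordinary magnitude-squared coherence as a stand-in for the wavelet coherence used in the paper (which would require a dedicated wavelet toolbox). The sampling rate, frequency band, threshold and toy signals below are all invented.

    import numpy as np
    from scipy.signal import coherence

    fs = 100.0                                      # assumed sampling rate (Hz)
    rng = np.random.default_rng(2)
    subj1 = rng.normal(size=(8, 4000))              # toy EEG, (channels, samples)
    subj2 = rng.normal(size=(8, 4000))
    shared = np.sin(2 * np.pi * 10 * np.arange(4000) / fs)
    subj1[3] += shared                              # inject one genuinely shared rhythm
    subj2[3] += shared

    scores = []
    for ch in range(subj1.shape[0]):
        f, cxy = coherence(subj1[ch], subj2[ch], fs=fs, nperseg=256)
        band = (f >= 8) & (f <= 13)                 # sensorimotor mu band
        scores.append(cxy[band].mean())

    keep = [ch for ch, s in enumerate(scores) if s > 0.1]   # arbitrary threshold
    print("associative channels:", keep)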

  9. Short and long-term effects of sham-controlled prefrontal EEG-neurofeedback training in healthy subjects.

    Science.gov (United States)

    Engelbregt, H J; Keeser, D; van Eijk, L; Suiker, E M; Eichhorn, D; Karch, S; Deijen, J B; Pogarell, O

    2016-04-01

    In this study we evaluated long-term effects of frontal beta EEG-neurofeedback training (E-NFT) on healthy subjects. We hypothesized that E-NFT can change frontal beta activity in the long-term and that changes in frontal beta EEG activity are accompanied by altered cognitive performance. 25 healthy subjects were included and randomly assigned to active or sham E-NFT. On average the subjects underwent 15 E-NFT training sessions with a training duration of 45 min. Resting-state EEG was recorded prior to E-NFT training (t1) and in a 3-year follow-up (t3). Compared to sham E-NFT, which was used for the control group, real E-NFT increased beta activity in a predictable way. This increase was maintained over a period of three years post training. However, E-NFT did not result in significantly improved cognitive performance. Based on our results, we conclude that EEG-NFT can selectively modify EEG beta activity both in short and long-term. This is a sham controlled EEG neurofeedback study demonstrating long-term effects in resting state EEG. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  10. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681, May 2016, US Army Research Laboratory: Batch Computed Tomography Analysis of Projectiles, by Michael C Golt, Chris M…, and Matthew S Bratcher, Weapons and Materials Research Directorate. … values to account for projectile variability in the ballistic evaluation of armor. Subject terms: computed tomography, CT, BS41, projectiles

  11. Specific terms glossary for subjects taught in Physical Culture first year career

    Directory of Open Access Journals (Sweden)

    Ana Isel Rodríguez Cruz

    2016-08-01

    Full Text Available Content comprehension is an important element of the learning process; current didactics demand teaching styles that favor communicative competence in students. Taking into account the relevance of this topic in the teaching-learning process, the present work was developed with the objective of offering students a tool that allows an efficient comprehension of the contents they receive in the subjects of the first year of the Physical Culture degree. To fulfil this goal, a glossary with specific terms from basketball, chess, swimming, athletics, basic gymnastics, and morphology was designed, starting from the results of the initial diagnosis and scientific observation, as well as a detailed revision of the normative documents that govern the Communicative Spanish subject. The use of the glossary favors the development of the students' text comprehension in the aforementioned subjects.

  12. 25 CFR 900.33 - Are all proposals to renew term contracts subject to the declination criteria?

    Science.gov (United States)

    2010-04-01

    Are all proposals to renew term contracts subject to the declination criteria? … Indian Affairs will not review the renewal of a term contract for declination issues where no material … been proposed by the Indian tribe or tribal organization. Proposals to renew term contracts with DOI …

  13. Computer-assisted indexing for the INIS database

    International Nuclear Information System (INIS)

    Nevyjel, A.

    2006-01-01

    INIS has identified computer-assisted indexing as an area where information technology could best assist in maintaining database quality and indexing consistency while containing production costs. Subject analysis is a very important but also very expensive process in the production of the INIS database. Given the current necessity to process an increased number of records, including subject analysis, without additional staff, INIS as well as the member states need improvements in their processing efficiency. Computer-assisted subject analysis is a promising way to achieve this. The quality of the INIS database is defined by its inputting rules. The Thesaurus is a terminological control device used in translating from the natural language of documents, indexers or users into a more constrained system language. It is a controlled and dynamic vocabulary of semantically and generically related terms, and it is the essential tool for subject analysis as well as for advanced search engines. To support the identification of descriptors in the free text (title, abstract, free keywords), 'hidden terms' have been introduced as an extension of the Thesaurus; they identify phrases or character strings of free text and point to the valid descriptor which should be suggested. In the process of computer-assisted subject analysis the bibliographic records (including title and abstract) are analyzed by the software, resulting in a list of suggested descriptors. Within the working platform (graphical user interface) the suggested descriptors are sorted by importance (by their relevance to the content of the document) and the subject specialist clearly sees the highlighted context from which the terms were selected. The system allows the subject specialist to accept or reject descriptors from the suggested list and to assign additional descriptors when necessary. First experiences show that a performance enhancement of about 80-100% can be achieved in the subject analysis process. (author)
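
    A toy version of the descriptor-suggestion step might look like the sketch below. The mini-thesaurus, the 'hidden term' mappings and the ranking are invented for illustration and are far simpler than the INIS production system described above.

    import re
    from collections import Counter

    # Hypothetical stand-ins for INIS Thesaurus descriptors and "hidden terms"
    descriptors = {"REACTOR SAFETY", "RADIATION PROTECTION", "NEUTRON FLUX"}
    hidden_terms = {                     # free-text phrase -> controlled descriptor
        "reactor safety": "REACTOR SAFETY",
        "radiological protection": "RADIATION PROTECTION",
        "neutron fluxes": "NEUTRON FLUX",
    }

    def suggest_descriptors(title, abstract):
        """Rank candidate descriptors by how often their trigger phrases
        occur in the title and abstract."""
        text = f"{title} {abstract}".lower()
        counts = Counter()
        for phrase, descriptor in hidden_terms.items():
            counts[descriptor] += len(re.findall(re.escape(phrase), text))
        return [d for d, n in counts.most_common() if n > 0 and d in descriptors]

    print(suggest_descriptors(
        "Improving reactor safety margins",
        "We analyse neutron fluxes and radiological protection requirements."))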

  14. Does computer use pose a hazard for future long-term sickness absence?

    DEFF Research Database (Denmark)

    Andersen, Johan Hviid; Mikkelsen, Sigurd

    2010-01-01

    … The hazard ratio for sickness absence with a weekly increase of one hour in computer use was 0.99 (95% CI: 0.99 to 1.00). Low satisfaction with workplace arrangements and female gender both doubled the risk of sickness absence. We have earlier found that computer use did not predict persistent pain in the neck … and upper limb, and it seems that computer use does not predict future long-term sickness absence of all causes either.

  15. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  16. An operational system for subject switching between controlled vocabularies: A computational linguistics approach

    Science.gov (United States)

    Silvester, J. P.; Newton, R.; Klingbiel, P. H.

    1984-01-01

    The NASA Lexical Dictionary (NLD), a system that automatically translates input subject terms to those of NASA, was developed in four phases. Phase One provided Phrase Matching, a context sensitive word-matching process that matches input phrase words with any NASA Thesaurus posting (i.e., index) term or Use reference. Other Use references have been added to enable the matching of synonyms, variant spellings, and some words with the same root. Phase Two provided the capability of translating any individual DTIC term to one or more NASA terms having the same meaning. Phase Three provided NASA terms having equivalent concepts for two or more DTIC terms, i.e., coordinations of DTIC terms. Phase Four was concerned with indexer feedback and maintenance. Although the original NLD construction involved much manual data entry, ways were found to automate nearly all but the intellectual decision-making processes. In addition to finding improved ways to construct a lexical dictionary, applications for the NLD have been found and are being developed.
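
    The subject-switching idea (translating an input term from one controlled vocabulary to one or more NASA Thesaurus terms, including simple coordinations of two input terms) can be illustrated with a toy mapping table. All the entries below are invented examples, not records from the actual NASA Lexical Dictionary.

    # Hypothetical fragment of a lexical dictionary: source-vocabulary term(s)
    # -> list of target-vocabulary terms carrying the same concept.
    dtic_to_nasa = {
        "guided missiles": ["MISSILES"],
        "remotely piloted vehicles": ["DRONE AIRCRAFT", "PILOTLESS AIRCRAFT"],
        ("terminal guidance", "homing devices"): ["HOMING GUIDANCE"],  # coordination
    }

    def switch_terms(input_terms):
        """Translate input subject terms to the target vocabulary, handling
        single-term equivalences and simple two-term coordinations."""
        terms = {t.lower() for t in input_terms}
        output = set()
        for key, targets in dtic_to_nasa.items():
            needed = {key} if isinstance(key, str) else set(key)
            if needed <= terms:
                output.update(targets)
        return sorted(output)

    print(switch_terms(["Guided Missiles", "Terminal Guidance", "Homing Devices"]))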

  17. Increased motor cortex excitability during motor imagery in brain-computer interface trained subjects

    Science.gov (United States)

    Mokienko, Olesya A.; Chervyakov, Alexander V.; Kulikova, Sofia N.; Bobrov, Pavel D.; Chernikova, Liudmila A.; Frolov, Alexander A.; Piradov, Mikhail A.

    2013-01-01

    Background: Motor imagery (MI) is the mental performance of movement without muscle activity. It is generally accepted that MI and motor performance have similar physiological mechanisms. Purpose: To investigate the activity and excitability of cortical motor areas during MI in subjects who were previously trained with an MI-based brain-computer interface (BCI). Subjects and Methods: Eleven healthy volunteers without neurological impairments (mean age, 36 years; range: 24–68 years) were either trained with an MI-based BCI (BCI-trained, n = 5) or received no BCI training (n = 6, controls). Subjects imagined grasping in a blocked paradigm task with alternating rest and task periods. For evaluating the activity and excitability of cortical motor areas we used functional MRI and navigated transcranial magnetic stimulation (nTMS). Results: fMRI revealed activation in Brodmann areas 3 and 6, the cerebellum, and the thalamus during MI in all subjects. The primary motor cortex was activated only in BCI-trained subjects. The associative zones of activation were larger in non-trained subjects. During MI, motor evoked potentials recorded from two of the three targeted muscles were significantly higher only in BCI-trained subjects. The motor threshold decreased (median = 17%) during MI, which was also observed only in BCI-trained subjects. Conclusion: Previous BCI training increased motor cortex excitability during MI. These data may help to improve BCI applications, including rehabilitation of patients with cerebral palsy. PMID:24319425

  18. Increased motor cortex excitability during motor imagery in brain-computer interface trained subjects

    Directory of Open Access Journals (Sweden)

    Olesya eMokienko

    2013-11-01

    Full Text Available Background: Motor imagery (MI) is the mental performance of movement without muscle activity. It is generally accepted that MI and motor performance have similar physiological mechanisms. Purpose: To investigate the activity and excitability of cortical motor areas during MI in subjects who were previously trained with an MI-based brain-computer interface (BCI). Subjects and methods: Eleven healthy volunteers without neurological impairments (mean age, 36 years; range: 24–68 years) were either trained with an MI-based BCI (BCI-trained, n = 5) or received no BCI training (n = 6, controls). Subjects imagined grasping in a blocked paradigm task with alternating rest and task periods. For evaluating the activity and excitability of cortical motor areas we used functional MRI and navigated transcranial magnetic stimulation (nTMS). Results: fMRI revealed activation in Brodmann areas 3 and 6, the cerebellum, and the thalamus during MI in all subjects. The primary motor cortex was activated only in BCI-trained subjects. The associative zones of activation were larger in non-trained subjects. During MI, motor evoked potentials recorded from two of the three targeted muscles were significantly higher only in BCI-trained subjects. The motor threshold decreased (median = 17%) during MI, which was also observed only in BCI-trained subjects. Conclusion: Previous BCI training increased motor cortex excitability during MI. These data may help to improve BCI applications, including rehabilitation of patients with cerebral palsy.

  19. Long-term functional, subjective and psychological results after single digit replantation

    Directory of Open Access Journals (Sweden)

    Jing Chen

    2018-03-01

    Full Text Available Objective: The aim of this study was to analyse the long-term functional, subjective, and psychological results after single-digit replantation. Methods: Thirty cases of digital replantation (14 thumbs, 12 index fingers, 2 middle fingers, 1 ring finger, and 1 little finger) in 30 patients (7 females and 23 males) with a mean age of 44.2 years (20–65 years) were evaluated at the end of a mean follow-up time of 36 months (19–50 months). The active range of motion of joints, grip and pinch strength, cutaneous sensibility, upper-extremity functioning, and subjective satisfaction were determined using the Disability of Arm, Shoulder, and Hand (DASH) questionnaire and the Michigan Hand Outcomes questionnaire (MHQ). Psychological sequelae, including depression, anxiety, and posttraumatic stress disorder (PTSD), were assessed. A correlation analysis among variables was also performed. Results: The mean score for the DASH questionnaire was 6.6 (range: 0–39.2). The symptom of cold intolerance occurred in 53% of the patients. Two patients were diagnosed with depression, and only one patient exhibited PTSD. The DASH score had a good statistical correlation with total grip strength, pinch grip strength, and static two-point discrimination (S-2PD) (P < 0.05). Several aspects of the MHQ were also statistically relevant to some or all of the three objective results. Furthermore, the grip strength showed significant correlation with DASH and most aspects of the MHQ in multivariate logistic regression analysis (P < 0.05). Conclusion: Total grip strength is the most important factor positively related to subjective outcomes. The incidence rates of psychological symptoms after digit replantation are very low at long-term follow-up. Level of evidence: Level IV, therapeutic study. Keywords: Digit Replantation, DASH score, Posttraumatic stress disorder

  20. Long-term functional, subjective and psychological results after single digit replantation.

    Science.gov (United States)

    Chen, Jing; Zhang, Ai Xian; Chen, Qing Zhong; Mu, Shuai; Tan, Jun

    2018-03-01

    The aim of this study was to analyse the long-term functional, subjective, and psychological results after single-digit replantation. Thirty cases of digital replantation (14 thumbs, 12 index fingers, 2 middle fingers, 1 ring finger, and 1 little finger) in 30 patients (7 females and 23 males) with a mean age of 44.2 years (20-65 years) were evaluated at the end of a mean follow-up time of 36 months (19-50 months). The active range of motion of joints, grip and pinch strength, cutaneous sensibility, upper-extremity functioning, and subjective satisfaction were determined using the Disability of Arm, Shoulder, and Hand (DASH) questionnaire and the Michigan Hand Outcomes questionnaire (MHQ). Psychological sequelae, including depression, anxiety, and posttraumatic stress disorder (PTSD), were assessed. A correlation analysis among variables was also performed. The mean score for the DASH questionnaire was 6.6 (range: 0-39.2). The symptom of cold intolerance occurred in 53% of the patients. Two patients were diagnosed with depression, and only one patient exhibited PTSD. The DASH score had a good statistical correlation with total grip strength, pinch grip strength, and static two-point discrimination (S-2PD) (P < 0.05). Several aspects of the MHQ were also statistically relevant to some or all of the three objective results. Furthermore, the grip strength showed significant correlation with DASH and most aspects of the MHQ in multivariate logistic regression analysis (P < 0.05). Total grip strength is the most important factor positively related to subjective outcomes. The incidence rates of psychological symptoms after digit replantation are very low at long-term follow-up. Level IV, therapeutic study. Copyright © 2017 Turkish Association of Orthopaedics and Traumatology. Production and hosting by Elsevier B.V. All rights reserved.

  1. Correlation between Very Short and Short-Term Blood Pressure Variability in Diabetic-Hypertensive and Healthy Subjects.

    Science.gov (United States)

    Casali, Karina R; Schaan, Beatriz D; Montano, Nicola; Massierer, Daniela; M F Neto, Flávio; Teló, Gabriela H; Ledur, Priscila S; Reinheimer, Marilia; Sbruzzi, Graciele; Gus, Miguel

    2018-02-01

    Blood pressure (BP) variability can be evaluated by 24-hour ambulatory BP monitoring (24h-ABPM), but its concordance with results from finger BP measurement (FBPM) has not been established yet. The aim of this study was to compare parameters of short-term (24h-ABPM) with very short-term BP variability (FBPM) in healthy (C) and diabetic-hypertensive (DH) subjects. Cross-sectional study with 51 DH subjects and 12 C subjects who underwent 24h-ABPM [extracting time-rate, standard deviation (SD), coefficient of variation (CV)] and short-term beat-to-beat recording at rest and after standing-up maneuvers [FBPM, extracting BP and heart rate (HR) variability parameters in the frequency domain, autoregressive spectral analysis]. The Spearman correlation coefficient was used to correlate BP and HR variability parameters obtained from both FBPM and 24h-ABPM (divided into daytime, nighttime, and total). Statistical significance was set at p < 0.05. Correlations were found between … (24h-ABPM) and the LF component of short-term variability (FBPM, total, R = 0.591, p = 0.043); standard deviation (24h-ABPM) with the LF component of BPV (FBPM, total, R = 0.608, p = 0.036); coefficient of variation (24h-ABPM) with total BPV (FBPM, daytime, R = -0.585, p = 0.046) and alpha index (FBPM, daytime, R = -0.592, p = 0.043); and time rate (24h-ABPM) with delta LF/HF (FBPM, total, R = 0.636, p = 0.026; daytime, R = 0.857, p < 0.05). … Parameters of 24h-ABPM (total, daytime) reflect BP and HR variability evaluated by FBPM in healthy individuals. This does not apply to DH subjects.

  2. Performance evaluation of a motor-imagery-based EEG-Brain computer interface using a combined cue with heterogeneous training data in BCI-Naive subjects

    Directory of Open Access Journals (Sweden)

    Lee Youngbum

    2011-10-01

    Full Text Available Abstract Background The subjects in an EEG-brain computer interface (BCI) system experience difficulties when attempting to obtain consistent performance of the actual movement by motor imagery alone. It is necessary to find the optimal conditions and stimuli combinations that affect the performance factors of the EEG-BCI system, to guarantee equipment safety and trust through the performance evaluation of motor imagery characteristics that can be utilized in the EEG-BCI testing environment. Methods The experiment was carried out with 10 experienced subjects and 32 naive subjects on an EEG-BCI system. There were 3 experiments: the experienced homogeneous experiment, the naive homogeneous experiment and the naive heterogeneous experiment. Each experiment was compared in terms of the six audio-visual cue combinations and consisted of 50 trials. For the naive subjects, the EEG data were classified using the least squares linear classifier after common spatial pattern filtering. The accuracy was calculated using the training and test data sets. The p-value of the accuracy was obtained through a statistical significance test. Results In the case in which a naive subject was trained by a heterogeneous combined cue and tested by a visual cue, the result was not only the highest accuracy (p … Conclusions We propose the use of this measuring methodology, with a heterogeneous combined cue for training data and a visual cue for test data, by the typical EEG-BCI algorithm on the EEG-BCI system, to achieve effectiveness in terms of consistency, stability, cost, time, and resource management without the need for a trial and error process.

  3. Decisions in Motion: Decision Dynamics during Intertemporal Choice reflect Subjective Evaluation of Delayed Rewards

    Science.gov (United States)

    O'Hora, Denis; Carey, Rachel; Kervick, Aoife; Crowley, David; Dabrowski, Maciej

    2016-02-01

    People tend to discount rewards or losses that occur in the future. Such delay discounting has been linked to many behavioral and health problems, since people choose smaller short-term gains over greater long-term gains. We investigated whether the effect of delays on the subjective value of rewards is expressed in how people move when they make choices. Over 600 patrons of the RISK LAB exhibition hosted by the Science Gallery Dublin™ played a short computer game in which they used a computer mouse to choose between amounts of money at various delays. Typical discounting effects were observed and decision dynamics indicated that choosing smaller short-term rewards became easier (i.e., shorter response times, tighter trajectories, less vacillation) as the delays until later rewards increased. Based on a sequence of choices, subjective values of delayed outcomes were estimated and decision dynamics during initial choices predicted these values. Decision dynamics are affected by subjective values of available options and thus provide a means to estimate such values.
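
    The step in which "subjective values of delayed outcomes were estimated" is commonly done by fitting a discounting model to choice-derived indifference points. The sketch below fits the widely used hyperbolic form V = A / (1 + kD) to made-up indifference points; it is not the authors' estimation procedure, and the numbers are purely illustrative.

    import numpy as np
    from scipy.optimize import curve_fit

    def hyperbolic(delay, k, amount=100.0):
        """Subjective value of a delayed reward under hyperbolic discounting."""
        return amount / (1.0 + k * delay)

    # Hypothetical indifference points: immediate amount (out of 100) judged
    # equal in value to 100 units delivered after the given delay in days.
    delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)
    indifference = np.array([95, 85, 60, 40, 28, 18], dtype=float)

    (k_hat,), _ = curve_fit(hyperbolic, delays, indifference, p0=[0.01])
    print(f"estimated discount rate k = {k_hat:.4f}")
    print("subjective value of 100 units in 60 days:", round(hyperbolic(60, k_hat), 1))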

  4. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...
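
    As a small illustration of one of the listed techniques (a Random Forest; this is not the authors' setup), lagged load values can serve as features for next-hour prediction. The synthetic load series below is invented.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(3)
    hours = np.arange(24 * 60)                               # 60 days of hourly load
    load = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

    def lagged_dataset(series, n_lags=24):
        """Rows of the previous n_lags values; target is the next value."""
        X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
        y = series[n_lags:]
        return X, y

    X, y = lagged_dataset(load)
    split = len(X) - 24                                      # hold out the last day
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[:split], y[:split])
    pred = model.predict(X[split:])
    print("MAE on the held-out day:", round(float(np.mean(np.abs(pred - y[split:]))), 2))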

  5. Short-Term Effects of Electroconvulsive Therapy on Subjective and Actigraphy-Assessed Sleep Parameters in Severely Depressed Inpatients

    Directory of Open Access Journals (Sweden)

    Alexander Hoogerhoud

    2015-01-01

    Full Text Available Background. Sleep disturbances are a key feature of major depression. Electroconvulsive treatment (ECT) may improve polysomnography-assessed sleep characteristics, but its short-term effects on actigraphy-assessed and subjective sleep characteristics are unknown. We therefore aimed to assess the effects of ECT on subjective and objective sleep parameters in a proof-of-principle study. Methods. We assessed subjective and objective sleep parameters in 12 severely depressed patients on up to 5 consecutive days during their ECT course, corresponding to a total of 43 nights (including 19 ECT sessions). The 12 patients were 83% female and on average 62 (standard deviation (SD) 14) years old and had an average MADRS score of 40 at baseline (SD 21). Results. Subjective and objective sleep parameters were not directly affected by ECT. The subjective sleep efficiency parameter was similar on the day after ECT and on other days. ECT did not affect the number of errors in the Sustained Attention to Response Task. Patients subjectively underestimated their total sleep time by 1.4 hours (P<0.001) compared to actigraphy-assessed sleep duration. Conclusion. ECT did not affect subjective and actigraphy-assessed sleep in the short term. Depressed patients profoundly underestimated their sleep duration.

  6. Long-Term Quality of Life Improvement in Subjects with Healed Erosive Esophagitis: Treatment with Lansoprazole

    Science.gov (United States)

    Freston, James W.; Haber, Marian M.; Atkinson, Stuart; Hunt, Barbara; Peura, David A.

    2009-01-01

    Background Gastroesophageal reflux disease (GERD) is a chronic symptomatic condition and may be associated with erosive esophagitis (EE). Considerable data on the long-term maintenance of healing of EE are available, but data on long-term GERD symptom prevention and patient quality of life (QOL) are limited. Aims To investigate QOL in subjects with healed EE who received 12 months of double-blind maintenance treatment with lansoprazole or ranitidine, followed by long-term open-label lansoprazole therapy to prevent recurrence of EE. Methods Subjects with healed EE received 12 months of double-blind maintenance treatment with lansoprazole 15 mg once daily or ranitidine 150 mg twice daily, followed by dose-titrated, open-label lansoprazole therapy for up to 82 months. Results During double-blind treatment (n = 206), lansoprazole-treated patients showed significantly (P ≤ 0.05) greater improvements than ranitidine-treated patients in the frequency, severity, and ‘bothersomeness’ of heartburn, the symptom index, problems of activity limitation, eating and drinking problems, symptom problems, health distress, and social functioning. During dose-titrated, open-label treatment (n = 195), all disease-specific QOL scales except sleep improved significantly (P ≤ 0.05). Conclusions Treatment with lansoprazole for 12 months in healed EE subjects produced significantly greater improvements in QOL indicators than ranitidine. These improvements were sustained during dose-titrated, open-label lansoprazole treatment. PMID:19582579

  7. Short-term effects of implemented high intensity shoulder elevation during computer work

    DEFF Research Database (Denmark)

    Larsen, Mette K.; Samani, Afshin; Madeleine, Pascal

    2009-01-01

    BACKGROUND: Work-site strength training sessions are shown effective to prevent and reduce neck-shoulder pain in computer workers, but difficult to integrate in normal working routines. A solution for avoiding neck-shoulder pain during computer work may be to implement high intensity voluntary contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work. The aim of this study was to investigate short-term effects of a high intensity contraction... Implementing high intensity contractions during computer work to prevent neck-shoulder pain may be possible without affecting the working routines. However, the unexpected reduction in clavicular trapezius rest during a pause with preceding high intensity contraction requires further investigation before high intensity shoulder elevations can...

  8. Population of 224 realistic human subject-based computational breast phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, David W. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Wells, Jered R., E-mail: jered.wells@duke.edu [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Sturgeon, Gregory M. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 (United States); Samei, Ehsan [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Dobbins, James T. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Segars, W. Paul [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Electrical and Computer Engineering and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-01-15

    Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230 + dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range
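
    The tissue classification step described above rests on fuzzy C-means clustering of voxel intensities. The sketch below implements a small three-class fuzzy C-means on synthetic intensities as a stand-in for that step; the data and parameters are assumptions, and the motion-blur, denoising and bias-field corrections of the actual pipeline are omitted.

    ```python
    # Minimal sketch: three-class fuzzy C-means on voxel intensities, standing in for
    # the bCT tissue classification step. Synthetic intensities and parameters are
    # illustrative assumptions, not the authors' pipeline.
    import numpy as np

    def fuzzy_cmeans(x, n_clusters=3, m=2.0, n_iter=100, tol=1e-6, seed=0):
        rng = np.random.default_rng(seed)
        u = rng.random((n_clusters, x.size))
        u /= u.sum(axis=0)                       # memberships sum to 1 per voxel
        for _ in range(n_iter):
            um = u ** m
            centers = um @ x / um.sum(axis=1)    # membership-weighted cluster centers
            d = np.abs(x[None, :] - centers[:, None]) + 1e-12
            u_new = d ** (-2.0 / (m - 1.0))
            u_new /= u_new.sum(axis=0)
            if np.max(np.abs(u_new - u)) < tol:
                u = u_new
                break
            u = u_new
        return centers, u

    # Synthetic "voxel" intensities for air, adipose and glandular tissue.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(mu, 10, 2000) for mu in (0.0, 100.0, 200.0)])
    centers, u = fuzzy_cmeans(x, n_clusters=3)
    labels = u.argmax(axis=0)                    # hard labels from fuzzy memberships
    print("cluster centers:", np.sort(centers))
    ```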

  9. Computer codes for simulating atomic-displacement cascades in solids subject to irradiation

    International Nuclear Information System (INIS)

    Asaoka, Takumi; Taji, Yukichi; Tsutsui, Tsuneo; Nakagawa, Masayuki; Nishida, Takahiko

    1979-03-01

    In order to study atomic displacement cascades originating from primary knock-on atoms in solids subject to incident radiation, the simulation code CASCADE/CLUSTER is adapted for use on the FACOM/230-75 computer system. In addition, the code is modified so as to plot the defect patterns in crystalline solids. As another simulation code for the cascade process, MARLOWE is also available for use on the FACOM system. To deal with the thermal annealing of point defects produced in the cascade process, the code DAIQUIRI, developed originally for body-centered cubic crystals, is modified to be applicable also to face-centered cubic lattices. By combining CASCADE/CLUSTER and DAIQUIRI, we then prepared a computer code system, CASCSRB, to deal with heavy irradiation or the saturation damage state of solids at normal temperature. Furthermore, a code system for the simulation of heavy irradiation, CASCMARL, is available, in which the MARLOWE code is substituted for CASCADE in the CASCSRB system. (author)

  10. Medium-term energy hub management subject to electricity price and wind uncertainty

    International Nuclear Information System (INIS)

    Najafi, Arsalan; Falaghi, Hamid; Contreras, Javier; Ramezani, Maryam

    2016-01-01

    Highlights: • A new model for medium-term energy hub management is proposed. • Risk aversion is considered in medium-term energy hub management. • Stochastic programming is used to solve the medium-term energy hub management problem. • Electricity price and wind uncertainty are considered. - Abstract: Energy hubs play an important role in implementing multi-carrier energy systems. More studies are required in both their modeling and operating aspects. In this regard, this paper attempts to develop medium-term management of an energy hub in restructured power systems. A model is presented to manage an energy hub which has electrical energy and natural gas as inputs and electrical and heat energy as outputs. Electricity is procured in various ways, either purchasing it from a pool-based market and bilateral contracts, or producing it from a Combined Heat and Power (CHP) unit, a diesel generator unit and Wind Turbine Generators (WTGs). Pool prices and wind turbine production are subject to uncertainty, which makes energy management a complex puzzle. Heat demand is met by a furnace and a CHP unit. Energy hub managers must decide whether to purchase electricity from the electricity market and gas from the gas network or to produce electricity using a set of generators to meet the electrical and heat demands in the presence of uncertainties. The energy management objective is to minimize the total cost subject to several technical constraints using stochastic programming. Conditional Value at Risk (CVaR), a well-known risk measure, is used to reduce the unfavorable risk of costs. The proposed model is illustrated using a sample test case with actual prices, load and wind speed data. The results show that the minimum cost is obtained by the best decisions involving the electricity market and purchasing natural gas for gas facilities. Considering risk also increases the total expected cost and decreases the CVaR.
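
    The risk-averse objective described above is commonly written with the scenario-based Rockafellar–Uryasev formulation of CVaR shown below, where π_ω are scenario probabilities (price and wind scenarios), C_ω(x) the operating cost of decision x in scenario ω, α the confidence level and β the risk-aversion weight; the notation is generic and not necessarily the authors' exact formulation.

    ```latex
    % Risk-averse expected-cost objective with scenario-based CVaR
    % (Rockafellar–Uryasev formulation); beta in [0,1] weights risk aversion.
    \min_{x,\,\zeta,\,\eta_\omega}\;
      (1-\beta)\sum_{\omega}\pi_\omega\,C_\omega(x)
      \;+\;\beta\Big(\zeta+\frac{1}{1-\alpha}\sum_{\omega}\pi_\omega\,\eta_\omega\Big)
    \quad\text{s.t.}\quad
    \eta_\omega \ge C_\omega(x)-\zeta,\qquad \eta_\omega \ge 0\quad\forall\,\omega
    ```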

  11. Long-term clearance from small airways in subjects with ciliary dysfunction

    Directory of Open Access Journals (Sweden)

    Hjelte Lena

    2006-05-01

    Full Text Available Abstract The objective of this study was to investigate if long-term clearance from small airways is dependent on normal ciliary function. Six young adults with primary ciliary dyskinesia (PCD) inhaled 111Indium-labelled Teflon particles of 4.2 μm geometric and 6.2 μm aerodynamic diameter with an extremely slow inhalation flow, 0.05 L/s. The inhalation method deposits particles mainly in the small conducting airways. Lung retention was measured immediately after inhalation and at four occasions up to 21 days after inhalation. Results were compared with data from ten healthy controls. For additional comparison three of the PCD subjects also inhaled the test particles with normal inhalation flow, 0.5 L/s, providing a more central deposition. The lung retention at 24 h in % of lung deposition (Ret24) was significantly higher in the PCD subjects than in the healthy controls. Ret24 with slow inhalation flow was 73.9 ± 1.9 % compared to 68.9 ± 7.5 % with normal inhalation flow in the three PCD subjects exposed twice. During days 7–21 the three PCD subjects exposed twice cleared 9 % with normal flow, probably representing predominantly alveolar clearance, compared to 19 % with slow inhalation flow, probably representing mainly small airway clearance. This study shows that despite ciliary dysfunction, clearance continues in the small airways beyond 24 h. There are apparently additional clearance mechanisms present in the small airways.

  12. Does computer use pose a hazard for future long-term sickness absence?

    DEFF Research Database (Denmark)

    Andersen, JH; Mikkelsen, Sigurd

    2010-01-01

    The aim of the study was to investigate if weekly duration of computer use predicted sickness absence for more than two weeks at a later time. A cohort of 2146 frequent computer users filled in a questionnaire at baseline and was followed for one year with continuous recording of the duration of computer use, and furthermore followed for 300 weeks in a central register of sickness absence for more than 2 weeks. 147 participants of the 2,146 (6.9%) became first-time sick-listed in the follow-up period. Overall, mean weekly computer use did not turn out to be a risk factor for later sickness absence... ...and upper limb, and it seems that computer use neither predicts future long-term sickness absence of all causes...

  13. A report on evaluation of research and development subjects in fiscal year 2001. Evaluation subject on the 'Middle- and long-term business program'

    International Nuclear Information System (INIS)

    2001-09-01

    The middle- and long-term business program drawn up by the Japan Nuclear Cycle Development Institute (JNC) sets out the middle- and long-term targets JNC intends to pursue and serves as the basis for promoting individual R and D. The program is to be revised on the occasion of the new long-term plan for research, development and application of nuclear energy established by the Committee of Atomic Energy in November 2000, taking into consideration the changes in circumstances since March 1999. This report summarizes, in the form of opinions, the evaluation results on the present middle- and long-term business program established by JNC, focusing on its revised portions. The evaluation results are presented by the two subject evaluation committees, one on the fast reactor and fuel cycle and one on waste processing and disposal. (G.K.)

  14. Pedagogical Factors Stimulating the Self-Development of Students' Multi-Dimensional Thinking in Terms of Subject-Oriented Teaching

    Science.gov (United States)

    Andreev, Valentin I.

    2014-01-01

    The main aim of this research is to disclose the essence of students' multi-dimensional thinking and to reveal the ranking of factors that stimulate more effective self-development of students' multi-dimensional thinking in terms of subject-oriented teaching. Subject-oriented learning is characterized as a type of learning where…

  15. Acute, subacute and long-term subjective effects of psilocybin in healthy humans: a pooled analysis of experimental studies.

    Science.gov (United States)

    Studerus, Erich; Kometer, Michael; Hasler, Felix; Vollenweider, Franz X

    2011-11-01

    Psilocybin and related hallucinogenic compounds are increasingly used in human research. However, due to limited information about potential subjective side effects, the controlled medical use of these compounds has remained controversial. We therefore analysed acute, short- and long-term subjective effects of psilocybin in healthy humans by pooling raw data from eight double-blind placebo-controlled experimental studies conducted between 1999 and 2008. The analysis included 110 healthy subjects who had received 1-4 oral doses of psilocybin (45-315 µg/kg body weight). Although psilocybin dose-dependently induced profound changes in mood, perception, thought and self-experience, most subjects described the experience as pleasurable, enriching and non-threatening. Acute adverse drug reactions, characterized by strong dysphoria and/or anxiety/panic, occurred only in the two highest dose conditions in a relatively small proportion of subjects. All acute adverse drug reactions were successfully managed by providing interpersonal support and did not need psychopharmacological intervention. Follow-up questionnaires indicated no subsequent drug abuse, persisting perception disorders, prolonged psychosis or other long-term impairment of functioning in any of our subjects. The results suggest that the administration of moderate doses of psilocybin to healthy, high-functioning and well-prepared subjects in the context of a carefully monitored research environment is associated with an acceptable level of risk.

  16. [Efficiency of computer-based documentation in long-term care--preliminary project].

    Science.gov (United States)

    Lüngen, Markus; Gerber, Andreas; Rupprecht, Christoph; Lauterbach, Karl W

    2008-06-01

    In Germany the documentation of processes in long-term care is mainly paper-based. Planning, realization and evaluation are not supported in an optimal way. In a preliminary study we evaluated the consequences of the introduction of a computer-based documentation system using handheld devices. We interviewed 16 persons before and after introducing the computer-based documentation and assessed costs for the documentation process and administration. The results show that reducing costs is likely. The job satisfaction of the personnel increased, and more time could be spent caring for the residents. We suggest further research to reach conclusive results.

  17. The Evaluation of CEIT Teacher Candidates in Terms of Computer Games, Educational Use of Computer Games and Game Design Qualifications

    Directory of Open Access Journals (Sweden)

    Hakkı BAĞCI

    2014-04-01

    Full Text Available Computer games have an important usage potential in the education of today’s digital student profile. Computer teachers, known as technology leaders in schools, are also the main stakeholders of this potential. In this study, opinions of computer teachers about computer games are examined from different perspectives. 119 computer teacher candidates participated in this study, and the data were collected by a questionnaire. As a result of this study, computer teacher candidates think positively about the usage of computer games in education and see themselves qualified for the analysis and design of educational games. However, they partly have negative attitudes about some risks such as addiction and loss of time. The candidates who attended educational game courses and play games on their mobile phones also have more positive opinions, and they see themselves as more qualified than others. Males have more positive opinions about computer games than females, but in terms of educational games and the analysis and design of computer games, there is no significant difference between males and females.

  18. High-resolution subject-specific mitral valve imaging and modeling: experimental and computational methods.

    Science.gov (United States)

    Toma, Milan; Bloodworth, Charles H; Einstein, Daniel R; Pierce, Eric L; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S

    2016-12-01

    The diversity of mitral valve (MV) geometries and multitude of surgical options for correction of MV diseases necessitates the use of computational modeling. Numerical simulations of the MV would allow surgeons and engineers to evaluate repairs, devices, procedures, and concepts before performing them and before moving on to more costly testing modalities. Constructing, tuning, and validating these models rely upon extensive in vitro characterization of valve structure, function, and response to change due to diseases. Micro-computed tomography (μCT) allows for unmatched spatial resolution for soft tissue imaging. However, it is still technically challenging to obtain an accurate geometry of the diastolic MV. We discuss here the development of a novel technique for treating MV specimens with glutaraldehyde fixative in order to minimize geometric distortions in preparation for μCT scanning. The technique provides a resulting MV geometry which is significantly more detailed in chordal structure, accurate in leaflet shape, and closer to its physiological diastolic geometry. In this paper, computational fluid-structure interaction (FSI) simulations are used to show the importance of more detailed subject-specific MV geometry with 3D chordal structure to simulate a proper closure validated against μCT images of the closed valve. Two computational models, before and after use of the aforementioned technique, are used to simulate closure of the MV.

  19. LONG TERM EFFECT OF CYRIAX PHYSIOTHERAPY WITH SUPERVISED EXERCISE PROGRAM IN SUBJECTS WITH TENNIS ELBOW

    Directory of Open Access Journals (Sweden)

    Pallavi Shridhar Thakare

    2014-06-01

    Full Text Available Background: The purpose is to find the long-term effect of Cyriax physiotherapy with a supervised exercise program in the reduction of pain and improvement of functional ability for subjects with tennis elbow. Method: An experimental study design; 30 subjects with tennis elbow were randomized, 15 subjects each, into Study and Control groups. The Control group received a supervised exercise program while the Study group received Cyriax physiotherapy with a supervised exercise program, three times a week for 4 weeks, with a post-intervention follow-up after 2 weeks. Outcomes were measured using the Visual Analogue Scale (VAS) for pain and the Patient Rated Tennis Elbow Evaluation (PRTEE) for functional ability. Results: There was no statistically significant difference in pre-intervention means of VAS and PRTEE when compared between the groups using the independent ‘t’ test as a parametric and the Mann-Whitney U test as a non-parametric test. When means of post-intervention and follow-up measurements were compared, there was a statistically significant (p<0.05) difference in VAS and PRTEE scores between the groups, with a greater percentage of improvement in the study group than in the control group. Conclusion: There is a significant long-term effect, with a greater percentage of improvement in pain and functional ability up to the 2-week follow-up, of 4 weeks of combined Cyriax physiotherapy with a supervised exercise program compared with a supervised exercise program alone for subjects with tennis elbow.

  20. Computer-assisted analysis of cervical vertebral bone age using cephalometric radiographs in Brazilian subjects.

    Science.gov (United States)

    Caldas, Maria de Paula; Ambrosano, Gláucia Maria Bovi; Haiter Neto, Francisco

    2010-01-01

    The aims of this study were to develop a computerized program for objectively evaluating skeletal maturation on cephalometric radiographs, and to apply the new method to Brazilian subjects. The samples were taken from the patient files of Oral Radiological Clinics from the North, Northeast, Midwest and South regions of the country. A total of 717 subjects aged 7.0 to 15.9 years who had lateral cephalometric radiographs and hand-wrist radiographs were selected. A cervical vertebral computerized analysis was created in the Radiocef Studio 2 computer software for digital cephalometric analysis, and cervical vertebral bone age was calculated using the formulas developed by Caldas et al. (2007). Hand-wrist bone age was evaluated by the TW3 method. Analysis of variance (ANOVA) and the Tukey test were used to compare cervical vertebral bone age, hand-wrist bone age and chronological age (P < 0.05). No significant difference was found between cervical vertebral bone age and chronological age in all regions studied. When analyzing bone age, it was possible to observe a statistically significant difference between cervical vertebral bone age and hand-wrist bone age for female and male subjects in the North and Northeast regions, as well as for male subjects in the Midwest region. No significant difference was observed between bone age and chronological age in all regions except for male subjects in the North and female subjects in the Northeast. Using cervical vertebral bone age, it might be possible to evaluate skeletal maturation in an objective manner using cephalometric radiographs.

  1. Virtual materiality, potentiality and gendered subjectivity

    DEFF Research Database (Denmark)

    Søndergaard, Dorte Marie

    How do we conceptualize virtual materiality, in terms of, for instance, avatars and weapons in computer games, virtual discourse, subjectivity and the enactment of masculinity as phenomena intra-acting with real-life materiality, discourse, subjectivity and masculinity in children’s everyday lives? How do we understand the intra-activity of such elements in children’s night dreams? These are some of the questions discussed in this paper. I bring together Karen Barad’s agential realism and Giorgio Agamben’s concept of potentiality to enable and refine an analytical approach to real-virtual enactments, thereby questioning the potentialities of gaming, of movies and of dreams as they enter intra-activities in the comprehensive set of apparatuses that enact gendered agency and relational practices. The analyses and conceptual refinements are based on empirical cases involving interviews...

  2. Long-term stress distribution patterns of the ankle joint in varus knee alignment assessed by computed tomography osteoabsorptiometry.

    Science.gov (United States)

    Onodera, Tomohiro; Majima, Tokifumi; Iwasaki, Norimasa; Kamishima, Tamotsu; Kasahara, Yasuhiko; Minami, Akio

    2012-09-01

    The stress distribution of an ankle under various physiological conditions is important for long-term survival of total ankle arthroplasty. The aim of this study was to measure subchondral bone density across the distal tibial joint surface in patients with malalignment/instability of the lower limb. We evaluated subchondral bone density across the distal tibial joint in patients with malalignment/instability of the knee by computed tomography (CT) osteoabsorptiometry from ten ankles as controls and from 27 ankles with varus deformity/instability of the knee. The quantitative analysis focused on the location of the high-density area at the articular surface, to determine the resultant long-term stress on the ankle joint. The area of maximum density of subchondral bone was located in the medial part in all subjects. The pattern of maximum density in the anterolateral area showed stepwise increases with the development of varus deformity/instability of the knee. Our results should prove helpful for designing new prostheses and determining clinical indications for total ankle arthroplasty.

  3. Patterns of similarity and difference between the vocabularies of psychology and other subjects.

    Science.gov (United States)

    Benjafield, John G

    2014-02-01

    The vocabulary of Anglophone psychology is shared with many other subjects. Previous research using the Oxford English Dictionary has shown that the subjects having the most words in common with psychology are biology, chemistry, computing, electricity, law, linguistics, mathematics, medicine, music, pathology, philosophy, and physics. The present study presents a database of the vocabularies of these 12 subjects that is similar to one previously constructed for psychology, enabling the histories of the vocabularies of these subjects to be compared with each other as well as with psychology. All subjects have a majority of word senses that are metaphorical. However, psychology is not among the most metaphorical of subjects, a distinction belonging to computing, linguistics, and mathematics. Indeed, the history of other subjects shows an increasing tendency to recycle old words and give them new, metaphorical meanings. The history of psychology shows an increasing tendency to invent new words rather than metaphorical senses of existing words. These results were discussed in terms of the degree to which psychology's vocabulary remains unsettled in comparison with other subjects. The possibility was raised that the vocabulary of psychology is in a state similar to that of chemistry prior to Lavoisier.

  4. Computer-assisted analysis of cervical vertebral bone age using cephalometric radiographs in Brazilian subjects

    Directory of Open Access Journals (Sweden)

    Maria de Paula Caldas

    2010-03-01

    Full Text Available The aims of this study were to develop a computerized program for objectively evaluating skeletal maturation on cephalometric radiographs, and to apply the new method to Brazilian subjects. The samples were taken from the patient files of Oral Radiological Clinics from the North, Northeast, Midwest and South regions of the country. A total of 717 subjects aged 7.0 to 15.9 years who had lateral cephalometric radiographs and hand-wrist radiographs were selected. A cervical vertebral computerized analysis was created in the Radiocef Studio 2 computer software for digital cephalometric analysis, and cervical vertebral bone age was calculated using the formulas developed by Caldas et al. (2007). Hand-wrist bone age was evaluated by the TW3 method. Analysis of variance (ANOVA) and the Tukey test were used to compare cervical vertebral bone age, hand-wrist bone age and chronological age (P < 0.05). No significant difference was found between cervical vertebral bone age and chronological age in all regions studied. When analyzing bone age, it was possible to observe a statistically significant difference between cervical vertebral bone age and hand-wrist bone age for female and male subjects in the North and Northeast regions, as well as for male subjects in the Midwest region. No significant difference was observed between bone age and chronological age in all regions except for male subjects in the North and female subjects in the Northeast. Using cervical vertebral bone age, it might be possible to evaluate skeletal maturation in an objective manner using cephalometric radiographs.

  5. Metacognition of visual short-term memory: Dissociation between objective and subjective components of VSTM

    Directory of Open Access Journals (Sweden)

    Silvia eBona

    2013-02-01

    Full Text Available The relationship between the objective accuracy of visual short-term memory (VSTM) representations and their subjective conscious experience is unknown. We investigated this issue by assessing how the objective and subjective components of VSTM in a delayed cue-target orientation discrimination task are affected by intervening distracters. On each trial, participants were shown a memory cue (a grating), the orientation of which they were asked to hold in memory. On approximately half of the trials, a distractor grating appeared during the maintenance interval; its orientation was either identical to that of the memory cue, or it differed by 10 or 40 degrees. The distractors were masked and presented briefly, so they were only consciously perceived on a subset of trials. At the end of the delay period, a memory test probe was presented, and participants were asked to indicate whether it was tilted to the left or right relative to the memory cue (VSTM accuracy; objective performance). In order to assess subjective metacognition, participants were asked to indicate the vividness of their memory for the original memory cue. Finally, participants were asked to rate their awareness of the distracter. Results showed that objective VSTM performance was impaired by distractors only when the distractors were very different from the cue, and that this occurred with both subjectively visible and invisible distractors. Subjective metacognition, however, was impaired by distractors of all orientations, but only when these distractors were subjectively invisible. Our results thus indicate that the objective and subjective components of VSTM are to some extent dissociable.

  6. A Hybrid System for Subjectivity Analysis

    Directory of Open Access Journals (Sweden)

    Samir Rustamov

    2018-01-01

    Full Text Available We suggested different structured hybrid systems for sentence-level subjectivity analysis based on three supervised machine learning algorithms, namely, Hidden Markov Model, Fuzzy Control System, and Adaptive Neuro-Fuzzy Inference System. The suggested feature extraction algorithm in our experiment computes a feature vector using statistical textual term frequencies in a training dataset, without the use of any lexical knowledge except tokenization. Owing to this, the above-mentioned methods may be employed in other languages, as they do not rely on morphological, syntactical, or lexical analysis in the classification problems.
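
    A minimal sketch of the kind of feature extraction described above (term-frequency vectors built with tokenization only, no lexicons or morphological analysis) is shown below; the toy sentences, vocabulary construction and normalization are illustrative assumptions.

    ```python
    # Minimal sketch: sentence-level feature vectors from raw term frequencies,
    # using only tokenization (no stemming, POS tagging or lexicons).
    # Vocabulary construction and normalization choices are illustrative assumptions.
    from collections import Counter
    import numpy as np

    train_sentences = [
        "the camera is absolutely wonderful",      # subjective
        "i think the plot was boring and weak",    # subjective
        "the package contains two cables",         # objective
        "the film was released in 2003",           # objective
    ]

    def tokenize(text):
        return text.lower().split()                # tokenization is the only preprocessing

    # Build the vocabulary from the training data only.
    vocab = sorted({tok for s in train_sentences for tok in tokenize(s)})
    index = {tok: i for i, tok in enumerate(vocab)}

    def feature_vector(sentence):
        counts = Counter(tokenize(sentence))
        vec = np.zeros(len(vocab))
        for tok, n in counts.items():
            if tok in index:
                vec[index[tok]] = n
        return vec / max(1, sum(counts.values()))  # relative term frequencies

    X = np.stack([feature_vector(s) for s in train_sentences])
    print(X.shape)  # (n_sentences, vocabulary size), ready for the HMM/fuzzy classifiers
    ```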

  7. Computational assessment of hemodynamics-based diagnostic tools using a database of virtual subjects: Application to three case studies.

    Science.gov (United States)

    Willemet, Marie; Vennin, Samuel; Alastruey, Jordi

    2016-12-08

    Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested to assess theoretically these computed tools: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encloses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three different case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
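
    As an illustration of the first case study, pulse wave velocity is commonly estimated as the path length between two measurement sites divided by the pulse transit time between the waveform feet. The sketch below applies that definition to synthetic waveforms; the signal model, foot-detection rule and path length are assumptions and not the specific indexes assessed in the paper.

    ```python
    # Minimal sketch: pulse wave velocity estimated as path length divided by the
    # transit time between the "feet" of two pressure waveforms, the foot being taken
    # at the maximum of the second derivative. The synthetic signals, the foot rule
    # and the path length are assumptions, not the indexes assessed in the paper.
    import numpy as np

    fs = 1000.0                                   # sampling rate (Hz)
    t = np.arange(0.0, 1.0, 1.0 / fs)

    def pulse(t0):                                # crude single-beat pressure pulse (mmHg)
        x = np.clip(t - t0, 0.0, None)
        return 80.0 + 40.0 * (x / 0.08) * np.exp(1.0 - x / 0.08)

    proximal = pulse(0.20)                        # e.g. carotid site
    distal = pulse(0.27)                          # e.g. femoral site, arriving later

    def foot_index(wave):
        d2 = np.gradient(np.gradient(wave))
        return int(np.argmax(d2))                 # steepest acceleration of the upstroke

    transit_time = (foot_index(distal) - foot_index(proximal)) / fs
    path_length = 0.5                             # metres between sites (assumed)
    print(f"estimated PWV: {path_length / transit_time:.1f} m/s")
    ```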

  8. The effect of micro and macro stressors in the work environment on computer professionals' subjective health status and productive behavior in Japan.

    Science.gov (United States)

    Tominaga, Maki; Asakura, Takashi; Akiyama, Tsuyoshi

    2007-06-01

    To investigate the effect of micro and macro stressors in the work environment on the subjective health status and productive behavior of computer professionals, we conducted a web-based investigation with Japanese IT-related company employees in 53 company unions. The questionnaire consisted of individual attributes, employment characteristics, working hour characteristics, company size and profitability, personal characteristics (i.e., Growth Need Strength), micro and macro stressors scale, and four outcome scales concerning the subjective health status and productive behavior. We obtained 1,049 Japanese IT-related company employees' data (response rate: 66%), and analyzed the data of computer engineers (80%; n=871). The results of hierarchical multiple regressions showed that each full model explained 23% in psychological distress, 20% in cumulative fatigue, 44% in job dissatisfaction, and 35% in intentions to leave, respectively. In micro stressors, "quantitative and qualitative work overload" had the strongest influence on both the subjective health status and intentions to leave. Furthermore, in macro stressors, "career and future ambiguity" was the most important predictor of the subjective health status, and "insufficient evaluation systems" and "poor supervisor's support" were important predictors of productive behavior as well. These findings suggest that improving not only micro stressors but also macro stressors will enhance the subjective health status and increase the productive behavior of computer professionals in Japan.

  9. Computer Simulations of Developmental Change: The Contributions of Working Memory Capacity and Long-Term Knowledge

    Science.gov (United States)

    Jones, Gary; Gobet, Fernand; Pine, Julian M.

    2008-01-01

    Increasing working memory (WM) capacity is often cited as a major influence on children's development and yet WM capacity is difficult to examine independently of long-term knowledge. A computational model of children's nonword repetition (NWR) performance is presented that independently manipulates long-term knowledge and WM capacity to determine…

  10. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  11. Synthesis of 13C-labelled lactose for metabolic studies in subjects with gastrointestinal disorders

    International Nuclear Information System (INIS)

    Moyna, P.

    1993-01-01

    The long-range goals included development of a 13C-labelled lactose method for measuring lactose malabsorption in patients with diarrhea. The short-term goals included assembling a nuclear magnetic resonance system and a computer system for spectra analysis. The latter results are the subject of the report. (author)

  12. Nonparametric Monitoring for Geotechnical Structures Subject to Long-Term Environmental Change

    Directory of Open Access Journals (Sweden)

    Hae-Bum Yun

    2011-01-01

    Full Text Available A nonparametric, data-driven methodology of monitoring for geotechnical structures subject to long-term environmental change is discussed. Avoiding physical assumptions or excessive simplification of the monitored structures, the nonparametric monitoring methodology presented in this paper provides reliable performance-related information particularly when the collection of sensor data is limited. For the validation of the nonparametric methodology, a field case study was performed using a full-scale retaining wall, which had been monitored for three years using three tilt gauges. Using the very limited sensor data, it is demonstrated that important performance-related information, such as drainage performance and sensor damage, could be disentangled from significant daily, seasonal and multiyear environmental variations. Extensive literature review on recent developments of parametric and nonparametric data processing techniques for geotechnical applications is also presented.

  13. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    Science.gov (United States)

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience-approaching subjective behavior as the result of mental computations instantiated in the brain-to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.

  14. Subject-specific computational modeling of DBS in the PPTg area

    Directory of Open Access Journals (Sweden)

    Laura M. Zitella

    2015-07-01

    Full Text Available Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson’s disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulating fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 µs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (-1.0 to -1.4 mA). These current amplitudes followed closely with model predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS.

  15. Subject Reference Lists Produced by Computer

    Directory of Open Access Journals (Sweden)

    Ching-chih Chen

    1968-08-01

    Full Text Available A system developed to produce fourteen subject reference lists by IBM 360/75 is described in detail. The computerized system has many advantages over conventional manual procedures. The feedback from students and other users is discussed, and some analysis of cost is included.

  16. Computation of diverging sums based on a finite number of terms

    Science.gov (United States)

    Lv, Q. Z.; Norris, S.; Pelphrey, R.; Su, Q.; Grobe, R.

    2017-10-01

    We propose a numerical method that permits us to compute the sum of a diverging series from only the first N terms by generalizing the traditional Borel technique. The method is rather robust and can be used to recover the ground state energy from the diverging perturbation theory for quantum field theoretical systems that are spatially constrained. Surprisingly, even the corresponding eigenvectors can be generated despite the intrinsic non-perturbative nature of bound state problems.
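
    For orientation, the sketch below recovers the value of Euler's divergent series from its first N terms using standard Borel-Pade resummation. It illustrates the general idea of summing a divergent series from finitely many terms, but it is not the generalized technique proposed in the record.

    ```python
    # Minimal sketch: recovering the value of a divergent series from its first N terms
    # via standard Borel-Pade resummation. This illustrates the general idea only; it is
    # not the generalized technique proposed in the record.
    import numpy as np
    from math import factorial
    from scipy.interpolate import pade
    from scipy.integrate import quad

    x, N = 0.5, 12
    # Euler's divergent series: sum_n (-1)^n n! x^n
    a = np.array([(-1) ** n * factorial(n) * x ** n for n in range(N)])

    # Borel transform: divide the n-th term by n!; here it has the geometric form (-x*t)^n,
    # so a Pade approximant with a first-order denominator already captures 1/(1 + x*t).
    b = np.array([a[n] / factorial(n) for n in range(N)])
    p, q = pade(b, 1)

    borel_sum, _ = quad(lambda t: np.exp(-t) * p(t) / q(t), 0, np.inf)
    exact, _ = quad(lambda t: np.exp(-t) / (1 + x * t), 0, np.inf)  # known Borel integral

    print(f"partial sum of first {N} terms: {a.sum():.4f}")   # wildly off (series diverges)
    print(f"Borel-Pade estimate from {N} terms: {borel_sum:.6f}")
    print(f"exact Borel sum: {exact:.6f}")
    ```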

  17. Mid-term survival analysis of closed wedge high tibial osteotomy: A comparative study of computer-assisted and conventional techniques.

    Science.gov (United States)

    Bae, Dae Kyung; Song, Sang Jun; Kim, Kang Il; Hur, Dong; Jeong, Ho Yeon

    2016-03-01

    The purpose of the present study was to compare the clinical and radiographic results and survival rates between computer-assisted and conventional closing wedge high tibial osteotomies (HTOs). Data from a consecutive cohort comprised of 75 computer-assisted HTOs and 75 conventional HTOs were retrospectively reviewed. The Knee Society knee and function scores, Hospital for Special Surgery (HSS) score and femorotibial angle (FTA) were compared between the two groups. Survival rates were also compared with procedure failure. The knee and function scores at one year postoperatively were slightly better in the computer-assisted group than those in the conventional group (90.1 vs. 86.1) (82.0 vs. 76.0). The HSS scores at one year postoperatively were slightly better for the computer-assisted HTOs than those of conventional HTOs (89.5 vs. 81.8). The inlier of the postoperative FTA was wider in the computer-assisted group than that in the conventional HTO group (88.0% vs. 58.7%), and mean postoperative FTA was greater in the computer-assisted group than in the conventional HTO group (valgus 9.0° vs. valgus 7.6°). Clinical and radiographic results were better in the computer-assisted group than those in the conventional HTO group. Mid-term survival rates did not differ between computer-assisted and conventional HTOs. A comparative analysis of longer-term survival rate is required to demonstrate the long-term benefit of computer-assisted HTO. Level of evidence: III. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Analog Integrated Circuit Design for Spike Time Dependent Encoder and Reservoir in Reservoir Computing Processors

    Science.gov (United States)

    2018-01-01

    This multidisciplinary effort encompassed high-performance computing, nanotechnology, integrated circuits, and integrated systems. The project’s architecture was... Subject terms: neuromorphic computing, neuron design, spike...

  19. Discrete mathematics using a computer

    CERN Document Server

    Hall, Cordelia

    2000-01-01

    Several areas of mathematics find application throughout computer science, and all students of computer science need a practical working understanding of them. These core subjects are centred on logic, sets, recursion, induction, relations and functions. The material is often called discrete mathematics, to distinguish it from the traditional topics of continuous mathematics such as integration and differential equations. The central theme of this book is the connection between computing and discrete mathematics. This connection is useful in both directions: • Mathematics is used in many branches of computer science, in applications including program specification, data structures, design and analysis of algorithms, database systems, hardware design, reasoning about the correctness of implementations, and much more; • Computers can help to make the mathematics easier to learn and use, by making mathematical terms executable, making abstract concepts more concrete, and through the use of software tools su...

  20. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  1. Short and long term effectiveness of a subject's specific novel brain and vestibular rehabilitation treatment modality in combat veterans suffering from PTSD

    Directory of Open Access Journals (Sweden)

    Frederick Robert Carrick

    2015-06-01

    Full Text Available AbstractIntroduction: Treatment for post-traumatic stress disorder (PTSD in combat veterans that have a long-term positive clinical effect has the potential to modify the treatment of PTSD. This outcome may result in changed and saved lives of our service personnel and their families. In a previous before-after-intervention study we demonstrated high statistical and substantively significant short-term changes in the Clinician Administered DSM-IV PTSD Scale (CAPS scores after a two week trial of a subject's particular novel brain and vestibular rehabilitation (VR program. The long-term maintenance of PTSD severity reduction was the subject of this study.Material and Methods:We studied the short and long term effectiveness of a subject's particular novel brain and VR treatment of PTSD in subjects who had suffered combat-related traumatic brain injuries in terms of PTSD symptom reduction. The trial was registered as ClinicalTrials.gov Identifier: NCT02003352. We analyzed the difference in the CAPS scores pre and post treatment (one week and three months using our subjects as their matched controls. Results:The generalized least squares (GLS technique demonstrated that with our 26 subjects in the 3 timed groups the R2 within groups was 0.000, R2 between groups was 0.000 and overall the R2 was 0.000. The GLS regression was strongly statistically significant z = 21.29, p < 0.001, 95% CI [58.7, 70.63]. The linear predictive margins over time demonstrated strong statistical and substantive significance of decreasing PTSD severity scores for all timed CAPS tests.Discussion:Our investigation has the promise of the development of superior outcomes of treatments in this area that will benefit a global society. The length of the treatment intervention involved (two weeks is less that other currently available treatments and has profound implications for cost, duration of disability and outcomes in the treatment of PTSD in combat veterans.

  2. Medium-term generation programming in competitive environments: a new optimisation approach for market equilibrium computing

    International Nuclear Information System (INIS)

    Barquin, J.; Centeno, E.; Reneses, J.

    2004-01-01

    The paper proposes a model to represent medium-term hydro-thermal operation of electrical power systems in deregulated frameworks. The model objective is to compute the oligopolistic market equilibrium point in which each utility maximises its profit, based on other firms' behaviour. This problem is not an optimisation one. The main contribution of the paper is to demonstrate that, nevertheless, under some reasonable assumptions, it can be formulated as an equivalent minimisation problem. A computer program has been coded by using the proposed approach. It is used to compute the market equilibrium of a real-size system. (author)
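
    The central idea, that under suitable assumptions the market equilibrium conditions coincide with the optimality conditions of a single minimisation problem, can be illustrated with a textbook Cournot example; the linear demand, constant marginal costs and the particular equivalent objective below are illustrative assumptions, not the hydro-thermal model of the paper.

    ```python
    # Minimal sketch: a Cournot market equilibrium obtained by minimising a single
    # "equivalent" objective whose first-order conditions coincide with every firm's
    # profit-maximising conditions. Linear demand, constant marginal costs and all
    # numbers are illustrative assumptions, not the hydro-thermal model of the paper.
    import numpy as np
    from scipy.optimize import minimize

    a, b = 100.0, 1.0                  # inverse demand: p(Q) = a - b*Q
    c = np.array([10.0, 15.0, 20.0])   # marginal cost of each firm

    def equivalent_objective(q):
        Q = q.sum()
        # Minimising this has FOCs  a - b*Q - b*q_i - c_i = 0  (the Cournot conditions).
        return -(a * Q - 0.5 * b * Q ** 2 - 0.5 * b * np.sum(q ** 2) - c @ q)

    res = minimize(equivalent_objective, x0=np.ones(3),
                   bounds=[(0, None)] * 3, method="L-BFGS-B")
    q_eq = res.x
    foc = a - b * q_eq.sum() - b * q_eq - c   # each firm's first-order condition
    print("equilibrium quantities:", np.round(q_eq, 3))
    print("market price:", round(a - b * q_eq.sum(), 3), "| FOC residuals:", np.round(foc, 4))
    ```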

  3. PECULIARITIES OF USING THE METHODOLOGY DISTANCE LEARNING OF THE SUBJECT «ENGINEERING AND COMPUTER GRAPHICS» FOR STUDENTS STUDYING BY CORRESPONDENCE

    OpenAIRE

    Olena V. Slobodianiuk

    2010-01-01

    A large part of the distance course for the subject «Engineering and Computer Graphics» (ECG) available on the Internet looks like an electronic manual. However, the distance training process has a complicated structure and combines not only the study of theoretical material but also collaboration between students and a teacher and work in groups. A methodology for distance learning of ECG is proposed. This methodology was developed and researched at the Faculty of Engineering and Computer Graphics of Vinnitsa National Te...

  4. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
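
    The computational principle can be illustrated with a generic echo state network, in which a random recurrent system stands in for the soft arm's body dynamics and a linear readout is trained to recall earlier inputs, i.e. to exploit the system's short-term memory. The network size, spectral radius and delay below are illustrative assumptions, not the physical setup of the study.

    ```python
    # Minimal sketch: a random recurrent "reservoir" standing in for soft body dynamics,
    # with a linear readout trained to recall the input from a few steps earlier
    # (a short-term memory task). Network size, spectral radius and the delay are
    # illustrative assumptions, not the physical silicone-arm setup of the study.
    import numpy as np

    rng = np.random.default_rng(0)
    n_res, n_steps, delay = 200, 2000, 5

    u = rng.uniform(-1, 1, n_steps)                    # random input stream
    W_in = rng.uniform(-0.5, 0.5, n_res)
    W = rng.normal(0, 1, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius < 1 (fading memory)

    x = np.zeros(n_res)
    states = np.zeros((n_steps, n_res))
    for t in range(n_steps):
        x = np.tanh(W @ x + W_in * u[t])               # nonlinear "body" dynamics
        states[t] = x

    # Ridge-regression readout: predict u(t - delay) from the state at time t.
    X, y = states[delay:], u[:-delay]
    w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
    print("in-sample memory-task correlation:",
          round(np.corrcoef(y, X @ w)[0, 1], 3))
    ```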

  5. Subject-Verb Agreement and Verbal Short-Term Memory: A Perspective from Greek Children with Specific Language Impairment

    Science.gov (United States)

    Lalioti, Marina; Stavrakaki, Stavroula; Manouilidou, Christina; Talli, Ioanna

    2016-01-01

    This study investigated the performance of school age Greek-speaking children with SLI on verbal short-term memory (VSTM) and Subject-Verb (S-V) agreement in comparison to chronological age controls and younger typically developing children. VSTM abilities were assessed by means of a non-word repetition task (NRT) and an elicited production task,…

  6. Procedural Memory: Computer Learning in Control Subjects and in Parkinson’s Disease Patients

    Directory of Open Access Journals (Sweden)

    C. Thomas-Antérion

    1996-01-01

    Full Text Available We used perceptual motor tasks involving the learning of mouse control by looking at a Macintosh computer screen. We studied 90 control subjects aged between sixteen and seventy-five years. There was a significant difference in completion time between the age groups, but improvement was the same for all subjects. We also studied 24 patients with Parkinson's disease (PD). We observed an influence of age and also of educational levels. The PD patients had difficulties learning in all tests but they did not show differences in time when compared to the control group in the first learning session (Student's t-test). They learned two or four and a half times less well than the control group. In the first test, they had some difficulty in initiating the procedure and learned eight times less well than the control group. Performances seemed to be heterogeneous: patients with only tremor (seven) and patients without treatment (five) performed better than others but learned less. Success in procedural tasks for the PD group seemed to depend on the capacity to initiate the response and not on the development of an accurate strategy. Many questions still remain unanswered, and we have to study different kinds of implicit memory tasks to differentiate performance in control and basal ganglia groups.

  7. Mandibular dimensions of subjects with asymmetric skeletal class III malocclusion and normal occlusion compared with cone-beam computed tomography.

    Science.gov (United States)

    Lee, HyoYeon; Bayome, Mohamed; Kim, Seong-Hun; Kim, Ki Beom; Behrents, Rolf G; Kook, Yoon-Ah

    2012-08-01

    The purpose of this study was to use cone-beam computed tomography to compare mandibular dimensions in subjects with asymmetric skeletal Class III malocclusion and those with normal occlusion. Cone-beam computed tomography scans of 38 subjects with normal occlusion and 28 patients with facial asymmetry were evaluated and digitized with Invivo software (Anatomage, San Jose, Calif). Three midsagittal and 13 right and left measurements were taken. The paired t test was used to compare the right and left sides in each group. The Mann-Whitney U test was used to compare the midsagittal variables and the differences between the 2 sides of the group with normal occlusion with those of asymmetry patients. The posterior part of the mandibular body showed significant differences between the deviated and nondeviated sides in asymmetric Class III patients. The difference of the asymmetry group was significantly greater than that of the normal occlusion group for the mediolateral ramal and the anteroposterior condylar inclinations (P = 0.007 and P = 0.019, respectively). The asymmetric skeletal Class III group showed significant differences in condylar height, ramus height, and posterior part of the mandibular body compared with the subjects with normal occlusion. These results might be useful for diagnosis and treatment planning of asymmetric Class III patients. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  8. An Alternative Method to Compute the Bit Error Probability of Modulation Schemes Subject to Nakagami-m Fading

    Directory of Open Access Journals (Sweden)

    Madeiro Francisco

    2010-01-01

    Full Text Available Abstract This paper presents an alternative method for determining exact expressions for the bit error probability (BEP of modulation schemes subject to Nakagami- fading. In this method, the Nakagami- fading channel is seen as an additive noise channel whose noise is modeled as the ratio between Gaussian and Nakagami- random variables. The method consists of using the cumulative density function of the resulting noise to obtain closed-form expressions for the BEP of modulation schemes subject to Nakagami- fading. In particular, the proposed method is used to obtain closed-form expressions for the BEP of -ary quadrature amplitude modulation ( -QAM, -ary pulse amplitude modulation ( -PAM, and rectangular quadrature amplitude modulation ( -QAM under Nakagami- fading. The main contribution of this paper is to show that this alternative method can be used to reduce the computational complexity for detecting signals in the presence of fading.

  9. A Computational Framework to Optimize Subject-Specific Hemodialysis Blood Flow Rate to Prevent Intimal Hyperplasia

    Science.gov (United States)

    Mahmoudzadeh, Javid; Wlodarczyk, Marta; Cassel, Kevin

    2017-11-01

    Development of excessive intimal hyperplasia (IH) in the cephalic vein of renal failure patients who receive chronic hemodialysis treatment results in vascular access failure and multiple treatment complications. Specifically, cephalic arch stenosis (CAS) is known to exacerbate hypertensive blood pressure, thrombosis, and subsequent cardiovascular incidents that would necessitate costly interventional procedures with low success rates. It has been hypothesized that excessive blood flow rate post access maturation which strongly violates the venous homeostasis is the main hemodynamic factor that orchestrates the onset and development of CAS. In this article, a computational framework based on a strong coupling of computational fluid dynamics (CFD) and shape optimization is proposed that aims to identify the effective blood flow rate on a patient-specific basis that avoids the onset of CAS while providing the adequate blood flow rate required to facilitate hemodialysis. This effective flow rate can be achieved through implementation of Miller's surgical banding method after the maturation of the arteriovenous fistula and is rooted in the relaxation of wall stresses back to a homeostatic target value. The results are indicative that this optimized hemodialysis blood flow rate is, in fact, a subject-specific value that can be assessed post vascular access maturation and prior to the initiation of chronic hemodialysis treatment as a mitigative action against CAS-related access failure. This computational technology can be employed for individualized dialysis treatment.

  10. A computational analysis of the long-term regulation of arterial pressure.

    Science.gov (United States)

    Beard, Daniel A; Pettersen, Klas H; Carlson, Brian E; Omholt, Stig W; Bugenhagen, Scott M

    2013-01-01

    The asserted dominant role of the kidneys in the chronic regulation of blood pressure and in the etiology of hypertension has been debated since the 1970s. At the center of the theory is the observation that the acute relationships between arterial pressure and urine production (the acute pressure-diuresis and pressure-natriuresis curves) physiologically adapt to perturbations in pressure and/or changes in the rate of salt and volume intake. These adaptations, modulated by various interacting neurohumoral mechanisms, result in chronic relationships between water and salt excretion and pressure that are much steeper than the acute relationships. While the view that renal function is the dominant controller of arterial pressure has been supported by computer models of the cardiovascular system known as the "Guyton-Coleman model", no unambiguous description of a computer model capturing chronic adaptation of acute renal function in blood pressure control has been presented. Here, such a model is developed with the goals of: 1. representing the relevant mechanisms in an identifiable mathematical model; 2. identifying model parameters using appropriate data; 3. validating model predictions in comparison to data; and 4. probing hypotheses regarding the long-term control of arterial pressure and the etiology of primary hypertension. The developed model reveals that long-term control of arterial blood pressure is mediated primarily through the baroreflex arc and the renin-angiotensin system, and that arterial stiffening provides a sufficient explanation for the etiology of primary hypertension associated with ageing. Furthermore, the model provides the first consistent explanation of the physiological response to chronic stimulation of the baroreflex.

  11. Short- and long-term subjective medical treatment outcome of trauma surgery patients: the importance of physician empathy

    Directory of Open Access Journals (Sweden)

    Steinhausen S

    2014-09-01

    Full Text Available Simone Steinhausen,1 Oliver Ommen,2 Sunya-Lee Antoine,1 Thorsten Koehler,3 Holger Pfaff,4 Edmund Neugebauer1 1Institute for Research in Operative Medicine (IFOM), Witten/Herdecke University, Campus Cologne-Merheim, Germany; 2Federal Centre for Health Education (BZgA), Cologne, Germany; 3Institute for Applied Social Sciences (infas), Bonn, Germany; 4Institute for Medical Sociology, Health Services Research and Rehabilitation Science (IMVR), Faculty of Human Science and Faculty of Medicine, University of Cologne, Germany Purpose: To investigate accident casualties’ long-term subjective evaluation of treatment outcome 6 weeks and 12 months after discharge and its relation to the experienced surgeon’s empathy during hospital treatment after trauma in consideration of patient-, injury-, and health-related factors. The long-term results are compared to the 6-week follow-up outcomes. Patients and methods: Two hundred and seventeen surgery patients were surveyed at 6 weeks, and 206 patients at 12 months after discharge from the trauma surgical general ward. The subjective evaluation of medical treatment outcome was measured 6 weeks and 12 months after discharge with the respective scale from the Cologne Patient Questionnaire. Physician Empathy was assessed with the Consultation and Relational Empathy Measure. The correlation between physician empathy and control variables with the subjective evaluation of medical treatment outcome 12 months after discharge was identified by means of logistic regression analysis under control of sociodemographic and injury-related factors. Results: One hundred and thirty-six patients were included within the logistic regression analysis at the 12-month follow-up. Compared to the 6-week follow-up, the level of subjective evaluation of medical treatment outcome was slightly lower and the association with physician empathy was weaker. Compared to patients who rated the empathy of their surgeon lower than 31 points, patients

  12. Service task partition and distribution in star topology computer grid subject to data security constraints

    Energy Technology Data Exchange (ETDEWEB)

    Xiang Yanping [Collaborative Autonomic Computing Laboratory, School of Computer Science, University of Electronic Science and Technology of China (China); Levitin, Gregory, E-mail: levitin@iec.co.il [Collaborative Autonomic Computing Laboratory, School of Computer Science, University of Electronic Science and Technology of China (China); Israel electric corporation, P. O. Box 10, Haifa 31000 (Israel)

    2011-11-15

    The paper considers grid computing systems in which the resource management systems (RMS) can divide service tasks into execution blocks (EBs) and send these blocks to different resources. In order to provide a desired level of service reliability the RMS can assign the same blocks to several independent resources for parallel execution. The data security is a crucial issue in distributed computing that affects the execution policy. By the optimal service task partition into the EBs and their distribution among resources, one can achieve the greatest possible service reliability and/or expected performance subject to data security constraints. The paper suggests an algorithm for solving this optimization problem. The algorithm is based on the universal generating function technique and on the evolutionary optimization approach. Illustrative examples are presented. - Highlights: > Grid service with star topology is considered. > An algorithm for evaluating service reliability and data security is presented. > A tradeoff between the service reliability and data security is analyzed. > A procedure for optimal service task partition and distribution is suggested.
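    As a toy sketch of the optimization problem described in this record (not the universal generating function model or the specific evolutionary algorithm of the paper), the following code maximizes service reliability over block-to-resource assignments with a simple (1+1) evolutionary search, subject to an assumed data-security constraint. All probabilities and the simplified security model are invented for illustration.

```python
import math
import random

# Toy model: each execution block (EB) is replicated on a subset of resources.
# A block succeeds if at least one of its resources works; the service succeeds
# if every block succeeds.  Data stays secure only if no resource that received
# a block is compromised.  These numbers are assumptions, not from the paper.
P_WORK = [0.90, 0.85, 0.95, 0.80, 0.75]  # P(resource completes its block)
P_SAFE = [0.98, 0.95, 0.99, 0.90, 0.97]  # P(resource is not compromised)
N_BLOCKS, MIN_SECURITY = 4, 0.90


def reliability(assign):
    # Service reliability: every block must succeed on at least one resource.
    return math.prod(1.0 - math.prod(1.0 - P_WORK[r] for r in blk) for blk in assign)


def security(assign):
    # Data security: none of the resources actually used may be compromised.
    used = set().union(*assign)
    return math.prod(P_SAFE[r] for r in used)


def fitness(assign):
    return reliability(assign) if security(assign) >= MIN_SECURITY else 0.0


def mutate(assign):
    new = [set(blk) for blk in assign]
    blk = random.choice(new)
    r = random.randrange(len(P_WORK))
    blk.symmetric_difference_update({r})  # add or drop one resource
    if not blk:
        blk.add(r)                        # keep every block assigned somewhere
    return new


def evolve(steps=5000, seed=1):
    # (1+1) evolutionary search: keep the mutant if it is at least as fit.
    random.seed(seed)
    best = [{random.randrange(len(P_WORK))} for _ in range(N_BLOCKS)]
    for _ in range(steps):
        cand = mutate(best)
        if fitness(cand) >= fitness(best):
            best = cand
    return best, reliability(best), security(best)


if __name__ == "__main__":
    print(evolve())
```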

  13. Service task partition and distribution in star topology computer grid subject to data security constraints

    International Nuclear Information System (INIS)

    Xiang Yanping; Levitin, Gregory

    2011-01-01

    The paper considers grid computing systems in which the resource management systems (RMS) can divide service tasks into execution blocks (EBs) and send these blocks to different resources. In order to provide a desired level of service reliability the RMS can assign the same blocks to several independent resources for parallel execution. The data security is a crucial issue in distributed computing that affects the execution policy. By the optimal service task partition into the EBs and their distribution among resources, one can achieve the greatest possible service reliability and/or expected performance subject to data security constraints. The paper suggests an algorithm for solving this optimization problem. The algorithm is based on the universal generating function technique and on the evolutionary optimization approach. Illustrative examples are presented. - Highlights: → Grid service with star topology is considered. → An algorithm for evaluating service reliability and data security is presented. → A tradeoff between the service reliability and data security is analyzed. → A procedure for optimal service task partition and distribution is suggested.

  14. Emotion-based decision-making in healthy subjects: short-term effects of reducing dopamine levels.

    Science.gov (United States)

    Sevy, Serge; Hassoun, Youssef; Bechara, Antoine; Yechiam, Eldad; Napolitano, Barbara; Burdick, Katherine; Delman, Howard; Malhotra, Anil

    2006-10-01

    Converging evidence from animal and human studies suggests that addiction is associated with dopaminergic dysfunction in brain reward circuits. So far, it is unclear what aspects of addictive behaviors are related to a dopaminergic dysfunction. We hypothesize that a decrease in dopaminergic activity impairs emotion-based decision-making. To test this hypothesis, we investigated the effects of a decrease in dopaminergic activity on the performance of an emotion-based decision-making task, the Iowa gambling task (IGT), in 11 healthy human subjects. We used a double-blind, placebo-controlled, within-subject design to examine the effect of a mixture containing the branched-chain amino acids (BCAA) valine, isoleucine and leucine on prolactin, IGT performance, perceptual competency and visual aspects of visuospatial working memory, visual attention and working memory, and verbal memory. The expectancy-valence model was used to determine the relative contributions of distinct IGT components (attention to past outcomes, relative weight of wins and losses, and choice strategies) in the decision-making process. Compared to placebo, the BCAA mixture increased prolactin levels and impaired IGT performance. BCAA administration interfered with a particular component process of decision-making related to attention to more recent events as compared to more distant events. There were no differences between placebo and BCAA conditions for other aspects of cognition. Our results suggest a direct link between reduced dopaminergic activity and poor emotion-based decision-making characterized by shortsightedness, and thus difficulties resisting short-term reward, despite long-term negative consequences. These findings have implications for behavioral and pharmacological interventions targeting impaired emotion-based decision-making in addictive disorders.

  15. Computation of the current density in nonlinear materials subjected to large current pulses

    International Nuclear Information System (INIS)

    Hodgdon, M.L.; Hixson, R.S.; Parsons, W.M.

    1991-01-01

    This paper reports that the finite element method and the finite difference method are used to calculate the current distribution in two nonlinear conductors. The first conductor is a small ferromagnetic wire subjected to a current pulse that rises to 10,000 Amperes in 10 microseconds. Results from the transient thermal and transient magnetic solvers of the finite element code FLUX2D are used to compute the current density in the wire. The second conductor is a metal oxide varistor. Maxwell's equations, Ohm's law and the varistor relation between resistivity and current density, ρ = αj^(-β), are used to derive a nonlinear differential equation. The solutions of the differential equation are obtained by a finite difference approximation and a shooting method. The behavior predicted by these calculations is in agreement with experiments.
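    The note below is a minimal sketch of the shooting method named in the abstract, applied to a generic two-point boundary value problem. The right-hand side is a placeholder nonlinearity written in terms of a varistor-like ρ = αj^(-β) relation, not the equation actually derived in the paper, and the boundary values are assumed.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Placeholder constants for the varistor-like nonlinearity (illustrative only).
ALPHA, BETA = 1.0, 0.5


def rhs(x, y):
    """Right-hand side of the first-order system for j'' = ALPHA * j**(1 - BETA)."""
    j, dj = y
    return [dj, ALPHA * max(j, 1e-12) ** (1.0 - BETA)]


def boundary_miss(slope, j0, j1):
    """Integrate from x=0 with a guessed initial slope; return the mismatch at x=1."""
    sol = solve_ivp(rhs, (0.0, 1.0), [j0, slope], rtol=1e-8)
    return sol.y[0, -1] - j1


def shoot(j0=1.0, j1=2.0):
    """Shooting method: find the initial slope that hits the far boundary value."""
    slope = brentq(boundary_miss, -10.0, 10.0, args=(j0, j1))
    return solve_ivp(rhs, (0.0, 1.0), [j0, slope], dense_output=True, rtol=1e-8)


if __name__ == "__main__":
    sol = shoot()
    print("j(0.5) ≈", float(sol.sol(0.5)[0]))
```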

  16. Simulation of skill acquisition in sequential learning of a computer game

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Nielsen, Finn Ravnsbjerg; Rasmussen, Jens

    1995-01-01

    The paper presents some theoretical assumptions about the cognitive control mechanisms of subjects learning to play a computer game. A simulation model has been developed to investigate these assumptions. The model is an automaton, reacting to instruction-like cue action rules. The prototypical...... performances of 23 experimental subjects at succeeding levels of training are compared to the performance of the model. The findings are interpreted in terms of a general taxonomy for cognitive task analysis....

  17. Experimental Testing of Monopiles in Sand Subjected to One-Way Long-Term Cyclic Lateral Loading

    DEFF Research Database (Denmark)

    Roesen, Hanne Ravn; Ibsen, Lars Bo; Andersen, Lars Vabbersgaard

    2013-01-01

    In the offshore wind turbine industry the most widely used foundation type is the monopile. Due to the wave and wind forces the monopile is subjected to a strong cyclic loading with varying amplitude, maximum loading level, and varying loading period. In this paper the soil–pile interaction...... of a monopile in sand subjected to a long-term cyclic lateral loading is investigated by means of small scale tests. The tests are conducted with a mechanical loading rig capable of applying the cyclic loading as a sine signal with varying amplitude, mean loading level, and loading period for more than 60 000...... cycles. The tests are conducted in dense saturated sand. The maximum moment applied in the cyclic tests is varied from 18% to 36% of the ultimate lateral resistance found in a static loading test. The tests reveal that the accumulated rotation can be expressed by use of a power function. Further, static...

  18. Long-term changes of information environments and computer anxiety of nurse administrators in Japan.

    Science.gov (United States)

    Majima, Yukie; Izumi, Takako

    2013-01-01

    In Japan, medical information systems, including electronic medical records, are increasingly being introduced in medical and nursing fields. Nurse administrators, who are involved in the introduction of medical information systems and who must make proper judgments, are particularly required to have at least minimal knowledge of computers and networks and the ability to think about easy-to-use medical information systems. However, few of the current generation of nurse administrators studied information science subjects in their basic education curriculum. It can be said that information education for nurse administrators has become a pressing issue. Consequently, in this study, we conducted a survey of participants taking the first-level program of the education course for Japanese certified nurse administrators to ascertain actual conditions, such as the information environments that nurse administrators work in and their anxiety toward computers. Comparisons over the seven years since 2004 revealed that although the introduction of electronic medical records in hospitals was progressing, little change was observed in attributes of participants taking the course, such as computer anxiety.

  19. The Use of Computer-Based Videogames in Knowledge Acquisition and Retention.

    Science.gov (United States)

    Ricci, Katrina E.

    1994-01-01

    Research conducted at the Naval Training Systems Center in Orlando, Florida, investigated the acquisition and retention of basic knowledge with subject matter presented in the forms of text, test, and game. Results are discussed in terms of the effectiveness of computer-based games for military training. (Author/AEF)

  20. A short-term, comprehensive, yoga-based lifestyle intervention is efficacious in reducing anxiety, improving subjective well-being and personality

    Directory of Open Access Journals (Sweden)

    Raj Kumar Yadav

    2012-01-01

    Full Text Available Objective: To assess the efficacy of a short-term comprehensive yoga-based lifestyle intervention in reducing anxiety, improving subjective well-being and personality. Materials and Methods: The study is a part of an ongoing larger study at a tertiary care hospital. Participants (n=90) included patients with chronic diseases attending a 10-day, yoga-based lifestyle intervention program for prevention and management of chronic diseases, and healthy controls (n=45) not attending any such intervention. Primary Outcome Measures: Change in state and trait anxiety questionnaire (STAI-Y; 40 items), subjective well-being inventory (SUBI; 40 items), and neuroticism extraversion openness to experience five factor personality inventory revised (NEO-FF PI-R; 60 items) at the end of intervention. Results: Following intervention, the STAI-Y scores reduced significantly (P<0.01) at Day 10 versus Day 1. Similarly, NEO-FF PI-R scores improved significantly (P<0.001) at Day 10 versus Day 1. The control group showed an increase in STAI-Y, while SUBI and NEO-FF PI-R scores remained comparable at Day 10 versus Day 1. Conclusions: The observations suggest that a short-term, yoga-based lifestyle intervention may significantly reduce anxiety and improve subjective well-being and personality in patients with chronic diseases.

  1. New computational paradigms changing conceptions of what is computable

    CERN Document Server

    Cooper, SB; Sorbi, Andrea

    2007-01-01

    This superb exposition of a complex subject examines new developments in the theory and practice of computation from a mathematical perspective. It covers topics ranging from classical computability to complexity, from biocomputing to quantum computing.

  2. Computational Physics as a Path for Physics Education

    Science.gov (United States)

    Landau, Rubin H.

    2008-04-01

    Evidence and arguments will be presented that modifications in the undergraduate physics curriculum are necessary to maintain the long-term relevance of physics. Suggested will be a balance of analytic, experimental, computational, and communication skills that in many cases will require an increased inclusion of computation and its associated skill set into the undergraduate physics curriculum. The general arguments will be followed by a detailed enumeration of suggested subjects and student learning outcomes, many of which have already been adopted or advocated by the computational science community, and which permit high performance computing and communication. Several alternative models for how these computational topics can be incorporated into the undergraduate curriculum will be discussed. This includes enhanced topics in the standard existing courses, as well as stand-alone courses. Applications and demonstrations will be presented throughout the talk, as well as prototype video-based materials and electronic books.

  3. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives.

  4. SHORT TERM EFFECT OF ACUPUNCTURE-TENS ON LUNG FUNCTIONS AND DYSPNEA FOR SUBJECTS WITH MODERATE COPD

    Directory of Open Access Journals (Sweden)

    Vinod Babu. K

    2015-10-01

    Full Text Available Background: Acupuncture TENS is used instead of invasive acupuncture to relieve pain. Acupuncture has been shown to improve dyspnoea and lung functions in COPD (Chronic Obstructive Pulmonary Disease) patients. The purpose of the study is to determine the short-term effectiveness of Acupuncture-TENS in reducing dyspnea and improving lung functions for subjects with moderate COPD. Method: An experimental study design selected 30 geriatric subjects with COPD and randomized 15 subjects into each of the study and control groups. The study group received Acu-TENS for 45 minutes for a total of 5 sessions, while the control group received placebo TENS. Outcome measurements such as breathlessness using the Modified Borg Scale (MBS) and lung functions using the Pulmonary Function Test (PFT) were measured before and after the intervention. Results: Analysis from pre-intervention to post-intervention within the study group found a statistically significant change in means of MBS, FEV1 and FEV1/FVC ratio; within the control group there was a statistically significant change in means of MBS, but no statistically significant change in means of FEV1, FVC and FEV1/FVC ratio. When post-intervention means were compared between the groups, there was no statistically significant difference in means of MBS, FEV1, FVC and FEV1/FVC ratio. Conclusion: It is concluded that one week of Acu-TENS on the EXL1 point had no significant effect on improving dyspnea and lung functions in subjects with moderate COPD in geriatric populations.

  5. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  6. Relevance of a subjective quality of life questionnaire for long-term homeless persons with schizophrenia.

    Science.gov (United States)

    Girard, V; Tinland, A; Bonin, J P; Olive, F; Poule, J; Lancon, C; Apostolidis, T; Rowe, M; Greacen, T; Simeoni, M C

    2017-02-17

    Increasing numbers of programs are addressing the specific needs of homeless people with schizophrenia in terms of access to housing, healthcare, basic human rights and other domains. Although quality of life scales are being used to evaluate such programs, few instruments have been validated for people with schizophrenia and none for people with schizophrenia who experience major social problems such as homelessness. The aim of the present study was to validate the French version of the S-QoL, a self-administered, subjective quality of life questionnaire specific to schizophrenia, for people with schizophrenia who are homeless. In a two-step process, the S-QoL was first administered to two independent convenience samples of long-term homeless people with schizophrenia in Marseille, France. The objective of the first step was to analyse the psychometric properties of the S-QoL. The objective of the second step was to examine, through qualitative interviews with members of the population in question, the relevance and acceptability of the principal quality of life indicators used in the S-QoL instrument. Although the psychometric characteristics of the S-QoL were found to be globally satisfactory, from the point of view of the people being interviewed, acceptability was poor. Respondents frequently interrupted participation complaining that questionnaire items did not take into account the specific context of life on the streets. Less intrusive questions, more readily understandable vocabulary and greater relevance to subjects' living conditions are needed to improve the S-QoL questionnaire for this population. A modular questionnaire with context-specific sections or specific quality of life instruments for socially excluded populations may well be the way forward.

  7. SWAAM-LT: The long-term, sodium/water reaction analysis method computer code

    International Nuclear Information System (INIS)

    Shin, Y.W.; Chung, H.H.; Wiedermann, A.H.; Tanabe, H.

    1993-01-01

    The SWAAM-LT Code, developed for analysis of long-term effects of sodium/water reactions, is discussed. The theoretical formulation of the code is described, including the introduction of system matrices for ease of computer programming as a general system code. Also, some typical results of the code predictions for available large scale tests are presented. Test data for the steam generator design with the cover-gas feature and without the cover-gas feature are available and analyzed. The capabilities and limitations of the code are then discussed in light of the comparison between the code prediction and the test data

  8. Quantification of radiation absorbed dose and DNA damages in subjects undergoing computer tomography imaging

    International Nuclear Information System (INIS)

    Kanagaraj, Karthik; Basheerudeen, Safa Abdul Syed; Tamizh Selvan, G.; Venkatachalam, Perumal; Jose, M.T.; Ozhimuthu, Annalakshmi; Panneer Selvam, S.; Pattan, Sudha

    2014-01-01

    X-rays are extensively used in the medical field for imaging, diagnostic radiology and radiotherapy. Irrespective of the application, the procedures deliver a significant dose to the subject, varying from imaging (low doses, of the order of mGy) to therapy (high doses, of the order of several Gy). Of the various imaging modalities, computed tomography (CT) is commonly used to diagnose many health ailments in all age groups. Though the personnel involved in performing the procedures are monitored for their levels of exposure, it is uncommon to monitor the patient after the examination, as the benefits outweigh the risk. However, increased concern about the risk associated with exposure to low-dose X-radiation in CT has been reported. Therefore, we aim to quantify the absorbed dose to the eye, thyroid and forehead using thermoluminescence dosimeters of lithium manganese borate doped with terbium (LMB:Tb) in subjects undergoing CT examination (n = 27), as a methodology to investigate the effects of low-dose ionizing radiation. Further, DNA damage was measured using the chromosomal aberration (CA) and micronucleus (MN) assays in blood samples obtained from the study subjects before and after the procedures. The overall measured organ dose ranged between 1.92 and 520.14 mGy for the eye, 0.84 and 210.33 mGy for the forehead and 1.79-185 mGy for the thyroid, with averages of 128.86 ± 137.16, 78.25 ± 69.02 and 48.86 ± 63.60, respectively. The DNA damage measured using the CA and MN assays showed an extremely statistically significant (p<0.0001) increase in CA and a significant (p<0.001) increase in MN frequency after exposure when compared with the unexposed controls. The significance of the estimated dose and the DNA damage will be discussed. (author)

  9. Near-term quantum computing for applications

    Data.gov (United States)

    National Aeronautics and Space Administration — From habitat automation to navigation and scheduling of tasks to networking, the challenges of modern space exploration are as much computational as they are...

  10. The vocabulary of anglophone psychology in the context of other subjects.

    Science.gov (United States)

    Benjafield, John G

    2013-02-01

    Anglophone psychology shares its vocabulary with several other subjects. Some of the more obvious subjects that have parts of their vocabulary in common with Anglophone psychology include biology (e.g., dominance), chemistry (e.g., isomorphism), philosophy (e.g., phenomenology), and theology (e.g., mediator). Using data from the Oxford English Dictionary as well as other sources, the present study explored the history of these common vocabularies, with a view to broadening our understanding of the relation between the history of psychology and the histories of other subjects. It turns out that there are at least 156 different subjects that share words with psychology. Those that have the most words in common with psychology are mathematics, biology, physics, medicine, chemistry, philosophy, law, music, linguistics, electricity, pathology, and computing. Words that have senses in other subjects and have their origins in ordinary language are used more frequently as PsycINFO keywords than words that were invented specifically for use in psychology. These and other results are interpreted in terms of the ordinary language roots of the vocabulary of Anglophone psychology and other subjects, the degree to which operational definitions have determined the meaning of the psychological senses of words, the role of the psychologist in interdisciplinary research, and the validity of psychological essentialism.

  11. Subjective poverty line definitions

    NARCIS (Netherlands)

    J. Flik; B.M.S. van Praag (Bernard)

    1991-01-01

    In this paper we will deal with definitions of subjective poverty lines. To measure a poverty threshold value in terms of household income, which separates the poor from the non-poor, we take into account the opinions of all people in society. Three subjective methods will be discussed

  12. Subject search study. Final report

    International Nuclear Information System (INIS)

    Todeschini, C.

    1995-01-01

    The study gathered information on how users search the database of the International Nuclear Information System (INIS), using indicators such as Subject categories, Controlled terms, Subject headings, Free-text words, combinations of the above. Users participated from the Australian, French, Russian and Spanish INIS Centres, that have different national languages. Participants, both intermediaries and end users, replied to a questionnaire and executed search queries. The INIS Secretariat at the IAEA also participated. A protocol of all search strategies used in actual searches in the database was kept. The thought process for Russian and Spanish users is predominantly non-English and also the actual initial search formulation is predominantly non-English among Russian and Spanish users while it tends to be more in English among French users. A total of 1002 searches were executed by the five INIS centres including the IAEA. The search protocols indicate the following search behaviour: 1) free text words represent about 40% of search points on an average query; 2) descriptors used as search keys have the widest range as percentage of search points, from a low of 25% to a high of 48%; 3) search keys consisting of free text that coincides with a descriptor account for about 15% of search points; 4) Subject Categories are not used in many searches; 5) free text words are present as search points in about 80% of all searches; 6) controlled terms (descriptors) are used very extensively and appear in about 90% of all searches; 7) Subject Headings were used in only a few percent of searches. From the results of the study one can conclude that there is a greater reluctance on the part of non-native English speakers in initiating their searches by using free text word searches. Also: Subject Categories are little used in searching the database; both free text terms and controlled terms are the predominant types of search keys used, whereby the controlled terms are used more

  13. Computation of the Coupling Resonance Driving term f1001 and the coupling coefficient C from turn-by-turn single-BPM data.

    CERN Document Server

    Franchi, A; Vanbavinkhove, G; CERN. Geneva. BE Department

    2010-01-01

    In this note we show how to compute the Resonance Driving Term (RDT) f1001, the local resonance term chi_1010 and the coupling coefficient C from the spectrum of turn-by-turn single-BPM data. The harmonic analysis of the real coordinate x (or y) is model independent, contrary to the analysis of the complex Courant-Snyder coordinate h_{x,-} = x - i*p_x. The computation of f1001 along the ring is closely related to the global coupling coefficient C, but it is affected by an intrinsic error, discussed in this note.
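    A minimal sketch of the harmonic-analysis step on single-BPM turn-by-turn data is given below: project the horizontal signal onto the main tune line and onto the line at the vertical tune, and take their amplitude ratio, the quantity from which the coupling RDT analysis starts. The exact conversion to f1001 and to C given in the note is not reproduced; the tune values, windowing choice and function names are assumptions.

```python
import numpy as np


def spectral_line(signal, tune, window=None):
    """Amplitude and phase of the spectral line at a given fractional tune
    (up to the window normalization), by direct projection of turn-by-turn data."""
    n = len(signal)
    turns = np.arange(n)
    w = np.hanning(n) if window is None else window
    phasor = np.sum(w * signal * np.exp(-2j * np.pi * tune * turns)) / np.sum(w)
    return np.abs(phasor), np.angle(phasor)


def coupling_line_ratio(x, qx, qy):
    """Ratio of the line at the vertical tune Qy to the main line at Qx in the
    horizontal spectrum of one BPM; its size is driven by the coupling RDTs
    (f1001, f1010) discussed in the note."""
    x = x - np.mean(x)                 # remove the closed-orbit offset
    a_main, _ = spectral_line(x, qx)   # main betatron line at Qx
    a_coup, _ = spectral_line(x, qy)   # coupling line at Qy
    return a_coup / a_main


if __name__ == "__main__":
    # synthetic example: 1000 turns, Qx = 0.31, Qy = 0.32, weak coupling line
    t = np.arange(1000)
    x = np.cos(2 * np.pi * 0.31 * t) + 0.05 * np.cos(2 * np.pi * 0.32 * t + 0.4)
    print(coupling_line_ratio(x, 0.31, 0.32))
```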

  14. Generation of a suite of 3D computer-generated breast phantoms from a limited set of human subject data

    International Nuclear Information System (INIS)

    Hsu, Christina M. L.; Palmeri, Mark L.; Segars, W. Paul; Veress, Alexander I.; Dobbins, James T. III

    2013-01-01

    Purpose: The authors previously reported on a three-dimensional computer-generated breast phantom, based on empirical human image data, including a realistic finite-element based compression model that was capable of simulating multimodality imaging data. The computerized breast phantoms are a hybrid of two phantom generation techniques, combining empirical breast CT (bCT) data with flexible computer graphics techniques. However, to date, these phantoms have been based on single human subjects. In this paper, the authors report on a new method to generate multiple phantoms, simulating additional subjects from the limited set of original dedicated breast CT data. The authors developed an image morphing technique to construct new phantoms by gradually transitioning between two human subject datasets, with the potential to generate hundreds of additional pseudoindependent phantoms from the limited bCT cases. The authors conducted a preliminary subjective assessment with a limited number of observers (n= 4) to illustrate how realistic the simulated images generated with the pseudoindependent phantoms appeared. Methods: Several mesh-based geometric transformations were developed to generate distorted breast datasets from the original human subject data. Segmented bCT data from two different human subjects were used as the “base” and “target” for morphing. Several combinations of transformations were applied to morph between the “base’ and “target” datasets such as changing the breast shape, rotating the glandular data, and changing the distribution of the glandular tissue. Following the morphing, regions of skin and fat were assigned to the morphed dataset in order to appropriately assign mechanical properties during the compression simulation. The resulting morphed breast was compressed using a finite element algorithm and simulated mammograms were generated using techniques described previously. Sixty-two simulated mammograms, generated from morphing
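    As a minimal sketch of the "gradual transition" between two segmented datasets, the code below linearly blends corresponding mesh vertices of an assumed base and target phantom at a morphing fraction t. The paper's actual pipeline uses richer mesh-based transformations (shape change, glandular rotation, tissue redistribution) followed by tissue labeling and finite-element compression; the array shapes and example coordinates here are assumptions.

```python
import numpy as np


def morph_vertices(base, target, t):
    """Linearly blend two registered vertex arrays (N x 3) at fraction t in [0, 1].

    This is only the simplest possible morph between a 'base' and a 'target'
    mesh; it assumes the two meshes share vertex correspondence.
    """
    base = np.asarray(base, dtype=float)
    target = np.asarray(target, dtype=float)
    if base.shape != target.shape:
        raise ValueError("base and target meshes must have corresponding vertices")
    return (1.0 - t) * base + t * target


if __name__ == "__main__":
    base = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
    target = np.array([[1.0, 0.0, 0.0], [12.0, 1.0, 0.0], [0.0, 9.0, 2.0]])
    # a family of intermediate "pseudoindependent" phantoms from one pair of datasets
    for t in (0.25, 0.5, 0.75):
        print(t, morph_vertices(base, target, t)[1])
```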

  15. Memory and subjective workload assessment

    Science.gov (United States)

    Staveland, L.; Hart, S.; Yeh, Y. Y.

    1986-01-01

    Recent research suggested that subjective introspection of workload is not based upon specific retrieval of information from long term memory, and only reflects the average workload that is imposed upon the human operator by a particular task. These findings are based upon global ratings of workload for the overall task, suggesting that subjective ratings are limited in ability to retrieve specific details of a task from long term memory. To clarify the limits memory imposes on subjective workload assessment, the difficulty of task segments was varied and the workload of specified segments was retrospectively rated. The ratings were retrospectively collected on the manipulations of three levels of segment difficulty. Subjects were assigned to one of two memory groups. In the Before group, subjects knew before performing a block of trials which segment to rate. In the After group, subjects did not know which segment to rate until after performing the block of trials. The subjective ratings, RTs (reaction times) and MTs (movement times) were compared for within-group and between-group differences. Performance measures and subjective evaluations of workload reflected the experimental manipulations. Subjects were sensitive to different difficulty levels, and recalled the average workload of task components. Cueing did not appear to help recall, and memory group differences possibly reflected variations in the groups of subjects, or an additional memory task.

  16. Identification of Units and Other Terms in Czech Medical Records

    Czech Academy of Sciences Publication Activity Database

    Zvára Jr., Karel; Kašpar, Václav

    2010-01-01

    Roč. 6, č. 1 (2010), s. 78-82 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : natural language processing * healthcare documentation * medical reports * EHR * finite-state machine * regular expression Subject RIV: IN - Informatics, Computer Science http://www.ejbi.org/en/ejbi/article/61-en-identification-of-units-and-other-terms-in-czech-medical-records.html

  17. Impaired basal glucose effectiveness but unaltered fasting glucose release and gluconeogenesis during short-term hypercortisolemia in healthy subjects

    DEFF Research Database (Denmark)

    Nielsen, Michael F; Caumo, Andrea; Chandramouli, Visvanathan

    2004-01-01

    Excess cortisol has been demonstrated to impair hepatic and extrahepatic insulin action. To determine whether glucose effectiveness and, in terms of endogenous glucose release (EGR), gluconeogenesis, also are altered by hypercortisolemia, eight healthy subjects were studied after overnight infusion...... resistance. Postabsorptive glucose production (P = 0.64) and the fractional....... Hepatic GE was lower during cortisol than during saline infusion (2.39 +/- 0.24 vs. 3.82 +/- 0.51 ml.kg-1.min-1; P

  18. Incidence of lumbar spondylolysis in the general population in Japan based on multidetector computed tomography scans from two thousand subjects.

    Science.gov (United States)

    Sakai, Toshinori; Sairyo, Koichi; Takao, Shoichiro; Nishitani, Hiromu; Yasui, Natsuo

    2009-10-01

    Epidemiological analysis using CTs. To investigate the true incidence of lumbar spondylolysis in the general population in Japan. Although there have been several reports on the incidence of lumbar spondylolysis, they had some weaknesses. One of them concerns the subjects investigated, because the incidence of lumbar spondylolysis varies considerably, and some patients are asymptomatic. In addition, most of the past studies used plain radiograph films or skeletal investigation. Therefore, the past reported incidence may not correspond to that of the general population. We reviewed the computed tomography (CT) scans of 2000 subjects (age: 20-92 years) who had undergone abdominal and pelvic CT on a single multidetector CT scanner for reasons unrelated to low back pain. We reviewed them for spondylolysis, spondylolytic spondylolisthesis, and spina bifida occulta (SBO) in the lumbosacral region. The grade (I-IV) of spondylolisthesis was measured using midsagittal reconstructions. Lumbar spondylolysis was found in 117 subjects (5.9%). Their male-female ratio was 2:1. Multiple-level spondylolysis was found in 5 subjects (0.3%). Among these 117 subjects, there were 124 vertebrae with spondylolysis. Of them, 112 (90.3%) corresponded to L5, and 26 (21.0%) had unilateral spondylolysis. SBO was found in 154 subjects. Of them, 25 had spondylolysis (16.2%), whereas, in 1846 subjects without SBO, 92 had spondylolysis (5.0%). The incidence of spondylolysis among the patients with SBO was significantly higher than that in subjects without SBO (odds ratio: 3.7). Of 124 vertebrae with spondylolysis, 75 (60.5%) showed low-grade (Meyerding grade I or II) spondylolisthesis, and no subject presented high-grade spondylolisthesis. Spondylolisthesis was found in 74.5% of the subjects with bilateral spondylolysis, and in 7.7% of those with unilateral spondylolysis. The incidence of lumbar spondylolysis in the Japanese general population was 5.9% (males: 7.9%, females: 3.9%).
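    The 3.7-fold odds ratio quoted for spondylolysis with versus without spina bifida occulta follows directly from the counts in the abstract (25 of 154 versus 92 of 1846); the short sketch below reproduces it and adds a standard Wald confidence interval, which is not part of the original report.

```python
import math

# Counts from the abstract: spondylolysis in 25 of 154 subjects with SBO
# and in 92 of 1846 subjects without SBO.
a, b = 25, 154 - 25      # SBO group: with / without spondylolysis
c, d = 92, 1846 - 92     # no-SBO group: with / without spondylolysis

odds_ratio = (a / b) / (c / d)

# Wald 95% confidence interval on the log odds ratio (illustrative addition).
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = (math.exp(math.log(odds_ratio) + z * se) for z in (-1.96, 1.96))

print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")   # OR ≈ 3.7
```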

  19. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  20. Medical students as human subjects in educational research

    Directory of Open Access Journals (Sweden)

    Adina L. Kalet

    2013-02-01

    Full Text Available Introduction: Special concerns often arise when medical students are themselves the subjects of education research. A recently completed large, multi-center randomized controlled trial of computer-assisted learning modules for surgical clerks provided the opportunity to explore the perceived level of risk of studies where medical students serve as human subjects by reporting on: (1) the response of Institutional Review Boards (IRBs) at seven institutions to the same study protocol; and (2) the thoughts and feelings of students across study sites about being research subjects. Methods: From July 2009 to August 2010, all third-year medical students at seven collaborating institutions were eligible to participate. Patterns of IRB review of the same protocol were compared. Participation burden was calculated in terms of the time spent interacting with the modules. Focus groups were conducted with medical students at each site. Transcripts were coded by three independent reviewers and analyzed using Atlas.ti. Results: The IRBs at the seven participating institutions granted full (n=1), expedited (n=4), or exempt (n=2) review of the WISE Trial protocol. 995 (73%) of those eligible consented to participate, and 207 (20%) of these students completed all outcome measures. The average time to complete the computer modules and associated measures was 175 min. Common themes in focus groups with participant students included the desire to contribute to medical education research, the absence of coercion to consent, and the low-risk nature of the research. Discussion: Our findings demonstrate that risk assessment and the extent of review utilized for medical education research vary among IRBs. Despite variability in the perception of risk implied by differing IRB requirements, students themselves felt education research was low risk and did not consider themselves to be vulnerable. The vast majority of eligible medical students were willing to participate as research

  1. Girls and Computing: Female Participation in Computing in Schools

    Science.gov (United States)

    Zagami, Jason; Boden, Marie; Keane, Therese; Moreton, Bronwyn; Schulz, Karsten

    2015-01-01

    Computer education, with a focus on Computer Science, has become a core subject in the Australian Curriculum and the focus of national innovation initiatives. Equal participation by girls, however, remains unlikely based on their engagement with computing in recent decades. In seeking to understand why this may be the case, a Delphi consensus…

  2. Long-term Results after CT-Guided Percutaneous Ethanol Ablation for the Treatment of Hyperfunctioning Adrenal Disorders

    Directory of Open Access Journals (Sweden)

    Nathan Elie Frenk

    Full Text Available OBJECTIVES: To evaluate the safety and long-term efficacy of computed tomography-guided percutaneous ethanol ablation for benign primary and secondary hyperfunctioning adrenal disorders. METHOD: We retrospectively evaluated the long-term results of nine patients treated with computed tomography-guided percutaneous ethanol ablation: eight subjects who presented with primary adrenal disorders, such as pheochromocytoma, primary macronodular adrenal hyperplasia and aldosterone-producing adenoma, and one subject with Cushing disease refractory to conventional treatment. Eleven sessions were performed for the nine patients. The patient data were reviewed for the clinical outcome and procedure-related complications over ten years. RESULTS: Patients with aldosterone-producing adenoma had clinical improvement: symptoms recurred in one case 96 months after ethanol ablation, and the other patient was still in remission 110 months later. All patients with pheochromocytoma had clinical improvement but were eventually submitted to surgery for complete remission. No significant clinical improvement was seen in patients with hypercortisolism due to primary macronodular adrenal hyperplasia or Cushing disease. Major complications were seen in five of the eleven procedures and included cardiovascular instability and myocardial infarction. Minor complications attributed to sedation were seen in two patients. CONCLUSION: Computed tomography-guided ethanol ablation does not appear to be suitable for the long-term treatment of hyperfunctioning adrenal disorders and is not without risks.

  3. New computer security measures

    CERN Multimedia

    IT Department

    2008-01-01

    As a part of the long-term strategy to improve computer security at CERN, and especially given the attention focused to CERN by the start-up of the LHC, two additional security measures concerning DNS and Tor will shortly be introduced. These are described in the following texts and will affect only a small number of users. "PHISHING" ATTACKS CONTINUE CERN computer users continue to be subjected to attacks by people trying to infect our machines and obtain passwords and other confidential information by social engineering trickery. Recent examples include an e-mail message sent from "La Poste" entitled "Colis Postal" on 21 August, a fake mail sent from web and mail services on 8 September, and an e-mail purporting to come from Hallmark Cards announcing the arrival of an electronic postcard. However, there are many other examples and there are reports of compromised mail accounts being used for more realistic site-specific phishing attempts. Given the increased publicity rela...

  4. Third-order least squares modelling of milling state term for improved computation of stability boundaries

    Directory of Open Access Journals (Sweden)

    C.G. Ozoegwu

    2016-01-01

    Full Text Available The general least squares model for the milling process state term is presented. A discrete map for milling stability analysis that is based on the third-order case of the presented general least squares milling state term model is first studied and compared with its third-order counterpart that is based on the interpolation theory. Both numerical rate of convergence and chatter stability results of the two maps are compared using the single degree of freedom (1DOF) milling model. The numerical rate of convergence of the presented third-order model is also studied using the two degree of freedom (2DOF) milling process model. Comparison showed that stability results from the two maps agree closely, but the presented map required fewer calculations, leading to about 30% savings in computational time (CT). Earlier works showed that the accuracy of milling stability analysis using the full-discretization method rises from first-order theory to second-order theory and continues to rise to the third-order theory. The present work confirms this trend. In conclusion, the method presented in this work will enable fast and accurate computation of stability diagrams for use by machinists.
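    A minimal sketch of the least-squares idea behind the state-term model is shown below: fit a cubic polynomial to sampled state values over one discretization interval and evaluate it where the discrete map needs the (delayed) state. With exactly four samples this coincides with third-order interpolation; with more samples it is a least-squares fit in the spirit of the paper. The sample values and evaluation point are invented, and the full stability-map construction is not reproduced.

```python
import numpy as np


def cubic_ls_state(t_samples, x_samples, t_eval):
    """Third-order least-squares model of the (delayed) state term.

    Fits a cubic polynomial to sampled state values over one discretization
    interval and evaluates it at the time(s) the discrete map requires.
    With four samples this reduces to third-order interpolation; with more
    samples it becomes a least-squares approximation.
    """
    coeffs = np.polyfit(t_samples, x_samples, deg=3)
    return np.polyval(coeffs, t_eval)


if __name__ == "__main__":
    # five samples of a state history over one interval (illustrative numbers)
    t = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    x = np.array([1.00, 0.82, 0.71, 0.66, 0.70])
    print(cubic_ls_state(t, x, 0.6))
```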

  5. Symptom Severity and Quality of Life Among Long-term Colorectal Cancer Survivors Compared With Matched Control Subjects: A Population-Based Study.

    Science.gov (United States)

    Hart, Tae L; Charles, Susan T; Gunaratne, Mekhala; Baxter, Nancy N; Cotterchio, Michelle; Cohen, Zane; Gallinger, Steven

    2018-03-01

    Data are lacking regarding physical functioning, psychological well-being, and quality of life among colorectal cancer survivors >10 years postdiagnosis. The purpose of this study was to examine self-reported physical functioning, quality of life, and psychological well-being in long-term colorectal cancer survivors compared with age- and sex-matched unaffected control subjects. Participants completed a cross-sectional survey. The colorectal cancer survivors and unaffected control subjects were recruited from the Ontario Familial Colorectal Cancer Registry. A population-based sample of colorectal cancer survivors (N = 296) and their age- and sex-matched unaffected control subjects (N = 255) were included. Survivors were, on average, 15 years postdiagnosis. Quality of life was measured with the Functional Assessment of Cancer Therapy-General scale, bowel dysfunction with the Memorial Sloan-Kettering Cancer Center scale, urinary dysfunction with the International Consultation on Incontinence Questionnaire-Short Form, fatigue with the Functional Assessment of Chronic Illness Therapy-Fatigue scale, and depression with the Center for Epidemiologic Studies-Depression scale. In linear mixed-model analyses adjusting for income, education, race, and comorbid medical conditions, survivors reported good emotional, functional, physical, and overall quality of life, comparable to control subjects. Fatigue and urinary functioning did not differ significantly between survivors and control subjects. Survivors reported significantly higher social quality of life and lower depression compared with unaffected control subjects. The only area where survivors reported significantly worse deficits was in bowel dysfunction, but the magnitude of differences was relatively small. Generalizability is limited by moderately low participation rates. Findings are likely biased toward healthy participants. No baseline assessment was available to examine change in outcomes over time. Long-term

  6. Examining Computer Gaming Addiction in Terms of Different Variables

    Science.gov (United States)

    Kurt, Adile Askim; Dogan, Ezgi; Erdogmus, Yasemin Kahyaoglu; Emiroglu, Bulent Gursel

    2018-01-01

    Computer gaming addiction is one of the newer concepts that young generations face and can be defined as the excessive and problematic use of computer games leading to social and/or emotional problems. The purpose of this study is to analyse the computer gaming addiction levels of secondary school students in terms of different variables. The research was…

  7. Digital Inclusion of Secondary Schools' Subject Teachers in Bolivia

    Science.gov (United States)

    Popova, Iskra; Fabre, Gabriela

    2017-01-01

    The government of Bolivia planned to introduce information technology in secondary education through establishing computer labs in schools and through granting each subject teacher a laptop. This initiative was tested for the first time in 2012 with three public schools in La Paz. Most of the subject teachers have never used a computer before. The…

  8. Investigations of incorporating source directivity into room acoustics computer models to improve auralizations

    Science.gov (United States)

    Vigeant, Michelle C.

    Room acoustics computer modeling and auralizations are useful tools when designing or modifying acoustically sensitive spaces. In this dissertation, the input parameter of source directivity has been studied in great detail to determine first its effect in room acoustics computer models and secondly how to better incorporate the directional source characteristics into these models to improve auralizations. To increase the accuracy of room acoustics computer models, the source directivity of real sources, such as musical instruments, must be included in the models. The traditional method for incorporating source directivity into room acoustics computer models involves inputting the measured static directivity data taken every 10° in a sphere-shaped pattern around the source. This data can be entered into the room acoustics software to create a directivity balloon, which is used in the ray tracing algorithm to simulate the room impulse response. The first study in this dissertation shows that using directional sources over an omni-directional source in room acoustics computer models produces significant differences both in terms of calculated room acoustics parameters and auralizations. The room acoustics computer model was also validated in terms of accurately incorporating the input source directivity. A recently proposed technique for creating auralizations using a multi-channel source representation has been investigated with numerous subjective studies, applied to both solo instruments and an orchestra. The method of multi-channel auralizations involves obtaining multi-channel anechoic recordings of short melodies from various instruments and creating individual channel auralizations. These auralizations are then combined to create a total multi-channel auralization. Through many subjective studies, this process was shown to be effective in terms of improving the realism and source width of the auralizations in a number of cases, and also modeling different
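    A minimal sketch of the "directivity balloon" lookup described above is given below: gains measured on a 10-degree azimuth/elevation grid are stored in a table and queried for the direction of each traced ray. The array layout, nearest-neighbour lookup and random example data are assumptions; real room acoustics software stores such balloons per frequency band and interpolates between measurement points.

```python
import numpy as np


class DirectivityBalloon:
    """Minimal directivity-balloon lookup from gains on an assumed 10-degree grid.

    gains_db[i, j] is taken to hold the level for azimuth 10*i degrees (0-350)
    and elevation 10*j degrees (0-180); shape (36, 19).
    """

    def __init__(self, gains_db):
        self.gains_db = np.asarray(gains_db, dtype=float)

    def gain(self, azimuth_deg, elevation_deg):
        # nearest measured direction; bilinear interpolation would be smoother
        i = int(round(azimuth_deg / 10.0)) % 36
        j = min(max(int(round(elevation_deg / 10.0)), 0), 18)
        return self.gains_db[i, j]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    balloon = DirectivityBalloon(rng.uniform(-20.0, 0.0, size=(36, 19)))
    # attenuation applied to a ray leaving the source toward azimuth 73°, elevation 95°
    print(balloon.gain(73.0, 95.0))
```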

  9. Short-term memory in zebrafish (Danio rerio).

    Science.gov (United States)

    Jia, Jason; Fernandes, Yohaan; Gerlai, Robert

    2014-08-15

    Learning and memory represent perhaps the most complex behavioral phenomena. Although their underlying mechanisms have been extensively analyzed, only a fraction of the potential molecular components have been identified. The zebrafish has been proposed as a screening tool with which mechanisms of complex brain functions may be systematically uncovered. However, as a relative newcomer in behavioral neuroscience, the zebrafish has not been well characterized for its cognitive and mnemonic features, thus learning and/or memory screens with adults have not been feasible. Here we study short-term memory of adult zebrafish. We show animated images of conspecifics (the stimulus) to the experimental subject during 1 min intervals on ten occasions separated by different (2, 4, 8 or 16 min long) inter-stimulus intervals (ISI), a between subject experimental design. We quantify the distance of the subject from the image presentation screen during each stimulus presentation interval, during each of the 1-min post-stimulus intervals immediately following the stimulus presentations and during each of the 1-min intervals furthest away from the last stimulus presentation interval and just before the next interval (pre-stimulus interval), respectively. Our results demonstrate significant retention of short-term memory even in the longest ISI group but suggest no acquisition of reference memory. Because in the employed paradigm both stimulus presentation and behavioral response quantification is computer automated, we argue that high-throughput screening for drugs or mutations that alter short-term memory performance of adult zebrafish is now becoming feasible. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. The Basics of Cloud Computing

    Science.gov (United States)

    Kaestner, Rich

    2012-01-01

    Most school business officials have heard the term "cloud computing" bandied about and may have some idea of what the term means. In fact, they likely already leverage a cloud-computing solution somewhere within their district. But what does cloud computing really mean? This brief article puts a bit of definition behind the term and helps one…

  11. Long-term results after CT-guided percutaneous ethanol ablation for the treatment of hyperfunctioning adrenal disorders

    International Nuclear Information System (INIS)

    Frenk, Nathan Elie; Sebastianes, Fernando; Lerario, Antonio Marcondes; Fragoso, Maria Candida Barisson Villares; Mendonca, Berenice Bilharinho

    2016-01-01

    Objectives: To evaluate the safety and long-term efficacy of computed tomography-guided percutaneous ethanol ablation for benign primary and secondary hyperfunctioning adrenal disorders. Method: We retrospectively evaluated the long-term results of nine patients treated with computed tomography-guided percutaneous ethanol ablation: eight subjects who presented with primary adrenal disorders, such as pheochromocytoma, primary macronodular adrenal hyperplasia and aldosterone-producing adenoma, and one subject with Cushing disease refractory to conventional treatment. Eleven sessions were performed for the nine patients. The patient data were reviewed for the clinical outcome and procedure-related complications over ten years. Results: Patients with aldosterone-producing adenoma had clinical improvement: symptoms recurred in one case 96 months after ethanol ablation, and the other patient was still in remission 110 months later. All patients with pheochromocytoma had clinical improvement but were eventually submitted to surgery for complete remission. No significant clinical improvement was seen in patients with hypercortisolism due to primary macronodular adrenal hyperplasia or Cushing disease. Major complications were seen in five of the eleven procedures and included cardiovascular instability and myocardial infarction. Minor complications attributed to sedation were seen in two patients. Conclusion: Computed tomography-guided ethanol ablation does not appear to be suitable for the long-term treatment of hyperfunctioning adrenal disorders and is not without risks. (author)

  12. Long-term results after CT-guided percutaneous ethanol ablation for the treatment of hyperfunctioning adrenal disorders

    Energy Technology Data Exchange (ETDEWEB)

    Frenk, Nathan Elie; Sebastianes, Fernando; Lerario, Antonio Marcondes; Fragoso, Maria Candida Barisson Villares; Mendonca, Berenice Bilharinho [Universidade de Sao Paulo (USP), SP (Brazil). Faculdade de Medicina; Menezes, Marcos Roberto de, E-mail: menezesmr@gmail.com [Instituto do Cancer do Estado de Sao Paulo, SP (Brazil)

    2016-10-15

    Objectives: To evaluate the safety and long-term efficacy of computed tomography-guided percutaneous ethanol ablation for benign primary and secondary hyperfunctioning adrenal disorders. Method: We retrospectively evaluated the long-term results of nine patients treated with computed tomography-guided percutaneous ethanol ablation: eight subjects who presented with primary adrenal disorders, such as pheochromocytoma, primary macronodular adrenal hyperplasia and aldosterone-producing adenoma, and one subject with Cushing disease refractory to conventional treatment. Eleven sessions were performed for the nine patients. The patient data were reviewed for the clinical outcome and procedure-related complications over ten years. Results: Patients with aldosterone-producing adenoma had clinical improvement: symptoms recurred in one case 96 months after ethanol ablation, and the other patient was still in remission 110 months later. All patients with pheochromocytoma had clinical improvement but were eventually submitted to surgery for complete remission. No significant clinical improvement was seen in patients with hypercortisolism due to primary macronodular adrenal hyperplasia or Cushing disease. Major complications were seen in five of the eleven procedures and included cardiovascular instability and myocardial infarction. Minor complications attributed to sedation were seen in two patients. Conclusion: Computed tomography-guided ethanol ablation does not appear to be suitable for the long-term treatment of hyperfunctioning adrenal disorders and is not without risks. (author)

  13. Assessment of clinical residents' needs for ten educational subjects

    Directory of Open Access Journals (Sweden)

    Mansour Razavi

    2002-04-01

    Full Text Available Background Fulfilling the learners' "real needs" will improve medical education. Some subjects are necessary for all clinical residents regardless of their field of specialty. Among these, ten seem to be the most important: research methodology and data analysis, computer-based programs, medical recording, cardiopulmonary and cerebral resuscitation, clinical teaching programs, communication skills, clinical ethics, laboratory examinations, reporting special diseases and death certification, and prescription. Purpose This cross-sectional study assessed the educational needs of clinical residents for ten educational subjects. Methods A questionnaire prepared by board faculty members, consisting of 10 closed-ended questions and one open-ended question, was distributed among 1307 residents from 22 clinical disciplines who registered for the preboard or promotion exam in June 2000. Results Among the subjects, three were the most needed: computer-based programs 149 (60%), data collecting system 606 (49%), and clinical ethics 643 (46%). The prescription standard was the least required, 177 (13%). Conclusion Complementary training courses on these subjects can be an answer to the clinical residents' needs. Keywords: research methodology, computer in medicine, cpr, clinical teaching methods, communication in medicine, medical ethics, laboratory ordering, disease coding system, death certificate, prescription writing

  14. Report of the Review Committee of the R and D subjects on Computational Science and Engineering

    International Nuclear Information System (INIS)

    1999-08-01

    The Ad Hoc Review Committee, composed of seven experts, was set up under the Research Evaluation Committee of JAERI in order to review the R and D subjects to be implemented over five years starting in the 2000 fiscal year at the Center for Promotion of Computational Science and Engineering (CCSE). The review meeting took place on April 26, 1999. Following the review methods given by the Research Evaluation Committee, consisting of review items, points of review and review criteria, the review was conducted on the basis of materials submitted in advance and presentations by CCSE. The Research Evaluation Committee received the review report and its explanations from the Review Committee on July 5 and acknowledged the appropriateness of the review results. This report describes the review results. (author)

  15. Introductory Programming Subject in European Higher Education

    Science.gov (United States)

    Aleksic, Veljko; Ivanovic, Mirjana

    2016-01-01

    Programming is one of the basic subjects in most informatics, computer science, mathematics and technical faculties' curricula. An integrated overview of the models for teaching programming, problems in teaching and suggested solutions is presented in this paper. The research covered the current state of 1019 programming subjects in 715 study programmes at…

  16. Understanding initial undergraduate expectations and identity in computing studies

    Science.gov (United States)

    Kinnunen, Päivi; Butler, Matthew; Morgan, Michael; Nylen, Aletta; Peters, Anne-Kathrin; Sinclair, Jane; Kalvala, Sara; Pesonen, Erkki

    2018-03-01

    There is growing appreciation of the importance of understanding the student perspective in Higher Education (HE) at both institutional and international levels. This is particularly important in Science, Technology, Engineering and Mathematics subjects such as Computer Science (CS) and Engineering in which industry needs are high but so are student dropout rates. An important factor to consider is the management of students' initial expectations of university study and career. This paper reports on a study of CS first-year students' expectations across three European countries using qualitative data from student surveys and essays. Expectation is examined from both short-term (topics to be studied) and long-term (career goals) perspectives. Tackling these issues will help paint a picture of computing education through students' eyes and explore their vision of its and their role in society. It will also help educators prepare students more effectively for university study and to improve the student experience.

  17. Quantitative computed tomography determined regional lung mechanics in normal nonsmokers, normal smokers and metastatic sarcoma subjects.

    Directory of Open Access Journals (Sweden)

    Jiwoong Choi

    Full Text Available Extra-thoracic tumors send out pilot cells that attach to the pulmonary endothelium. We hypothesized that this could alter regional lung mechanics (tissue stiffening or accumulation of fluid and inflammatory cells) through interactions with host cells. We explored this with serial inspiratory computed tomography (CT) and image matching to assess regional changes in lung expansion. We retrospectively assessed 44 pairs of two serial CT scans on 21 sarcoma patients: 12 without lung metastases and 9 with lung metastases. For each subject, two or more serial inspiratory clinically-derived CT scans were retrospectively collected. Two research-derived control groups were included: 7 normal nonsmokers and 12 asymptomatic smokers with two inspiratory scans taken the same day or one year apart, respectively. We performed image registration for local-to-local matching of scans to baseline, and derived local expansion and density changes at an acinar scale. The Welch two-sample t test was used for comparison between groups. Statistical significance was determined with a p value < 0.05. Lung regions of metastatic sarcoma patients (but not the normal control group) demonstrated an increased proportion of normalized lung expansion between the first and second CT. These hyper-expanded regions were associated with, but not limited to, visible metastatic lung lesions. Compared with the normal control group, the percent of increased normalized hyper-expanded lung in sarcoma subjects was significantly increased (p < 0.05). There was also evidence of increased lung "tissue" volume (non-air components) in the hyper-expanded regions of the cancer subjects relative to non-hyper-expanded regions. "Tissue" volume increase was present in the hyper-expanded regions of metastatic and non-metastatic sarcoma subjects. This putatively could represent regional inflammation related to the presence of tumor pilot cell-host related interactions. This new quantitative CT (QCT) method for linking
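
    The group comparison described above relies on Welch's two-sample t test with a 0.05 significance threshold. A minimal Python sketch of such a comparison is shown below; the per-subject expansion fractions are synthetic placeholders standing in for the study's data, and the group sizes are only illustrative.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject fractions of hyper-expanded lung (one value per subject).
rng = np.random.default_rng(0)
sarcoma = rng.normal(0.18, 0.05, size=9)     # metastatic sarcoma group (placeholder values)
controls = rng.normal(0.10, 0.04, size=19)   # normal control group (placeholder values)

# Welch's two-sample t test (unequal variances assumed), significance at p < 0.05.
t_stat, p_value = stats.ttest_ind(sarcoma, controls, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant = {p_value < 0.05}")
```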

  18. Informatic parcellation of the network involved in the computation of subjective value

    Science.gov (United States)

    Rangel, Antonio

    2014-01-01

    Understanding how the brain computes value is a basic question in neuroscience. Although individual studies have driven this progress, meta-analyses provide an opportunity to test hypotheses that require large collections of data. We carry out a meta-analysis of a large set of functional magnetic resonance imaging studies of value computation to address several key questions. First, what is the full set of brain areas that reliably correlate with stimulus values when they need to be computed? Second, is this set of areas organized into dissociable functional networks? Third, is a distinct network of regions involved in the computation of stimulus values at decision and outcome? Finally, are different brain areas involved in the computation of stimulus values for different reward modalities? Our results demonstrate the centrality of ventromedial prefrontal cortex (VMPFC), ventral striatum and posterior cingulate cortex (PCC) in the computation of value across tasks, reward modalities and stages of the decision-making process. We also find evidence of distinct subnetworks of co-activation within VMPFC, one involving central VMPFC and dorsal PCC and another involving more anterior VMPFC, left angular gyrus and ventral PCC. Finally, we identify a posterior-to-anterior gradient of value representations corresponding to concrete-to-abstract rewards. PMID:23887811

  19. Subjective and objective peer approval evaluations and self-esteem development: A test of reciprocal, prospective, and long-term effects.

    Science.gov (United States)

    Gruenenfelder-Steiger, Andrea E; Harris, Michelle A; Fend, Helmut A

    2016-10-01

    A large body of literature suggests a clear, concurrent association between peer approval and self-esteem in adolescence. However, little empirical work exists on either the prospective or reciprocal relation between peer approval and self-esteem during this age period. Moreover, it is unclear from past research whether both subjectively perceived peer approval and objectively measured peer approval are related to subsequent self-esteem over time (and vice versa) and whether these paths have long-term associations into adulthood. Using data from a large longitudinal study that covers a time span of 2 decades, we examined reciprocal, prospective relations between self-esteem and peer approval during ages 12-16 in addition to long-term relations between these variables and later social constructs at age 35. Cross-lagged regression analyses revealed small but persistent effect sizes from both types of peer approval to subsequent self-esteem in adolescence, controlling for prior self-esteem. However, effects in the reverse direction were not confirmed. These findings support the notion that peer relationships serve an important function for later self-esteem, consistent with many theoretical tenets of the importance of peers for building a strong identity. Finally, we found long-term relations between adult social constructs and adolescent objective and subjective peer approval as well as self-esteem. Therefore, not only do peer relationships play a role in self-esteem development across adolescence, but they remain impactful throughout adulthood. In sum, the current findings highlight the lasting, yet small link between peer relationships and self-esteem development and call for investigations of further influential factors for self-esteem over time. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Reliability of structural systems subject to fatigue

    International Nuclear Information System (INIS)

    Rackwitz, R.

    1984-01-01

    Concepts and computational procedures for the reliability calculation of structural systems subject to fatigue are outlined. Systems are dealt with by approximately computing componential times to first failure. So-called first-order reliability methods are then used to formulate dependencies between componential failures and to evaluate the system failure probability. (Author) [pt

  1. Cranial computed tomography and real-time sonography in full-term neonates and infants

    International Nuclear Information System (INIS)

    Siegel, M.J.; Patel, J.; Gado, M.H.; Shackelford, G.D.

    1983-01-01

    The results of cranial ultrasonography (US) and computed tomography (CT) were compared in 52 full-term neonates and young infants. The chief indications for examination included: increasing head size, dysmorphic features, myelomeningocele, inflammatory disease, and asphyxia. Disorders detected included hydrocephalus, parenchymal abnormalities, intracranial hemorrhage, extraparenchymal fluid collections, and vascular and other developmental malformations. CT and US essentially were equivalent in detecting hydrocephalus, moderate to large intraventricular hemorrhages or subdural collections, and large focal parenchymal lesions, although CT was somewhat better in determining the level and cause of obstruction in patients with hydrocephalus and characterizing parenchymal abnormalities. CT was more sensitive than ultrasound in detecting subarachnoid hemorrhage (100% vs. 0%), diffuse parenchymal abnormality (100% vs. 33%), and small intraventricular hemorrhages (100% vs. 0%) but these lesions often were not clinically significant. The results suggest that US should be used as the primary neuroradiological examination in term infants; CT probably should be reserved for further investigation after US in those patients with a history of hypoxia and progressive clinical deterioration

  2. Do Subjects with Whiplash-Associated Disorders Respond Differently in the Short-Term to Manual Therapy and Exercise than Those with Mechanical Neck Pain?

    Science.gov (United States)

    Castaldo, Matteo; Catena, Antonella; Chiarotto, Alessandro; Fernández-de-Las-Peñas, César; Arendt-Nielsen, Lars

    2017-04-01

    To compare the short-term effects of manual therapy and exercise on pain, related disability, range of motion, and pressure pain thresholds between subjects with mechanical neck pain and whiplash-associated disorders. Twenty-two subjects with mechanical neck pain and 28 with whiplash-associated disorders participated. Clinical and physical outcomes including neck pain intensity, neck-related disability, and pain area, as well as cervical range of motion and pressure pain thresholds over the upper trapezius and tibialis anterior muscles, were obtained at baseline and after the intervention by a blinded assessor. Each subject received six sessions of manual therapy and specific neck exercises. Mixed-model repeated measures analyses of covariance (ANCOVAs) were used for the analyses. Subjects with whiplash-associated disorders exhibited higher neck-related disability ( P  = 0.021), larger pain area ( P  = 0.003), and lower pressure pain thresholds in the tibialis anterior muscle ( P  = 0.009) than those with mechanical neck pain. The adjusted ANCOVA revealed no between-group differences for any outcome (all P  > 0.15). A significant main effect of time was demonstrated for clinical outcomes and cervical range of motion with both groups experiencing similar improvements (all P   0.222). The current clinical trial found that subjects with mechanical neck pain and whiplash-associated disorders exhibited similar clinical and neurophysiological responses after a multimodal physical therapy intervention, suggesting that although greater signs of central sensitization are present in subjects with whiplash-associated disorders, this does not alter the response in the short term to manual therapy and exercises. © 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  3. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    Science.gov (United States)

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion and the variability of the spatial center of motion of the infant, respectively. In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p < 0.05). These findings demonstrate the test-retest reliability of computer-based video analysis of GMs, and a significant association between the computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
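
    The reliability indices reported here, ICC(1,1) and ICC(3,1), follow the standard Shrout-Fleiss definitions. The sketch below shows how they can be computed from an (infants x sessions) score matrix using the mean squares of a one-way and a two-way ANOVA; the data are simulated, not the study's recordings.

```python
import numpy as np

def icc_1_1_and_3_1(scores):
    """Shrout-Fleiss ICC(1,1) and ICC(3,1) for an (n_subjects, k_sessions) array."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    subj_means = x.mean(axis=1)
    sess_means = x.mean(axis=0)

    ss_between_subj = k * np.sum((subj_means - grand) ** 2)
    ss_between_sess = n * np.sum((sess_means - grand) ** 2)
    ss_total = np.sum((x - grand) ** 2)
    ss_within = ss_total - ss_between_subj                   # one-way "within subjects" SS
    ss_error = ss_total - ss_between_subj - ss_between_sess  # two-way residual SS

    bms = ss_between_subj / (n - 1)       # between-subjects mean square
    wms = ss_within / (n * (k - 1))       # within-subjects mean square (one-way)
    ems = ss_error / ((n - 1) * (k - 1))  # error mean square (two-way)

    icc11 = (bms - wms) / (bms + (k - 1) * wms)
    icc31 = (bms - ems) / (bms + (k - 1) * ems)
    return icc11, icc31

# Simulated example: 75 infants, each recorded twice (e.g., the CSD variable).
rng = np.random.default_rng(0)
true_level = rng.normal(10.0, 2.0, size=(75, 1))
scores = true_level + rng.normal(0.0, 1.0, size=(75, 2))
print(icc_1_1_and_3_1(scores))
```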

  4. Short-term dehydroepiandrosterone treatment increases platelet cGMP production in elderly male subjects.

    Science.gov (United States)

    Martina, Valentino; Benso, Andrea; Gigliardi, Valentina Ramella; Masha, Andi; Origlia, Carla; Granata, Riccarda; Ghigo, Ezio

    2006-03-01

    Several clinical and population-based studies suggest that dehydroepiandrosterone (DHEA) and its sulphate (DHEA-S) play a protective role against atherosclerosis and coronary artery disease in humans. However, the mechanisms underlying this action are still unknown. It has recently been suggested that DHEA-S could delay atheroma formation through an increase in nitric oxide (NO) production. Twenty-four aged male subjects [age (mean +/- SEM): 65.4 +/- 0.7 years; range: 58.2-67.6 years] underwent a blinded placebo controlled study receiving DHEA (50 mg p.o. daily at bedtime) or placebo for 2 months. Platelet cyclic guanosine-monophosphate (cGMP) concentration (as marker of NO production) and serum levels of DHEA-S, DHEA, IGF-I, insulin, glucose, oestradiol (E(2)), testosterone, plasminogen activator inhibitor (PAI)-1 antigen (PAI-1 Ag), homocysteine and lipid profile were evaluated before and after the 2-month treatment with DHEA or placebo. At the baseline, all variables in the two groups were overlapping. All parameters were unchanged after treatment with placebo. Conversely, treatment with DHEA (a) increased (P < 0.001 vs. baseline) platelet cGMP (111.9 +/- 7.1 vs. 50.1 +/- 4.1 fmol/10(6) plts), DHEA-S (13.6 +/- 0.8 vs. 3.0 +/- 0.3 micromol/l), DHEA (23.6 +/- 1.7 vs. 15.3 +/- 1.4 nmol/l), testosterone (23.6 +/- 1.0 vs. 17.7 +/- 1.0 nmol/l) and E(2) (72.0 +/- 5.0 vs. 60.0 +/- 4.0 pmol/l); and (b) decreased (P < 0.05 vs. baseline) PAI-1 Ag (27.4 +/- 3.8 vs. 21.5 +/- 2.5 ng/ml) and low-density lipoprotein (LDL) cholesterol (3.4 +/- 0.2 vs. 3.0 +/- 0.2 mmol/l). IGF-I, insulin, glucose, triglycerides, total cholesterol, HDL cholesterol, HDL2 cholesterol, HDL3 cholesterol, apolipoprotein A1 (ApoA1), apolipoprotein B (ApoB) and homocysteine levels were not modified by DHEA treatment. This study shows that short-term treatment with DHEA increased platelet cGMP production, a marker of NO production, in healthy elderly subjects. This effect is coupled with a decrease in PAI-1

  5. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  6. A statistical observation on some subjects in the whole body computed tomographic examination

    International Nuclear Information System (INIS)

    Murakami, Shozo; Matsumoto, Shigekazu; Murakawa, Yasuhiro; Morimoto, Mitsuo; Nakai, Toshio

    1983-01-01

    Since the whole body CT (computed tomography) unit (GE, CT/T) was installed in our hospital in April 1982, a total of 2884 cases were examined with this whole body scanner during the one year from April 1982 to March 1983. An analysis of the relationship between the situations of the subjects and the results of whole body CT examination disclosed some very interesting facts. Up to the present time such a study has scarcely been made, which is why we wanted to make this report. The results obtained are as follows: 1. Whole body CT examinations were performed on patients of advanced age more frequently than on young patients, and most frequently on the group in their sixties. 2. CT examinations of the head and abdomen accounted for 86.7% of the total of 2884 cases. 3. Enhanced CT examinations were performed in 26.1% of the 2884 cases and most frequently in the group in their teens. 4. The percentage of abnormal findings in the 2884 cases was 61.5%, higher than the rates in the reports we made in 1980 and 1982, respectively. (author)

  7. Two-dimensional speckle-tracking strain echocardiography in long-term heart transplant patients: a study comparing deformation parameters and ejection fraction derived from echocardiography and multislice computed tomography.

    Science.gov (United States)

    Syeda, Bonni; Höfer, Peter; Pichler, Philipp; Vertesich, Markus; Bergler-Klein, Jutta; Roedler, Susanne; Mahr, Stephane; Goliasch, Georg; Zuckermann, Andreas; Binder, Thomas

    2011-07-01

    Longitudinal strain determined by speckle tracking is a sensitive parameter to detect systolic left ventricular dysfunction. In this study, we assessed regional and global longitudinal strain values in long-term heart transplants and compared deformation indices with ejection fraction as determined by transthoracic echocardiography (TTE) and multislice computed tomographic coronary angiography (MSCTA). TTE and MSCTA were prospectively performed in 31 transplant patients (10.6 years post-transplantation) and in 42 control subjects. Grey-scale apical views were recorded for speckle tracking (EchoPAC 7.0, GE) of the 16 segments of the left ventricle. The presence of coronary artery disease (CAD) was assessed by MSCTA. Strain analysis was performed in 1168 segments [496 in transplant patients (42.5%), 672 in control subjects (57.7%)]. Global longitudinal peak systolic strain was significantly lower in the transplant recipients than in the healthy population (-13.9 ± 4.2 vs. -17.4 ± 5.8%). Ejection fraction (Simpson's method) was 60.7 ± 10.1%/60.2 ± 6.7% in transplant recipients vs. 64.7 ± 6.4%/63.0 ± 6.2% in the healthy population (P = ns). Even though 'healthy' heart transplants without CAD exhibit normal ejection fraction, deformation indices are reduced in this population when compared with control subjects. Our findings suggest that strain analysis is more sensitive than assessment of ejection fraction for the detection of abnormalities of systolic function.

  8. Treatment outcome and long-term stability of skeletal changes following maxillary distraction in adult subjects of cleft lip and palate.

    Science.gov (United States)

    Singh, Satinder Pal; Jena, Ashok Kumar; Rattan, Vidya; Utreja, Ashok Kumar

    2012-04-01

    To evaluate the treatment outcome and long-term stability of skeletal changes following maxillary advancement with distraction osteogenesis in adult subjects of cleft lip and palate. A total of 12 North Indian adult patients in the age range of 17-34 years with cleft lip and palate underwent advancement of maxilla by distraction osteogenesis. Lateral cephalograms recorded prior to distraction, at the end of distraction, 6 months after distraction, and at least 24 months (mean 25.5 ± 1.94 months) after distraction osteogenesis were used for the evaluation of treatment outcome and long-term stability of the skeletal changes. Descriptive analysis, ANOVA, and post-hoc tests were used, and a P-value < 0.05 was considered statistically significant. Maxillary distraction resulted in significant advancement of the maxilla (P < 0.001). Counterclockwise rotation of the palatal plane took place after maxillary distraction. The position of the mandible and facial heights were stable during distraction. During the first 6 months of the post-distraction period, the maxilla showed relapse of approximately 30%. However, after 6 months post distraction, the relapse was very negligible. Successful advancement of the maxilla was achieved by distraction osteogenesis in adult subjects with cleft lip and palate. Most of the relapse occurred during the first 6 months of the post-distraction period, and after that the outcomes were stable.

  9. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  10. Long-term effects of serial anodal tDCS on motion perception in subjects with occipital stroke measured in the unaffected visual hemifield

    Directory of Open Access Journals (Sweden)

    Manuel C Olma

    2013-06-01

    Full Text Available Transcranial direct current stimulation (tDCS) is a novel neuromodulatory tool that has seen early transition to clinical trials, although the high variability of these findings necessitates further studies in clinically relevant populations. The majority of evidence into effects of repeated tDCS is based on research in the human motor system, but it is unclear whether the long-term effects of serial tDCS are motor-specific or transferable to other brain areas. This study aimed to examine whether serial anodal tDCS over the visual cortex can exogenously induce long-term neuroplastic changes in the visual cortex. However, when the visual cortex is affected by a cortical lesion, up-regulated endogenous neuroplastic adaptation processes may alter the susceptibility to tDCS. To this end, motion perception was investigated in the unaffected hemifield of subjects with unilateral visual cortex lesions. Twelve subjects with occipital ischaemic lesions participated in a within-subject, sham-controlled, double-blind study. MRI-registered sham or anodal tDCS (1.5 mA, 20 minutes) was applied on five consecutive days over the visual cortex. Motion perception was tested before and after stimulation sessions and at 14- and 28-day follow-up. After a 16-day interval an identical study block with the other stimulation condition (anodal or sham tDCS) followed. Serial anodal tDCS over the visual cortex resulted in an improvement in motion perception, a function attributed to MT/V5. This effect was still measurable at 14- and 28-day follow-up measurements. Thus, this may represent evidence for long-term tDCS-induced plasticity and has implications for the design of studies examining the time course of tDCS effects in both the visual and motor systems.

  11. In search of Leonardo: computer-based facial image analysis of Renaissance artworks for identifying Leonardo as subject

    Science.gov (United States)

    Tyler, Christopher W.; Smith, William A. P.; Stork, David G.

    2012-03-01

    One of the enduring mysteries in the history of the Renaissance is the adult appearance of the archetypical "Renaissance Man," Leonardo da Vinci. His only acknowledged self-portrait is from an advanced age, and various candidate images of younger men are difficult to assess given the absence of documentary evidence. One clue about Leonardo's appearance comes from the remark of the contemporary historian, Vasari, that the sculpture of David by Leonardo's master, Andrea del Verrocchio, was based on the appearance of Leonardo when he was an apprentice. Taking a cue from this statement, we suggest that the more mature sculpture of St. Thomas, also by Verrocchio, might also have been a portrait of Leonardo. We tested the possibility that Leonardo was the subject for Verrocchio's sculpture by a novel computational technique for the comparison of three-dimensional facial configurations. Based on quantitative measures of similarities, we also assess whether another pair of candidate two-dimensional images are plausibly attributable as being portraits of Leonardo as a young adult. Our results are consistent with the claim that Leonardo is indeed the subject in these works, but we need comparisons with images in larger corpora of candidate artworks before our results achieve statistical significance.
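
    The record does not spell out the comparison algorithm; one generic way to quantify similarity between two sets of three-dimensional facial landmarks, offered purely as an illustration and not as the authors' actual technique, is Procrustes analysis. The landmark coordinates below are invented.

```python
import numpy as np
from scipy.spatial import procrustes

# Hypothetical 3D facial landmarks (e.g., eye corners, nose tip, chin), shape (n_landmarks, 3).
face_a = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.2, 0.1],
                   [0.5, 1.0, 0.3],
                   [0.5, -0.8, 0.2],
                   [0.2, 0.5, 0.9]])
face_b = face_a * 1.1 + 0.05          # a scaled, shifted copy stands in for a second artwork
face_b[0] += [0.02, -0.03, 0.01]      # small local difference between the two "faces"

# Procrustes superimposition removes translation, scale, and rotation;
# the returned disparity (sum of squared point-wise differences) is lower for more similar shapes.
_, _, disparity = procrustes(face_a, face_b)
print(f"Procrustes disparity: {disparity:.5f}")
```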

  12. Impact of long-term and short-term therapies on seminal parameters

    Directory of Open Access Journals (Sweden)

    Jlenia Elia

    2013-04-01

    Full Text Available Aim: The aim of this work was: (i) to evaluate the prevalence of male partners of subfertile couples being treated with long/short-term therapies for non-andrological diseases; (ii) to study their seminal profile for the possible effects of their treatments on spermatogenesis and/or epididymal maturation. Methods: The study group was made up of 723 subjects, aged between 25 and 47 years. Semen analysis was performed according to World Health Organization (WHO) guidelines (1999). The Superimposed Image Analysis System (SIAS), which is based on the computerized superimposition of spermatozoa images, was used to assess sperm motility parameters. Results: The prevalence of subjects taking pharmacological treatments was 22.7% (164/723). The prevalence was 3.7% (27/723) for the Short-Term Group and 18.9% (137/723) for the Long-Term Group. The subjects of each group were also subdivided into subgroups according to the treatments being received. Regarding the seminal profile, we did not observe a significant difference between the Long-Term, Short-Term or the Control Group. However, regarding the subgroups, we found a significant decrease in sperm number and progressive motility percentage in the subjects receiving treatment with antihypertensive drugs compared with the other subgroups and the Control Group. Conclusions: In the management of infertile couples, the potential negative impact on seminal parameters of any drugs being taken as Long-Term Therapy should be considered. The pathogenic mechanism needs to be clarified.

  13. Improve Outcomes Study subjects Chemistry Teaching and Learning Strategies through independent study with the help of computer-based media

    Science.gov (United States)

    Sugiharti, Gulmah

    2018-03-01

    This study aims to see the improvement of student learning outcomes through independent learning using computer-based learning media in the STBM (Teaching and Learning Strategy) Chemistry course. The population in this research was all students of the class of 2014 taking the STBM Chemistry subject, comprising four classes. The sample was taken purposively: two classes of 32 students each, serving as the control class and the experimental class. The instrument used was a multiple-choice learning outcomes test of 20 questions that had been declared valid and reliable. Data were analyzed with a one-sided t test, and the improvement in learning outcomes was measured with a normalized gain test. Based on the learning outcome data, the average normalized gain was 0.530 for the experimental class and 0.224 for the control class, corresponding to improvements of 53% and 22.4%, respectively. Hypothesis testing gave t_count > t_table (9.02 > 1.6723) at the significance level α = 0.05 with df = 58. This means that Ha is accepted: the use of computer-based learning media (CAI) can improve student learning outcomes in the Teaching and Learning Strategy (STBM) Chemistry course in the 2017/2018 academic year.
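
    The analysis reported here combines a normalized gain score with a one-sided t test. A small sketch of both computations follows; the class sizes match the record (two classes of 32 students), but the pre/post scores are randomly generated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: g = (post - pre) / (max_score - pre)."""
    pre, post = np.asarray(pre, dtype=float), np.asarray(post, dtype=float)
    return (post - pre) / (max_score - pre)

rng = np.random.default_rng(1)
pre_exp, post_exp = rng.uniform(30, 60, 32), rng.uniform(55, 90, 32)   # experimental class
pre_ctl, post_ctl = rng.uniform(30, 60, 32), rng.uniform(40, 70, 32)   # control class

g_exp = normalized_gain(pre_exp, post_exp)
g_ctl = normalized_gain(pre_ctl, post_ctl)
print(f"mean gain: experimental {g_exp.mean():.3f}, control {g_ctl.mean():.3f}")

# One-sided test: is the experimental-class gain greater than the control-class gain?
t_stat, p_value = stats.ttest_ind(g_exp, g_ctl, alternative="greater")
print(f"t = {t_stat:.2f}, one-sided p = {p_value:.4f}")
```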

  14. Activity-Driven Computing Infrastructure - Pervasive Computing in Healthcare

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak; Olesen, Anders Konring

    In many work settings, and especially in healthcare, work is distributed among many cooperating actors, who are constantly moving around and are frequently interrupted. In line with other researchers, we use the term pervasive computing to describe a computing infrastructure that supports work...

  15. Internet Use for Health-Care Information by Subjects With COPD.

    Science.gov (United States)

    Delgado, Cionéia K; Gazzotti, Mariana R; Santoro, Ilka L; Carvalho, Andrea K; Jardim, José R; Nascimento, Oliver A

    2015-09-01

    Although the internet is an important tool for entertainment, work, learning, shopping, and communication, it is also a possible source for information on health and disease. The aim of this study was to evaluate the proportion of subjects with COPD in São Paulo, Brazil, who use the internet to obtain information about their disease. Subjects (N = 382) with COPD answered a 17-question survey, including information regarding computer use, internet access, and searching for sites on COPD. Our sample was distributed according to the socioeconomic levels of the Brazilian population (low, 17.8%; medium, 66.5%; and high, 15.7%). Most of the subjects in the sample were male (62.6%), with a mean age of 67.0 ± 9.9 y. According to Global Initiative for Chronic Obstructive Lung Disease (GOLD) stages, 74.3% of the subjects were in stage II or III. In addition, 51.6% of the subjects had a computer, 49.7% accessed the internet, and 13.9% used it to search for information about COPD. The internet was predominantly accessed by male (70.3%) and younger (64.6 ± 9.5 y of age) subjects compared with female (29.7%, P = .04) and older (67.5 ± 9.6 y of age) subjects. Searching the internet for information on COPD was associated with having a computer (5.9-fold), Medical Research Council dyspnea level 1 (5.3-fold), and high social class (8.4-fold). The search for information on COPD was not influenced by GOLD staging. A low percentage of subjects with COPD in São Paulo use the internet as a tool to obtain information about their disease. This search is associated with having a computer, low dyspnea score, and high socioeconomic level. Copyright © 2015 by Daedalus Enterprises.

  16. Eye movement analysis of reading from computer displays, eReaders and printed books.

    Science.gov (United States)

    Zambarbieri, Daniela; Carniglia, Elena

    2012-09-01

    To compare eye movements during silent reading of three eBooks and a printed book. The three different eReading tools were a desktop PC, iPad tablet and Kindle eReader. Video-oculographic technology was used for recording eye movements. In the case of reading from the computer display the recordings were made by a video camera placed below the computer screen, whereas for reading from the iPad tablet, eReader and printed book the recording system was worn by the subject and had two cameras: one for recording the movement of the eyes and the other for recording the scene in front of the subject. Data analysis provided quantitative information in terms of number of fixations, their duration, and the direction of the movement, the latter to distinguish between fixations and regressions. Mean fixation duration was different only in reading from the computer display, and was similar for the Tablet, eReader and printed book. The percentage of regressions with respect to the total amount of fixations was comparable for eReading tools and the printed book. The analysis of eye movements during reading an eBook from different eReading tools suggests that subjects' reading behaviour is similar to reading from a printed book. © 2012 The College of Optometrists.
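
    The analysis above separates fixations from regressions by the direction of movement. The toy function below illustrates that bookkeeping for left-to-right reading; the gaze coordinates are invented, and real analyses would work on calibrated fixation data rather than raw pixel positions.

```python
def count_regressions(fixation_x):
    """Count fixations that land to the left of the previous fixation
    (a leftward jump is treated as a regression in left-to-right reading)."""
    return sum(1 for prev, cur in zip(fixation_x, fixation_x[1:]) if cur < prev)

# Invented horizontal fixation positions (pixels) along one line of text.
xs = [120, 180, 260, 240, 330, 410, 380, 460]
n_reg = count_regressions(xs)
print(f"{n_reg} regressions out of {len(xs)} fixations ({100 * n_reg / len(xs):.1f}%)")
```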

  17. Comparative short-term effects of two thoracic spinal manipulation techniques in subjects with chronic mechanical neck pain: a randomized controlled trial.

    Science.gov (United States)

    Casanova-Méndez, Amaloha; Oliva-Pascual-Vaca, Angel; Rodriguez-Blanco, Cleofás; Heredia-Rizo, Alberto Marcos; Gogorza-Arroitaonandia, Kristobal; Almazán-Campos, Ginés

    2014-08-01

    Spinal Manipulation (SM) has been purported to decrease pain and improve function in subjects with non-specific neck pain. Previous research has investigated which individuals with non-specific neck pain will be more likely to benefit from SM. It has not yet been proven whether or not the effectiveness of thoracic SM depends on the specific technique being used. This double-blind randomized trial has compared the short-term effects of two thoracic SM maneuvers in subjects with chronic non-specific neck pain. Sixty participants were distributed randomly into two groups. One group received the Dog technique (n = 30), with the subject in supine position, and the other group underwent the Toggle-Recoil technique (n = 30), with the participant lying prone, T4 being the targeted area in both cases. Evaluations were made of self-reported neck pain (Visual Analogue Scale); neck mobility (Cervical Range of Motion); and pressure pain threshold at the cervical and thoracic levels (C4 and T4 spinous process) and over the site described for location of tense bands of the upper trapezius muscle. Measurements were taken before intervention, immediately afterward, and 20 min later. Both maneuvers improved neck mobility and mechanosensitivity and reduced pain in the short term. No major or clinical differences were found between the groups. In the between-groups comparison slightly better results were observed in the Toggle-Recoil group only for cervical extension (p = 0.009), right lateral flexion (p = 0.004) and left rotation (p < 0.05). Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Towards Dynamic Remote Data Auditing in Computational Clouds

    Science.gov (United States)

    Khurram Khan, Muhammad; Anuar, Nor Badrul

    2014-01-01

    Cloud computing is a significant shift of computational paradigm where computing as a utility and storing data remotely have a great potential. Enterprise and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the lack of control and physical possession of the data owners. To better streamline this issue, researchers have now focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are only applicable for static archive data and are not subject to audit the dynamically updated outsourced data. We propose an effectual RDA technique based on algebraic signature properties for cloud storage system and also present a new data structure capable of efficiently supporting dynamic data operations like append, insert, modify, and delete. Moreover, this data structure empowers our method to be applicable for large-scale data with minimum computation cost. The comparative analysis with the state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and server. PMID:25121114

  19. Towards Dynamic Remote Data Auditing in Computational Clouds

    Directory of Open Access Journals (Sweden)

    Mehdi Sookhak

    2014-01-01

    Full Text Available Cloud computing is a significant shift of computational paradigm where computing as a utility and storing data remotely have a great potential. Enterprise and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the lack of control and physical possession of the data owners. To better streamline this issue, researchers have now focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are only applicable for static archive data and are not subject to audit the dynamically updated outsourced data. We propose an effectual RDA technique based on algebraic signature properties for cloud storage system and also present a new data structure capable of efficiently supporting dynamic data operations like append, insert, modify, and delete. Moreover, this data structure empowers our method to be applicable for large-scale data with minimum computation cost. The comparative analysis with the state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and server.

  20. Utility of the computed tomography indices on cone beam computed tomography images in the diagnosis of osteoporosis in women

    International Nuclear Information System (INIS)

    Koh, Kwang Joon; Kim, Kyung A

    2011-01-01

    This study evaluated the potential use of the computed tomography indices (CTI) on cone beam CT (CBCT) images for an assessment of the bone mineral density (BMD) in postmenopausal osteoporotic women. Twenty-one postmenopausal osteoporotic women and 21 postmenopausal healthy women were enrolled as the subjects. The BMDs of the lumbar vertebrae and femur were calculated by dual energy X-ray absorptiometry (DXA) using a DXA scanner. The CBCT images were obtained from the unilateral mental foramen region using a PSR-9000N Dental CT system. The axial, sagittal, and coronal images were reconstructed from the block images using OnDemand3D. The new term 'CTI' on CBCT images was proposed. The relationship between the CT measurements and BMDs was assessed and the intra-observer agreement was determined. There were significant differences between the normal and osteoporotic groups in the computed tomography mandibular index superior (CTI(S)), computed tomography mandibular index inferior (CTI(I)), and computed tomography cortical index (CTCI). On the other hand, there was no difference between the groups in the computed tomography mental index (CTMI: inferior cortical width). CTI(S), CTI(I), and CTCI on the CBCT images can be used to assess osteoporotic women.

  1. Treatment outcome and long-term stability of skeletal changes following maxillary distraction in adult subjects of cleft lip and palate

    OpenAIRE

    Satinder Pal Singh; Ashok Kumar Jena; Vidya Rattan; Ashok Kumar Utreja

    2012-01-01

    Aim: To evaluate the treatment outcome and long-term stability of skeletal changes following maxillary advancement with distraction osteogenesis in adult subjects of cleft lip and palate. Materials and Methods: A total of 12 North Indian adult patients in the age range of 17-34 years with cleft lip and palate underwent advancement of maxilla by distraction osteogenesis. Lateral cephalograms recorded prior to distraction, at the end of distraction, 6 months after distraction, and at least 24 month...

  2. Beneficial Effects of Long-Term CPAP Treatment on Sleep Quality and Blood Pressure in Adherent Subjects With Obstructive Sleep Apnea.

    Science.gov (United States)

    Yang, Mei-Chen; Huang, Yi-Chih; Lan, Chou-Chin; Wu, Yao-Kuang; Huang, Kuo-Feng

    2015-12-01

    Obstructive sleep apnea (OSA) is associated with increased risk of cardiovascular diseases. Although CPAP is the first treatment choice for moderate-to-severe OSA, acceptance of and adherence to CPAP remain problematic. High CPAP adherence is generally defined as ≥4 h of use/night for ≥70% of the nights monitored. We investigated the long-term beneficial effects of CPAP on sleep quality and blood pressure in subjects with moderate-to-severe OSA according to high or low CPAP adherence. We retrospectively analyzed 121 subjects with moderate-to-severe OSA from August 2008 to July 2012. These subjects were divided into 3 groups: (1) no CPAP treatment (n = 29), (2) low CPAP adherence (n = 28), and (3) high CPAP adherence (n = 64). All subjects were followed up for at least 1 y. The 3 groups were compared regarding anthropometric and polysomnographic variables, presence of cardiovascular comorbidities, and blood pressure at baseline and at the last follow-up. The no-treatment group showed significant increases in oxygen desaturation index and blood pressure. The high-adherence group showed significant improvement in daytime sleepiness, apnea-hypopnea index (AHI), oxygen desaturation index, and blood pressure. Although the AHI was also significantly decreased after CPAP treatment in the low-adherence group, blood pressure remained unchanged. CPAP treatment had beneficial effects on both sleep quality and blood pressure only in subjects with OSA and high CPAP adherence who used CPAP for ≥4 h/night for ≥70% of nights monitored. Subjects with low CPAP adherence received beneficial effects on AHI, but not blood pressure. Copyright © 2015 by Daedalus Enterprises.
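
    The adherence threshold used in this record (at least 4 h of use per night on at least 70% of monitored nights) maps directly onto a simple check. The function below is an illustrative sketch rather than the study's scoring software, and the nightly usage values are hypothetical.

```python
def high_cpap_adherence(hours_per_night, min_hours=4.0, min_fraction=0.70):
    """True if at least `min_fraction` of monitored nights had >= `min_hours` of CPAP use."""
    nights_ok = sum(h >= min_hours for h in hours_per_night)
    return nights_ok / len(hours_per_night) >= min_fraction

# Example: 5 of 7 nights reach 4 h, i.e. ~71% of nights, so adherence is classified as high.
print(high_cpap_adherence([6.5, 4.2, 0.0, 5.1, 7.0, 3.0, 4.8]))  # True
```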

  3. Dry eye syndrome among computer users

    Science.gov (United States)

    Gajta, Aurora; Turkoanje, Daniela; Malaescu, Iosif; Marin, Catalin-Nicolae; Koos, Marie-Jeanne; Jelicic, Biljana; Milutinovic, Vuk

    2015-12-01

    Dry eye syndrome is characterized by eye irritation due to changes of the tear film. Symptoms include itching, foreign body sensations, mucous discharge and transitory vision blurring. Less frequent symptoms include photophobia and eye tiredness. The aim of the work was to determine the quality of the tear film and the potential risk of ocular dryness in persons who spend more than 8 hours a day using computers, and to assess possible correlations between severity of symptoms (dry eye symptoms anamnesis) and clinical signs assessed by: Schirmer test I, TBUT (tears break-up time), and TFT (tear ferning test). The results show that subjects using computers have significantly shorter TBUT (less than 5 s for 56% of subjects and less than 10 s for 37% of subjects); TFT type II/III was found in 50% of subjects and type III in 31% of subjects, compared with computer non-users (TFT type I and II was present in 85.71% of subjects). Visual display terminal use for more than 8 hours daily has been identified as a significant risk factor for dry eye. All persons who spend substantial time using computers are advised to use artificial tear drops in order to minimize the symptoms of dry eye syndrome and prevent serious complications.

  4. Central tarsal bone fractures in horses not used for racing: Computed tomographic configuration and long-term outcome of lag screw fixation

    OpenAIRE

    Gunst, S; Del Chicca, Francesca; Fürst, Anton; Kuemmerle, Jan M

    2016-01-01

    REASONS FOR PERFORMING STUDY: There are no reports on the configuration of equine central tarsal bone fractures based on cross-sectional imaging and clinical and radiographic long-term outcome after internal fixation. OBJECTIVES: To report clinical, radiographic and computed tomographic findings of equine central tarsal bone fractures and to evaluate the long-term outcome of internal fixation. STUDY DESIGN: Retrospective case series. METHODS: All horses diagnosed with a central tarsa...

  5. Subjective and objective outcomes in randomized clinical trials

    DEFF Research Database (Denmark)

    Moustgaard, Helene; Bello, Segun; Miller, Franklin G

    2014-01-01

    OBJECTIVES: The degree of bias in randomized clinical trials varies depending on whether the outcome is subjective or objective. Assessment of the risk of bias in a clinical trial will therefore often involve categorization of the type of outcome. Our primary aim was to examine how the concepts "subjective outcome" and "objective outcome" are defined in methodological publications and clinical trial reports. To put this examination into perspective, we also provide an overview of how outcomes are classified more broadly. STUDY DESIGN AND SETTING: A systematic review of methodological publications… explicitly defined the terms. CONCLUSION: The terms "subjective" and "objective" are ambiguous when used to describe outcomes in randomized clinical trials. We suggest that the terms should be defined explicitly when used in connection with the assessment of risk of bias in a clinical trial…

  6. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order

  7. Concerned with computer games

    DEFF Research Database (Denmark)

    Chimiri, Niklas Alexander; Andersen, Mads Lund; Jensen, Tine

    2018-01-01

    In this chapter, we focus on a particular matter of concern within computer gaming practices: the concern of being or not being a gamer. This matter of concern emerged from within our collective investigations of gaming practices across various age groups. The empirical material under scrutiny... was generated across a multiplicity of research projects, predominantly conducted in Denmark. The question of being versus not being a gamer, we argue, exemplifies interesting enactments of how computer game players become both concerned with and concerned about their gaming practices. As a collective... of researchers writing from the field of psychology and inspired by neo-materialist theories, we are particularly concerned with (human) subjectivity and processes of social and subjective becoming. Our empirical examples show that concerns/worries about computer games and being engaged with computer game...

  8. Treatment outcome and long-term stability of skeletal changes following maxillary distraction in adult subjects of cleft lip and palate

    Directory of Open Access Journals (Sweden)

    Satinder Pal Singh

    2012-01-01

    Full Text Available Aim: To evaluate the treatment outcome and long-term stability of skeletal changes following maxillary advancement with distraction osteogenesis in adult subjects of cleft lip and palate. Materials and Methods: A total of 12 North Indian adult patients in the age range of 17-34 years with cleft lip and palate underwent advancement of maxilla by distraction osteogenesis. Lateral cephalograms recorded prior to distraction, at the end of distraction, 6 months after distraction, and at least 24 months (mean 25.5 ± 1.94 months) after distraction osteogenesis were used for the evaluation of treatment outcome and long-term stability of the skeletal changes. Descriptive analysis, ANOVA, and post-hoc tests were used, and a P-value < 0.05 was considered statistically significant. Results: Maxillary distraction resulted in significant advancement of the maxilla (P < 0.001). Counterclockwise rotation of the palatal plane took place after maxillary distraction. The position of the mandible and facial heights were stable during distraction. During the first 6 months of the post-distraction period, the maxilla showed relapse of approximately 30%. However, after 6 months post distraction, the relapse was very negligible. Conclusions: Successful advancement of the maxilla was achieved by distraction osteogenesis in adult subjects with cleft lip and palate. Most of the relapse occurred during the first 6 months of the post-distraction period, and after that the outcomes were stable.

  9. Negative emotion enhances mnemonic precision and subjective feelings of remembering in visual long-term memory.

    Science.gov (United States)

    Xie, Weizhen; Zhang, Weiwei

    2017-09-01

    Negative emotion sometimes enhances memory (higher accuracy and/or vividness, e.g., flashbulb memories). The present study investigates whether it is the qualitative (precision) or quantitative (the probability of successful retrieval) aspect of memory that drives these effects. In a visual long-term memory task, observers memorized colors (Experiment 1a) or orientations (Experiment 1b) of sequentially presented everyday objects under negative, neutral, or positive emotions induced with International Affective Picture System images. In a subsequent test phase, observers reconstructed objects' colors or orientations using the method of adjustment. We found that mnemonic precision was enhanced under the negative condition relative to the neutral and positive conditions. In contrast, the probability of successful retrieval was comparable across the emotion conditions. Furthermore, the boost in memory precision was associated with elevated subjective feelings of remembering (vividness and confidence) and metacognitive sensitivity in Experiment 2. Altogether, these findings suggest a novel precision-based account for emotional memories. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Functional diversity of Collembola is reduced in soils subjected to short-term, but not long-term, geothermal warming

    DEFF Research Database (Denmark)

    Holmstrup, Martin; Ehlers, Bodil K.; Slotsbo, Stine

    2018-01-01

    the extent of such effects in long-term field-based experiments. In this study we make use of both recent (short-term) and long-term geothermal warming of Icelandic soils to examine the responses of Collembola, an ecologically important group of soil invertebrates, to warming. 2. On the basis of metabolic...

  11. The Nature of Computational Thinking in Computing Education

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid; Brynskov, Martin

    2018-01-01

    Computational Thinking has gained popularity in recent years within educational and political discourses. It is more than ever crucial to discuss the term itself and what it means. In June 2017, Denning articulated that computational thinking can be viewed as either “traditional” or “new”. New...

  12. Developing Long-Term Computing Skills among Low-Achieving Students via Web-Enabled Problem-Based Learning and Self-Regulated Learning

    Science.gov (United States)

    Tsai, Chia-Wen; Lee, Tsang-Hsiung; Shen, Pei-Di

    2013-01-01

    Many private vocational schools in Taiwan have taken to enrolling students with lower levels of academic achievement. The authors re-designed a course and conducted a series of quasi-experiments to develop students' long-term computing skills, and examined the longitudinal effects of web-enabled, problem-based learning (PBL) and self-regulated…

  13. Computer Access and Flowcharting as Variables in Learning Computer Programming.

    Science.gov (United States)

    Ross, Steven M.; McCormick, Deborah

    Manipulation of flowcharting was crossed with in-class computer access to examine flowcharting effects in the traditional lecture/laboratory setting and in a classroom setting where online time was replaced with manual simulation. Seventy-two high school students (24 male and 48 female) enrolled in a computer literacy course served as subjects.…

  14. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    Science.gov (United States)

    Wu, Dongrui; Lance, Brent J; Parsons, Thomas D

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
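
    The collaborative-filtering idea described here pools a user's few labeled samples with auxiliary samples from similar subjects before training a classifier such as k nearest neighbors. The sketch below captures that idea in a simplified form; the similarity heuristic, threshold, and synthetic data are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def similarity(user_X, other_X):
    """Illustrative heuristic: larger when the mean squared difference of feature means is small."""
    msd = np.mean((user_X.mean(axis=0) - other_X.mean(axis=0)) ** 2)
    return 1.0 / (1.0 + msd)

def train_with_auxiliary_subjects(user_X, user_y, aux_subjects, sim_threshold=0.5, k=5):
    """Augment the user's small training set with data from sufficiently similar subjects."""
    Xs, ys = [user_X], [user_y]
    for other_X, other_y in aux_subjects:
        if similarity(user_X, other_X) >= sim_threshold:
            Xs.append(other_X)
            ys.append(other_y)
    clf = KNeighborsClassifier(n_neighbors=k)
    clf.fit(np.vstack(Xs), np.concatenate(ys))
    return clf

# Synthetic example: 10 user-specific samples plus two auxiliary subjects with 100 samples each.
rng = np.random.default_rng(2)
user_X, user_y = rng.normal(size=(10, 4)), rng.integers(0, 2, size=10)
aux = [(rng.normal(size=(100, 4)), rng.integers(0, 2, size=100)) for _ in range(2)]
clf = train_with_auxiliary_subjects(user_X, user_y, aux)
print(clf.predict(rng.normal(size=(3, 4))))
```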

  15. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    Directory of Open Access Journals (Sweden)

    Dongrui Wu

    Full Text Available Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.

  16. Computer-aided diagnosis in chest radiology.

    Science.gov (United States)

    MacMahon, H; Doi, K; Chan, H P; Giger, M L; Katsuragawa, S; Nakamori, N

    1990-01-01

    Digital radiography offers several important advantages over conventional systems, including abilities for image manipulation, transmission, and storage. In the long term, however, the unique ability to apply artificial intelligence techniques for automated detection and quantitation of disease may have an even greater impact on radiologic practice. Although CAD is still in its infancy, the results of several recent studies clearly indicate a major potential for the future. The concept of using computers to analyze medical images is not new, but recent advances in computer technology together with progress in implementing practical digital radiography systems have stimulated research efforts in this exciting field. Several facets of CAD are presently being developed at the University of Chicago and elsewhere for application in chest radiology as well as in mammography and vascular imaging. To date, investigators have focused on a limited number of subjects that have been, by their nature, particularly suitable for computer analysis. There is no aspect of radiologic diagnosis that could not potentially benefit from this approach, however. The ultimate goal of these endeavors is to provide a system for comprehensive automated image analysis, the results of which could be accepted or modified at the discretion of the radiologist.

  17. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide on how to solve problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes’ structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques: the finite element formulation, the boundary element formulation, and solutions of viscoelastic problems with Abaqus.

  18. Use of Debye's series to determine the optimal edge-effect terms for computing the extinction efficiencies of spheroids.

    Science.gov (United States)

    Lin, Wushao; Bi, Lei; Liu, Dong; Zhang, Kejun

    2017-08-21

    The extinction efficiencies of atmospheric particles are essential to determining radiation attenuation and thus are fundamentally related to atmospheric radiative transfer. The extinction efficiencies can also be used to retrieve particle sizes or refractive indices through particle characterization techniques. This study first uses the Debye series to improve the accuracy of high-frequency extinction formulae for spheroids in the context of Complex angular momentum theory by determining an optimal number of edge-effect terms. We show that the optimal edge-effect terms can be accurately obtained by comparing the results from the approximate formula with their counterparts computed from the invariant imbedding Debye series and T-matrix methods. An invariant imbedding T-matrix method is employed for particles with strong absorption, in which case the extinction efficiency is equivalent to two plus the edge-effect efficiency. For weakly absorptive or non-absorptive particles, the T-matrix results contain the interference between the diffraction and higher-order transmitted rays. Therefore, the Debye series was used to compute the edge-effect efficiency by separating the interference from the transmission on the extinction efficiency. We found that the optimal number strongly depends on the refractive index and is relatively insensitive to the particle geometry and size parameter. By building a table of optimal numbers of edge-effect terms, we developed an efficient and accurate extinction simulator that has been fully tested for randomly oriented spheroids with various aspect ratios and a wide range of refractive indices.

  19. Operational characteristics optimization of human-computer system

    Directory of Open Access Journals (Sweden)

    Zulquernain Mallick

    2010-09-01

    Full Text Available Computer operational parameters have a vital influence on operator efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle and viewing distance, are analyzed. The text reading task, in the form of English text, was presented on the computer screen to the participating subjects, and their performance, measured in terms of the number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used to find the optimal parameters that maximize operators’ efficiency in performing the readability task. Two levels of each parameter have been considered in this study. An orthogonal array, the signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) were employed to investigate the operators’ performance/efficiency. Results showed that with Times Roman font, black text on a white background, a 40-degree viewing angle and a 60 cm viewing distance, the subjects were most comfortable and efficient and read the maximum number of words per minute. Text/background color was the dominant parameter, with a percentage contribution of 76.18% towards the laid-down objective, followed by font type at 18.17%, viewing distance at 7.04% and viewing angle at 0.58%. Experimental results are provided to confirm the effectiveness of this approach.
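
    To make the optimization criterion concrete, the short sketch below computes the Taguchi “larger is better” signal-to-noise ratio that would be applied to replicate readings of words read per minute for one experimental run; the function name and the sample values are hypothetical, and the study's actual orthogonal-array analysis is not reproduced here.

```python
import numpy as np

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a 'larger is better' response such as NWRPM:
    -10 * log10(mean(1 / y_i^2))."""
    y = np.asarray(values, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicate readings (words per minute) for one run of the array.
print(round(sn_larger_is_better([182, 190, 175]), 2))
```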

  20. The effect of reduced atmospheric deposition on soil and soil solution chemistry at a site subjected to long-term acidification, Nacetin, Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Oulehle, F.; Hofmeister, J.; Cudlín, Pavel; Hruška, J.

    2006-01-01

    Roč. 370, 2-3 (2006), s. 532-544 ISSN 0048-9697 R&D Projects: GA ČR(CZ) GA526/03/0058 Institutional research plan: CEZ:AV0Z60870520 Keywords : long-term monitoring Norway spruce * Recovery * Soil solution * Base cations * Nitrogen * Norway spruce Subject RIV: DD - Geochemistry Impact factor: 2.359, year: 2006

  1. Development of computer-based function to estimate radioactive source term by coupling atmospheric model with monitoring data

    International Nuclear Information System (INIS)

    Akiko, Furuno; Hideyuki, Kitabata

    2003-01-01

    Full text: The importance of computer-based decision support systems for local and regional scale accidents has been recognized by many countries with the experiences of accidental atmospheric releases of radionuclides at Chernobyl in 1986 in the former Soviet Union. The recent increase of nuclear power plants in the Asian region also necessitates an emergency response system for Japan to predict the long-range atmospheric dispersion of radionuclides due to overseas accident. On the basis of these backgrounds, WSPEEDI (Worldwide version of System for Prediction of Environmental Emergency Dose Information) at Japan Atomic Energy Research Institute is developed to forecast long-range atmospheric dispersions of radionuclides during nuclear emergency. Although the source condition is critical parameter for accurate prediction, it is rarely that the condition can be acquired in the early stage of overseas accident. Thus, we have been developing a computer-based function to estimate radioactive source term, e.g. the release point, time and amount, as a part of WSPEEDI. This function consists of atmospheric transport simulations and statistical analysis for the prediction and monitoring of air dose rates. Atmospheric transport simulations are carried out for the matrix of possible release points in Eastern Asia and possible release times. The simulation results of air dose rates are compared with monitoring data and the best fitted release condition is defined as source term. This paper describes the source term estimation method and the application to Eastern Asia. The latest version of WSPEEDI accommodates following two models: an atmospheric meteorological model MM5 and a particle random walk model GEARN. MM5 is a non-hydrostatic meteorological model developed by the Pennsylvania State University and the National Center for Atmospheric Research (NCAR). MM5 physically calculates more than 40 meteorological parameters with high resolution in time and space based an
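
    The description above amounts to a search over candidate release conditions for the best agreement with the monitoring data. A minimal sketch of that idea, assuming unit-release simulation results are available for each candidate point and time, is given below; the data structures and function name are hypothetical and the actual WSPEEDI implementation is not reproduced.

```python
import numpy as np

def estimate_source_term(candidates, observed):
    """Pick the candidate (release point, release time) whose simulated air dose
    rates, scaled by a fitted release amount, best match the monitoring data.
    Each candidate dict holds 'point', 'time' and 'dose_rates' for a unit release."""
    obs = np.asarray(observed, dtype=float)
    best = None
    for c in candidates:
        sim = np.asarray(c["dose_rates"], dtype=float)
        # Dose rate scales linearly with the released amount, so fit the scale first.
        amount = float(sim @ obs) / float(sim @ sim)
        misfit = float(np.sum((amount * sim - obs) ** 2))
        if best is None or misfit < best["misfit"]:
            best = {"point": c["point"], "time": c["time"],
                    "amount": amount, "misfit": misfit}
    return best
```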

  2. Short-term changes in temporomandibular joint function in subjects with cleft lip and palate treated with maxillary distraction osteogenesis.

    Science.gov (United States)

    Hashimoto, K; Otsuka, R; Minato, A; Sato-Wakabayashi, M; Takada, J; Inoue-Arai, M S; Miyamoto, J J; Ono, T; Ohyama, K; Moriyama, K

    2008-05-01

    To investigate the short-term effects of maxillary distraction osteogenesis (DO) on temporomandibular joint (TMJ) function in 21 subjects with cleft lip and palate (CLP). Design - Morphological changes in the maxillofacial region were measured using lateral cephalometric radiographs taken immediately before (pre-DO) and after DO (post-DO) and 1 year after DO (1-year follow-up). A questionnaire was evaluated using a visual analog scale. A chi-square test was used to compare the prevalence of TMJ symptoms between pre-DO and 1-year follow-up. The Spearman correlation coefficient was used to determine the correlation between changes in cephalometric variables and TMJ symptoms in association with maxillary DO. Statistical significance was set at p < 0.05. Results - The ANB (anteroposterior relationship of the maxilla with the mandible) angle and the mandibular plane angle at pre-DO, post-DO, and 1-year follow-up were -4.3 degrees , +5.8 degrees , +4.3 degrees and 32.1 degrees , 33.5 degrees , 33.6 degrees , respectively. The average amounts of anterior and downward movement of the maxilla at post-DO and 1-year follow-up were 8.3, -1.3 and 0.9, 1.1 mm, respectively. The prevalence of TMJ symptoms showed no significant increase in association with maxillary DO. Moreover, there was no significant correlation between changes in cephalometric variables and TMJ symptoms. Conclusion - These results suggest that there was no short-term (i.e., up to 1 year after DO) effect of maxillary DO on TMJ function in subjects with CLP.

  3. Controlled Terms or Free Terms? A JavaScript Library to Utilize Subject Headings and Thesauri on the Web

    Directory of Open Access Journals (Sweden)

    Shun Nagaya

    2011-10-01

    Full Text Available There are two types of keywords used as metadata: controlled terms and free terms. Free terms have the advantage that metadata creators can freely select keywords, but there also exists a disadvantage that the information retrieval recall ratio might be reduced. The recall ratio can be improved by using controlled terms. But creating and maintaining controlled vocabularies has an enormous cost. In addition, many existing controlled vocabularies are published in formats less suitable for programming. We introduce a JavaScript library called “covo.js” that enables us to make use of controlled vocabularies as metadata for the organization of web pages.

  4. Effectiveness of Using Computer-Assisted Supplementary Instruction for Teaching the Mole Concept

    Science.gov (United States)

    Yalçinalp, Serpil; Geban, Ömer; Özkan, Ilker

    This study examined the effect of computer-assisted instruction (CAI), used as a problem-solving supplement to classroom instruction, on students' understanding of chemical formulas and mole concept, their attitudes toward chemistry subjects, and CAI. The objective was to assess the effectiveness of CAI over recitation hours when both teaching methods were used as a supplement to the traditional chemistry instruction. We randomly selected two classes in a secondary school. Each teaching strategy was randomly assigned to one class. The experimental group received supplementary instruction delivered via CAI, while the control group received similar instruction through recitation hours. The data were analyzed using two-way analysis of variance and t-test. It was found that the students who used the CAI accompanied with lectures scored significantly higher than those who attended recitation hours, in terms of school subject achievement in chemistry and attitudes toward chemistry subjects. In addition, there was a significant improvement in the attitudes of students in the experimental group toward the use of computers in a chemistry course. There was no significant difference between the performances of females versus males in each treatment group.Received: 26 April 1994; Revised: 6 April 1995;

  5. Training-induced changes in muscle CSA,muscle strength, EMG and rate of force development in elderly subjects after long-term unilateral disuse

    DEFF Research Database (Denmark)

    Suetta, Charlotte; Aagaard, Per; Rosted, Anne

    2004-01-01

    , maximal isometric strength, RFD, and muscle activation in elderly men and women recovering from long-term muscle disuse and subsequent hip surgery. The improvement in both muscle mass and neural function is likely to have important functional implications for elderly individuals. … Thirty subjects completed the trial. In the strength-training group, significant increases were observed in maximal isometric muscle strength (24%, P impulse (27-32%, P

  6. Computer science handbook. Vol. 13.3. Environmental computer science. Computer science methods for environmental protection and environmental research

    International Nuclear Information System (INIS)

    Page, B.; Hilty, L.M.

    1994-01-01

    Environmental computer science is a new subdiscipline of applied computer science that makes use of methods and techniques of information processing in environmental protection. Thanks to the interdisciplinary nature of environmental problems, computer science acts as a mediator between numerous disciplines and institutions in this sector. The handbook reflects the broad spectrum of state-of-the-art environmental computer science. The following important subjects are dealt with: environmental databases and information systems, environmental monitoring, modelling and simulation, visualization of environmental data, and knowledge-based systems in the environmental sector. (orig.) [de

  7. Safety I and C system platforms - State-of-the-art and long-term available - A contradiction in terms?

    International Nuclear Information System (INIS)

    Richter, Steffen; Martin, Michael

    2006-01-01

    Automation systems, particularly in the field of safety I and C, are subject to conflict between three challenges. Customers' requests for state-of-the-art technology, ever shorter innovation cycles in the electronics industry and computer business and the requirement for long-term spare parts supply demand thorough and sustainable concepts from the supply market. The TELEPERM XS digital safety I and C platform has been applied successfully since 1998 for the modernization of safety I and C systems in over 30 NPP units from different reactor suppliers as well as for new plant construction. The platform is subject to a forward-looking life cycle management program combining an evolutionary and future-oriented approach to platform development with measures for ensuring the long-term support of the installed base. Driven by ever shorter innovation cycles in the electronics and automation industry, the platform is continuously evolved with state-of-the-art technology and enhanced safety features. The continuous innovation process is combined with maximum compatibility of the I and C components that make up the TELEPERM XS system platform. This makes the system future-oriented and simultaneously assures long-term availability of replacement parts. In this way TELEPERM XS meets the customer requirements for up-to-date but proven technology suitable to ensure an operating life of safety I and C equipment spanning several decades. As a matter of course, the platform and component development adheres to the robust and proven architecture of TELEPERM XS, thereby limiting risks for equipment qualification and project licensing to a minimum. (authors)

  8. Safety I and C system platforms - State-of-the-art and long-term available - A contradiction in terms?

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Steffen; Martin, Michael [Framatome ANP GmbH, P.O. Box 3220, Freyeslebenstrasse 1, D-91050 Erlangen (Germany)

    2006-07-01

    Automation systems, particularly in the field of safety I and C, are subject to conflict between three challenges. Customers' requests for state-of-the-art technology, ever shorter innovation cycles in the electronics industry and computer business and the requirement for long-term spare parts supply demand thorough and sustainable concepts from the supply market. The TELEPERM XS digital safety I and C platform has been applied successfully since 1998 for the modernization of safety I and C systems in over 30 NPP units from different reactor suppliers as well as for new plant construction. The platform is subject to a forward-looking life cycle management program combining an evolutionary and future-oriented approach to platform development with measures for ensuring the long-term support of the installed base. Driven by ever shorter innovation cycles in the electronics and automation industry, the platform is continuously evolved with state-of-the-art technology and enhanced safety features. The continuous innovation process is combined with maximum compatibility of the I and C components that make up the TELEPERM XS system platform. This makes the system future-oriented and simultaneously assures long-term availability of replacement parts. In this way TELEPERM XS meets the customer requirements for up-to-date but proven technology suitable to ensure an operating life of safety I and C equipment spanning several decades. As a matter of course, the platform and component development adheres to the robust and proven architecture of TELEPERM XS, thereby limiting risks for equipment qualification and project licensing to a minimum. (authors)

  9. Computer science in Dutch secondary education: independent or integrated?

    NARCIS (Netherlands)

    van der Sijde, Peter; Doornekamp, B.G.

    1992-01-01

    Nowadays, in Dutch secondary education, computer science is integrated within school subjects. About ten years ago computer science was considered an independent subject, but in the mid-1980s this idea changed. In our study we investigated whether the objectives of teaching computer science as an

  10. Subjective Organization Calculator for Free Recall

    Directory of Open Access Journals (Sweden)

    Olesya Senkova

    2015-11-01

    Full Text Available The free recall measure has an advantage over other memory measures because it can provide organization measures, which can reveal the strategies participants used to maximize recall. For instance, even when a study list does not show a clear organizational scheme, recall outputs are often far from random, as evidenced by participants recalling the same two or more items together repeatedly across multiple test trials. Unfortunately, computing organizational measures is laborious. The present article introduces a calculator to compute subjective organization (SO) measures. The calculator is based on a popular platform accessible to most researchers and is designed to compute commonly used SO measures for each participant.
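
    One simple, commonly used family of SO measures counts how often the same item pairs are recalled next to each other on consecutive test trials. The sketch below implements that basic pair-counting idea as an illustration only; it is not the calculator described in the article, and the function names are hypothetical.

```python
def adjacent_pairs(recall):
    """Unordered pairs of items recalled next to each other on one trial."""
    return {frozenset(p) for p in zip(recall, recall[1:])}

def intertrial_repetitions(trials):
    """Count item pairs recalled adjacently on consecutive trials, a simple
    pair-frequency style index of subjective organization."""
    return sum(len(adjacent_pairs(a) & adjacent_pairs(b))
               for a, b in zip(trials, trials[1:]))

# The pair ('dog', 'tree') recurs across the two test trials, so the count is 1.
print(intertrial_repetitions([["dog", "tree", "cup"], ["tree", "dog", "pen"]]))
```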

  11. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    Science.gov (United States)

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null-hypothesis, which avoids the costly computation of inversions. Additional tricks are a rearrangement of the order, when iterating through the different "omics" layers, and implementing this algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/ .
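
    The correlation trick mentioned above can be illustrated with the Frisch-Waugh-Lovell idea: residualize both the outcome and the product term on the main effects, then test the correlation of the residuals, which is equivalent to the usual t-test on the interaction coefficient. The sketch below is a conceptual illustration in Python, not the pulver implementation, and the function name is hypothetical.

```python
import numpy as np
from scipy import stats

def interaction_pvalue(y, x1, x2):
    """p-value for the x1*x2 term in y ~ x1 + x2 + x1*x2 (y, x1, x2: 1-D arrays)."""
    n = len(y)
    Z = np.column_stack([np.ones(n), x1, x2])        # intercept + main effects
    B = np.column_stack([y, x1 * x2])                # outcome and product term
    resid = B - Z @ np.linalg.lstsq(Z, B, rcond=None)[0]
    r = np.corrcoef(resid[:, 0], resid[:, 1])[0, 1]  # partial correlation
    df = n - 4                                       # four coefficients in the full model
    t = r * np.sqrt(df / (1.0 - r ** 2))
    return 2.0 * stats.t.sf(abs(t), df)
```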

  12. Ergonomic evaluation of subjects involved in orange ( Citrus sinensis )

    African Journals Online (AJOL)

    Ergonomic evaluation of subjects involved in orange handling operation in Kano State was conducted. Anthropometric parameters were evaluated, where they were found to vary with age amongst the subjects selected. 20th and 80th percentiles of the dimensions were computed and recommended for usage in design of ...

  13. Computer Self-Efficacy, Computer Anxiety, Performance and Personal Outcomes of Turkish Physical Education Teachers

    Science.gov (United States)

    Aktag, Isil

    2015-01-01

    The purpose of this study is to determine the computer self-efficacy, performance outcome, personal outcome, and affect and anxiety level of physical education teachers. Influence of teaching experience, computer usage and participation of seminars or in-service programs on computer self-efficacy level were determined. The subjects of this study…

  14. Redox proteomics and physiological responses in Cistus albidus shrubs subjected to long-term summer drought followed by recovery.

    Science.gov (United States)

    Brossa, Ricard; Pintó-Marijuan, Marta; Francisco, Rita; López-Carbonell, Marta; Chaves, Maria Manuela; Alegre, Leonor

    2015-04-01

    The interaction between enzymatic and non-enzymatic antioxidants, endogenous levels of ABA and ABA-GE, the rapid recovery of photosynthetic proteins under re-watering, as well as the high level of antioxidant proteins in previously drought-stressed plants under re-watering conditions, will contribute to drought resistance in plants subjected to long-term drought stress under Mediterranean field conditions. This work provides an overview of the mechanisms of Cistus albidus acclimation to long-term summer drought followed by re-watering in Mediterranean field conditions. To better understand the molecular mechanisms of drought resistance in these plants, a proteomic study using 2-DE and MALDI-TOF/TOF MS/MS was performed on leaves from these shrubs. The analysis identified 57 differentially expressed proteins in water-stressed plants when contrasted with well-watered plants. Water-stressed plants showed an increase, both qualitatively and quantitatively, in HSPs, and downregulation of photosynthesis and carbon metabolism enzymes. Under drought conditions, there was considerable upregulation of enzymes related to redox homeostasis, DHA reductase, glyoxalase, SOD and isoflavone reductase. However, upregulation of catalase was not observed until after re-watering was carried out. Drought treatment caused an enhancement in antioxidant defense responses that can be modulated by ABA and its catabolite ABA-GE, as well as JA. Furthermore, quantification of protein carbonylation was shown to be a useful marker of the relationship between water and oxidative stress, and showed that there was only moderate oxidative stress in C. albidus plants subjected to water stress. After re-watering, plants recovered, although the levels of ABA-GE and antioxidant enzymes still remained higher than in well-watered plants. We expect that our results will provide new data on summer acclimation to drought stress in Mediterranean shrubs.

  15. Computer and video game addiction-a comparison between game users and non-game users.

    Science.gov (United States)

    Weinstein, Aviv Malkiel

    2010-09-01

    Computer game addiction is excessive or compulsive use of computer and video games that may interfere with daily life. It is not clear whether video game playing meets diagnostic criteria for Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV). First objective is to review the literature on computer and video game addiction over the topics of diagnosis, phenomenology, epidemiology, and treatment. Second objective is to describe a brain imaging study measuring dopamine release during computer game playing. Article search of 15 published articles between 2000 and 2009 in Medline and PubMed on computer and video game addiction. Nine abstinent "ecstasy" users and 8 control subjects were scanned at baseline and after performing on a motorbike riding computer game while imaging dopamine release in vivo with [123I] IBZM and single photon emission computed tomography (SPECT). Psycho-physiological mechanisms underlying computer game addiction are mainly stress coping mechanisms, emotional reactions, sensitization, and reward. Computer game playing may lead to long-term changes in the reward circuitry that resemble the effects of substance dependence. The brain imaging study showed that healthy control subjects had reduced dopamine D2 receptor occupancy of 10.5% in the caudate after playing a motorbike riding computer game compared with baseline levels of binding consistent with increased release and binding to its receptors. Ex-chronic "ecstasy" users showed no change in levels of dopamine D2 receptor occupancy after playing this game. This evidence supports the notion that psycho-stimulant users have decreased sensitivity to natural reward. Computer game addicts or gamblers may show reduced dopamine response to stimuli associated with their addiction presumably due to sensitization.

  16. Iterative reconstruction techniques for computed tomography Part 1: Technical principles

    International Nuclear Information System (INIS)

    Willemink, Martin J.; Jong, Pim A. de; Leiner, Tim; Nievelstein, Rutger A.J.; Schilham, Arnold M.R.; Heer, Linda M. de; Budde, Ricardo P.J.

    2013-01-01

    To explain the technical principles of and differences between commercially available iterative reconstruction (IR) algorithms for computed tomography (CT) in non-mathematical terms for radiologists and clinicians. Technical details of the different proprietary IR techniques were distilled from available scientific articles and manufacturers' white papers and were verified by the manufacturers. Clinical results were obtained from a literature search spanning January 2006 to January 2012, including only original research papers concerning IR for CT. IR for CT iteratively reduces noise and artefacts in either image space or raw data, or both. Reported dose reductions ranged from 23% to 76% compared to locally used default filtered back-projection (FBP) settings, with similar noise, artefacts, and subjective and objective image quality. IR has the potential to reduce the radiation dose while preserving image quality. Disadvantages of IR include a blotchy image appearance and longer computational time. Future studies need to address differences between IR algorithms for clinical low-dose CT. • Iterative reconstruction technology for CT is presented in non-mathematical terms. (orig.)
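
    The proprietary algorithms are not public, but the underlying idea of iteratively updating an image so that its forward projection matches the measured raw data can be sketched with a basic Landweber-type loop, as below. This is a didactic toy under assumed names and a small system matrix, not any vendor's reconstruction; commercial IR adds statistical noise models and regularization on top of this basic idea.

```python
import numpy as np

def iterative_reconstruction(A, b, n_iters=50, step=1e-3):
    """Toy algebraic reconstruction: repeat x <- x + step * A^T (b - A x) so that
    the forward projection A x approaches the measured sinogram b (A: system
    matrix). The step size is illustrative and must be small enough to converge."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x += step * A.T @ (b - A @ x)
        np.clip(x, 0.0, None, out=x)   # attenuation values cannot be negative
    return x
```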

  17. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  18. A subtraction scheme for computing QCD jet cross sections at NNLO. Integrating the iterated singly-unresolved subtraction terms

    Energy Technology Data Exchange (ETDEWEB)

    Bolzoni, Paolo [Hamburg Univ. (Germany). II. Inst. fuer Theoretische Physik; Somogyi, Gabor [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Trocsanyi, Zoltan [Debrecen Univ. (Hungary); Hungarian Academy of Sciences, Debrecen (Hungary). Inst. of Nuclear Research

    2010-11-15

    We perform the integration of all iterated singly-unresolved subtraction terms over the two-particle factorized phase space. We also sum over the unresolved parton flavours. The final result can be written as a convolution (in colour space) of the Born cross section and an insertion operator. We spell out the insertion operator in terms of 24 basic integrals that are defined explicitly. We compute the coefficients of the Laurent expansion of these integrals in two different ways, with the method of Mellin-Barnes representations and sector decomposition. Finally, we present the Laurent expansion of the full insertion operator for the specific examples of electron-positron annihilation into two and three jets. (orig.)

  19. A subtraction scheme for computing QCD jet cross sections at NNLO: integrating the iterated singly-unresolved subtraction terms

    Science.gov (United States)

    Bolzoni, Paolo; Somogyi, Gábor; Trócsányi, Zoltán

    2011-01-01

    We perform the integration of all iterated singly-unresolved subtraction terms, as defined in ref. [1], over the two-particle factorized phase space. We also sum over the unresolved parton flavours. The final result can be written as a convolution (in colour space) of the Born cross section and an insertion operator. We spell out the insertion operator in terms of 24 basic integrals that are defined explicitly. We compute the coefficients of the Laurent expansion of these integrals in two different ways, with the method of Mellin-Barnes representations and sector decomposition. Finally, we present the Laurent-expansion of the full insertion operator for the specific examples of electron-positron annihilation into two and three jets.

  20. Brain Single Photon Emission Computed Tomography in Anosmic Subjects After Closed Head Trauma

    Directory of Open Access Journals (Sweden)

    Roozbeh Banan

    2011-01-01

    Full Text Available Anosmia following head trauma is relatively common and in many cases is persistent and irreversible. The ability to objectively measure such a decline in smelling, for both clinical and medicolegal purposes, is very important. The aim of this study was to report the results of brain Single Photon Emission Computed Tomography (SPECT) in anosmic subjects after closed head trauma. This case-control cross-sectional study was conducted in a tertiary referral university hospital. The brain perfusion state of nineteen anosmic patients and thirteen normal controls was evaluated by means of SPECT with 99mTc-ECD infusion, before and after olfactory stimulation. The orbitofrontal lobe of the brain was taken as the region of interest, and changes in perfusion of this area before and after the stimulations were compared between the two groups. The mean brain perfusion in controls before and after the stimulation was 8.26% ± 0.19% and 9.89% ± 0.54%, respectively (P < 0.0001). In the patient group, these quantities were 7.97% ± 1.05% and 8.49% ± 1.5%, respectively (P < 0.004). The differences between all the measures in cases and controls were statistically significant (P < 0.0001). There were no differences in age and sex between the two groups. Brain SPECT is an objective technique suitable for evaluating anosmia following head trauma, and it may be used together with other diagnostic modalities.

  1. Usability of Three Electroencephalogram Headsets for Brain-Computer Interfaces: A Within Subject Comparison

    NARCIS (Netherlands)

    Gamboa, H.; Nijboer, Femke; van de Laar, B.L.A.; Plácido da Silva, H.; Gilleade, K.; Gerritsen, Steven; Nijholt, Antinus; Bermúdez i Badia, S.; Poel, Mannes; Fairclough, S.

    Currently the field of brain–computer interfacing is increasingly focused on developing usable brain–computer interfaces (BCIs) to better ensure technology transfer and acceptance. Many studies have investigated the usability of BCI applications as a whole. Here we aim to investigate one specific

  2. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    Energy Technology Data Exchange (ETDEWEB)

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.
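
    A common approximation for the equivalent rectangular bandwidth is the Glasberg and Moore (1990) formula, which the sketch below uses together with a simple band-level difference as a stand-in for a per-band sensation level. This is only a hedged illustration of the concepts named in the abstract; it does not reproduce the ENAUDIBL computation, and the function names are hypothetical.

```python
def erb_bandwidth_hz(center_freq_hz):
    """Glasberg & Moore (1990) ERB approximation at a given centre frequency."""
    return 24.7 * (4.37 * center_freq_hz / 1000.0 + 1.0)

def band_sensation_level_db(intrusive_level_db, masking_level_db):
    """Crude per-band sensation level: intrusive-sound band level above the
    residual masking noise in the same critical band."""
    return intrusive_level_db - masking_level_db

# Example: a 500 Hz band has an ERB of about 78.7 Hz.
print(round(erb_bandwidth_hz(500.0), 1))
```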

  3. 20 CFR 226.52 - Total annuity subject to maximum.

    Science.gov (United States)

    2010-04-01

    20 CFR Part 226 (Computing Employee, Spouse, and Divorced Spouse Annuities), Railroad Retirement Family Maximum, § 226.52 Total annuity subject to maximum: The total annuity amount which is compared to the maximum monthly amount to...

  4. Optimizing the Usability of Brain-Computer Interfaces.

    Science.gov (United States)

    Zhang, Yin; Chase, Steve M

    2018-03-22

    Brain-computer interfaces are in the process of moving from the laboratory to the clinic. These devices act by reading neural activity and using it to directly control a device, such as a cursor on a computer screen. An open question in the field is how to map neural activity to device movement in order to achieve the most proficient control. This question is complicated by the fact that learning, especially the long-term skill learning that accompanies weeks of practice, can allow subjects to improve performance over time. Typical approaches to this problem attempt to maximize the biomimetic properties of the device in order to limit the need for extensive training. However, it is unclear if this approach would ultimately be superior to performance that might be achieved with a nonbiomimetic device once the subject has engaged in extended practice and learned how to use it. Here we approach this problem using ideas from optimal control theory. Under the assumption that the brain acts as an optimal controller, we present a formal definition of the usability of a device and show that the optimal postlearning mapping can be written as the solution of a constrained optimization problem. We then derive the optimal mappings for particular cases common to most brain-computer interfaces. Our results suggest that the common approach of creating biomimetic interfaces may not be optimal when learning is taken into account. More broadly, our method provides a blueprint for optimal device design in general control-theoretic contexts.

  5. Power Spectral Analysis of Short-Term Heart Rate Variability in Healthy and Arrhythmia Subjects by the Adaptive Continuous Morlet Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ram Sewak SINGH

    2017-12-01

    Full Text Available Power spectral analysis of short-term heart rate variability (HRV) can provide instant, valuable information for understanding the functioning of autonomic control over the cardiovascular system. In this study, an adaptive continuous Morlet wavelet transform (ACMWT) method has been used to describe the time-frequency characteristics of the HRV using band power spectra and the median value of the interquartile range. Adaptation of the method was based on the measurement of maximum energy concentration. The ACMWT was validated on synthetic signals (i.e., stationary, and non-stationary with slowly varying and fast-changing frequency over time), modeled to be closest to dynamic changes in HRV signals. The method was also tested in the presence of additive white Gaussian noise (AWGN) to show its robustness to noise. From the results of testing on synthetic signals, the ACMWT was found to be an enhanced energy concentration estimator for assessing the power spectra of short-term HRV time series compared to the adaptive Stockwell transform (AST), adaptive modified Stockwell transform (AMST), standard continuous Morlet wavelet transform (CMWT) and Stockwell transform (ST) estimators at a statistical significance level of 5%. Further, the ACMWT was applied to real HRV data from the Fantasia and MIT-BIH databases, grouped into healthy young group (HYG), healthy elderly group (HEG), arrhythmia controlled medication group (ARCMG), and supraventricular tachycardia group (SVTG) subjects. The global results demonstrate that the spectral indices of low frequency power (LFp) and high frequency power (HFp) of HRV were decreased in HEG compared to HYG subjects (p<0.0001), while LFp and HFp indices were increased in ARCMG compared to HEG (p<0.00001). The LFp and HFp components of HRV obtained from SVTG were reduced compared to the other groups (p<0.00001).
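
    For readers unfamiliar with wavelet-based HRV spectra, the sketch below computes a plain (non-adaptive) complex Morlet power decomposition of an evenly resampled RR series and averages power in an LF or HF band. It only illustrates the standard CMWT baseline mentioned in the abstract, with assumed parameter names; the adaptive energy-concentration step of the ACMWT is not reproduced.

```python
import numpy as np

def morlet_power(x, fs, freqs, w=6.0):
    """Power of signal x (sampled at fs Hz) at each analysis frequency, using
    complex Morlet wavelets with roughly w cycles."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        sigma = w / (2.0 * np.pi * f)                      # Gaussian width in seconds
        t = np.arange(-4.0 * sigma, 4.0 * sigma, 1.0 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t ** 2 / (2.0 * sigma ** 2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))   # unit-energy normalisation
        power[i] = np.abs(np.convolve(x, wavelet, mode="same")) ** 2
    return power

def band_power(power, freqs, lo, hi):
    """Mean power in a band, e.g. LF (0.04-0.15 Hz) or HF (0.15-0.4 Hz)."""
    mask = (np.asarray(freqs, dtype=float) >= lo) & (np.asarray(freqs, dtype=float) < hi)
    return float(power[mask].mean())
```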

  6. Chemical evolution and the origin of life: cumulative keyword subject index 1970-1986

    Science.gov (United States)

    Roy, A. C.; Powers, J. V.; Rummel, J. D. (Principal Investigator)

    1990-01-01

    This cumulative subject index encompasses the subject indexes of the bibliographies on Chemical Evolution and the Origin of Life that were first published in 1970 and have continued through publication of the 1986 bibliography supplement. Early bibliographies focused on experimental and theoretical material dealing directly with the concepts of chemical evolution and the origin of life, excluding the broader areas of exobiology, biological evolution, and geochemistry. In recent years, these broader subject areas have also been incorporated as they appear in literature searches relating to chemical evolution and the origin of life, although direct attempts have not been made to compile all of the citations in these broad areas. The keyword subject indexes have also undergone an analogous change in scope. Compilers of earlier bibliographies used the most specific term available in producing the subject index. Compilers of recent bibliographies have used a number of broad terms relating to the overall subject content of each citation and specific terms where appropriate. The subject indexes of these 17 bibliographies have, in general, been cumulatively compiled exactly as they originally appeared. However, some changes have been made in an attempt to correct errors, combine terms, and provide more meaningful terms.

  7. TRING: a computer program for calculating radionuclide transport in groundwater

    International Nuclear Information System (INIS)

    Maul, P.R.

    1984-12-01

    The computer program TRING is described which enables the transport of radionuclides in groundwater to be calculated for use in long term radiological assessments using methods described previously. Examples of the areas of application of the program are activity transport in groundwater associated with accidental spillage or leakage of activity, the shutdown of reactors subject to delayed decommissioning, shallow land burial of intermediate level waste and geologic disposal of high level waste. Some examples of the use of the program are given, together with full details to enable users to run the program. (author)

  8. Objective and subjective sleep quality

    DEFF Research Database (Denmark)

    Baandrup, Lone; Glenthøj, Birte Yding; Jennum, Poul Jørgen

    2016-01-01

    and subjective sleep quality during benzodiazepine discontinuation and whether sleep variables were associated with benzodiazepine withdrawal. Eligible patients included adults with a diagnosis of schizophrenia, schizoaffective disorder, or bipolar disorder and long-term use of benzodiazepines in combination...

  9. Fel d 1-derived synthetic peptide immuno-regulatory epitopes show a long-term treatment effect in cat allergic subjects.

    Science.gov (United States)

    Couroux, P; Patel, D; Armstrong, K; Larché, M; Hafner, R P

    2015-05-01

    Cat-PAD, the first in a new class of synthetic peptide immuno-regulatory epitopes (SPIREs), was shown to significantly improve rhinoconjunctivitis symptoms in subjects with cat allergy up to 1 year after the start of a short course of treatment. To evaluate the long-term effects of Cat-PAD on rhinoconjunctivitis symptoms following standardized allergen challenge 2 years after treatment. In a randomized, double-blind, placebo-controlled, parallel group study, subjects were exposed to cat allergen in an environmental exposure chamber (EEC) before and after treatment with two regimens of Cat-PAD (either eight doses of 3 nmol or four doses of 6 nmol) given intradermally over a 3-month period. In this follow-up study, changes from baseline in rhinoconjunctivitis symptoms were reassessed 2 years after the start of treatment. The primary endpoint showed a mean reduction in total rhinoconjunctivitis symptom scores of 3.85 units in the 4 × 6 nmol Cat-PAD group compared to placebo 2 years after the start of treatment (P = 0.13), and this difference was statistically significant in the secondary endpoint at the end of day 4 when the cumulative allergen challenge was greatest (P = 0.02). Consistent reductions in nasal symptoms of between 2 and 3 units were observed for 4 × 6 nmol Cat-PAD compared to placebo between the 2 and 3 h time points on days 1-4 of EEC challenge at 2 years (P Cat-PAD. This study is the first to provide evidence of a long-term therapeutic effect with this new class of SPIREs. © 2015 The Authors. Clinical & Experimental Allergy Published by John Wiley & Sons Ltd.

  10. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand, as limits and caps on usage are imposed. Our trial workflows allow us t...

  11. Complications with computer-aided designed/computer-assisted manufactured titanium and soldered gold bars for mandibular implant-overdentures: short-term observations.

    Science.gov (United States)

    Katsoulis, Joannis; Wälchli, Julia; Kobel, Simone; Gholami, Hadi; Mericske-Stern, Regina

    2015-01-01

    Implant-overdentures supported by rigid bars provide stability in the edentulous atrophic mandible. However, fractures of solder joints and matrices, and loosening of screws and matrices were observed with soldered gold bars (G-bars). Computer-aided designed/computer-assisted manufactured (CAD/CAM) titanium bars (Ti-bars) may reduce technical complications due to enhanced material quality. To compare prosthetic-technical maintenance service of mandibular implant-overdentures supported by CAD/CAM Ti-bar and soldered G-bar. Edentulous patients were consecutively admitted for implant-prosthodontic treatment with a maxillary complete denture and a mandibular implant-overdenture connected to a rigid G-bar or Ti-bar. Maintenance service and problems with the implant-retention device complex and the prosthesis were recorded during minimally 3-4 years. Annual peri-implant crestal bone level changes (ΔBIC) were radiographically assessed. Data of 213 edentulous patients (mean age 68 ± 10 years), who had received a total of 477 tapered implants, were available. Ti-bar and G-bar comprised 101 and 112 patients with 231 and 246 implants, respectively. Ti-bar mostly exhibited distal bar extensions (96%) compared to 34% of G-bar (p overdentures supported by soldered gold bars or milled CAD/CAM Ti-bars are a successful treatment modality but require regular maintenance service. These short-term observations support the hypothesis that CAD/CAM Ti-bars reduce technical complications. Fracture location indicated that the titanium thickness around the screw-access hole should be increased. © 2013 Wiley Periodicals, Inc.

  12. List of U.S. Army Research Institute Research and Technical Publications for Public Release/Unlimited Distribution. Fiscal Year 2007 (October 1, 2006 to September 30, 2007) With Author Index and Report Titles and Subject Terms Index

    Science.gov (United States)

    2008-04-01

    List of U.S. Army Research Institute Research and Technical Publications for Public Release/Unlimited Distribution, Fiscal Year 2007 (October 1, 2006 to September 30, 2007), with author index and report titles and subject terms index. United States Army Research Institute. Contents: Introduction; Author Index.

  13. Survey of computed tomography doses in head and chest protocols

    International Nuclear Information System (INIS)

    Souza, Giordana Salvi de; Silva, Ana Maria Marques da

    2016-01-01

    Computed tomography is a clinical tool for the diagnosis of patients. However, the patient is subjected to a complex dose distribution. The aim of this study was to survey dose indicators in head and chest CT protocols, in terms of Dose-Length Product (DLP) and effective dose, for adult and pediatric patients, comparing them with diagnostic reference levels in the literature. Patients were divided into age groups and the following image acquisition parameters were collected: age, kV, mAs, Volumetric Computed Tomography Dose Index (CTDIvol) and DLP. The effective dose was obtained by multiplying the DLP by correction factors. The results were obtained from the third quartile and showed the importance of determining kV and mAs values for each patient depending on the studied region, age and thickness. (author)
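
    The DLP-to-effective-dose step is a simple multiplication by a region-specific conversion coefficient k. The sketch below uses approximate adult values commonly tabulated in the CT dosimetry literature (roughly 0.0021 mSv per mGy·cm for the head and 0.014 for the chest); the exact factors used in the study, and the pediatric age-specific values, are not given here, so treat these numbers as illustrative assumptions.

```python
# Approximate adult conversion coefficients k in mSv/(mGy*cm); values differ for
# pediatric age groups and depend on the dosimetry reference used.
K_FACTORS = {"head": 0.0021, "chest": 0.014}

def effective_dose_msv(dlp_mgy_cm, region):
    """Effective dose estimated as DLP multiplied by the region-specific k factor."""
    return dlp_mgy_cm * K_FACTORS[region]

# Example: a chest scan with DLP = 400 mGy*cm gives roughly 5.6 mSv.
print(effective_dose_msv(400.0, "chest"))
```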

  14. 3rd International Conference on Computational Mathematics and Computational Geometry

    CERN Document Server

    Ravindran, Anton

    2016-01-01

    This volume presents original research contributed to the 3rd Annual International Conference on Computational Mathematics and Computational Geometry (CMCGS 2014), organized and administered by Global Science and Technology Forum (GSTF). Computational Mathematics and Computational Geometry are closely related subjects, but are often studied by separate communities and published in different venues. This volume is unique in its combination of these topics. After the conference, which took place in Singapore, selected contributions were chosen for this volume and peer-reviewed. The section on Computational Mathematics contains papers that are concerned with developing new and efficient numerical algorithms for mathematical sciences or scientific computing. They also cover analysis of such algorithms to assess accuracy and reliability. The parts of this project that are related to Computational Geometry aim to develop effective and efficient algorithms for geometrical applications such as representation and computati...

  15. Computer Virus Bibliography, 1988-1989.

    Science.gov (United States)

    Bologna, Jack, Comp.

    This bibliography lists 14 books, 154 journal articles, 34 newspaper articles, and 3 research papers published during 1988-1989 on the subject of computer viruses, software protection and 'cures', virus hackers, and other related issues. Some of the sources listed include Computers and Security, Computer Security Digest, PC Week, Time, the New…

  16. Adolescents' Chunking of Computer Programs.

    Science.gov (United States)

    Magliaro, Susan; Burton, John K.

    To investigate what children learn during computer programming instruction, students attending a summer computer camp were asked to recall either single lines or chunks of computer programs from either coherent or scrambled programs. The 16 subjects, ages 12 to 17, were divided into three instructional groups: (1) beginners, who were taught to…

  17. Analysis of problem solving on project based learning with resource based learning approach computer-aided program

    Science.gov (United States)

    Kuncoro, K. S.; Junaedi, I.; Dwijanto

    2018-03-01

    This study aimed to reveal the effectiveness of a computer-aided Project Based Learning program with a Resource Based Learning approach, and analyzed problem-solving abilities in terms of problem-solving steps based on Polya’s stages. The research method was a mixed method with a sequential explanatory design. The subjects of this research were fourth-semester mathematics students. The results showed that the S-TPS (Strong Top Problem Solving) and W-TPS (Weak Top Problem Solving) subjects had good problem-solving abilities on each problem-solving indicator. The problem-solving ability of the S-MPS (Strong Middle Problem Solving) and W-MPS (Weak Middle Problem Solving) subjects on each indicator was also good. The S-BPS (Strong Bottom Problem Solving) subject had difficulty solving the problem with a computer program, was less precise in writing the final conclusion, and could not reflect on the problem-solving process using Polya’s steps. The W-BPS (Weak Bottom Problem Solving) subject failed to meet almost all of the problem-solving indicators and could not precisely construct the initial completion table, so the completion phase following Polya’s steps was constrained.

  18. Gender disparities in the association between epicardial adipose tissue volume and coronary atherosclerosis: A 3-dimensional cardiac computed tomography imaging study in Japanese subjects

    OpenAIRE

    Dagvasumberel Munkhbaatar; Shimabukuro Michio; Nishiuchi Takeshi; Ueno Junji; Takao Shoichiro; Fukuda Daiju; Hirata Yoichiro; Kurobe Hirotsugu; Soeki Takeshi; Iwase Takashi; Kusunose Kenya; Niki Toshiyuki; Yamaguchi Koji; Taketani Yoshio; Yagi Shusuke

    2012-01-01

    Abstract Background Growing evidence suggests that epicardial adipose tissue (EAT) may contribute to the development of coronary artery disease (CAD). In this study, we explored gender disparities in EAT volume (EATV) and its impact on coronary atherosclerosis. Methods The study population consisted of 90 consecutive subjects (age: 63 ± 12 years; men: 47, women: 43) who underwent 256-slice multi-detector computed tomography (MDCT) coronary angiography. EATV was measured as the sum of cross-se...

  19. Integrated computation model of lithium-ion battery subject to nail penetration

    International Nuclear Information System (INIS)

    Liu, Binghe; Yin, Sha; Xu, Jun

    2016-01-01

    Highlights: • A coupled model to predict the battery penetration process is established. • A penetration test is designed and validates the computational model. • Governing factors of penetration-induced short circuit are discussed. • Critical battery safety design guidance is suggested. - Abstract: The nail penetration of lithium-ion batteries (LIBs) has become a standard battery safety evaluation method to mimic the potential penetration of a foreign object into an LIB, which can lead to an internal short circuit with catastrophic consequences, such as thermal runaway, fire, and explosion. To provide a safe, time-efficient, and cost-effective method for studying the nail penetration problem, an integrated computational method that considers the mechanical, electrochemical, and thermal behaviors of the jellyroll was developed using a coupled 3D mechanical model, a 1D battery model, and a short circuit model. The integrated model, along with the sub-models, was validated to agree reasonably well with experimental test data. In addition, a comprehensive quantitative analysis of governing factors, e.g., shapes, sizes, and displacements of nails, states of charge, and penetration speeds, was conducted. The proposed computational framework for LIB nail penetration is first introduced. This framework can provide an accurate prediction of the time history profile of battery voltage, temperature, and mechanical behavior. The factors that affected the behavior of the jellyroll under nail penetration were discussed systematically. Results provide a solid foundation for future in-depth studies on LIB nail penetration mechanisms and safety design.

  20. Accommodation and convergence during sustained computer work.

    Science.gov (United States)

    Collier, Juanita D; Rosenfield, Mark

    2011-07-01

    With computer usage becoming almost universal in contemporary society, the reported prevalence of computer vision syndrome (CVS) is extremely high. However, the precise physiological mechanisms underlying CVS remain unclear. Although abnormal accommodation and vergence responses have been cited as being responsible for the symptoms produced, there is little objective evidence to support this claim. Accordingly, this study measured both of these oculomotor parameters during a sustained period of computer use. Subjects (N = 20) were required to read text aloud from a laptop computer at a viewing distance of 50 cm for a sustained 30-minute period through their habitual refractive correction. At 2-minute intervals, the accommodative response (AR) to the computer screen was measured objectively using a Grand Seiko WAM 5500 optometer (Grand Seiko, Hiroshima, Japan). Additionally, the vergence response was assessed by measuring the associated phoria (AP), i.e., prism to eliminate fixation disparity, using a customized fixation disparity target that appeared on the computer screen. Subjects were asked to rate the degree of difficulty of the reading task on a scale from 1 to 10. Mean accommodation and AP values during the task were 1.07 diopters and 0.74∆ base-in (BI), respectively. The mean discomfort score was 4.9. No significant changes in accommodation or vergence were observed during the course of the 30-minute test period. There was no significant difference in the AR as a function of subjective difficulty. However, the mean AP for the subjects who reported the least and greatest discomfort during the task was 1.55∆ BI and 0, respectively (P = 0.02). CVS after 30 minutes was worse in subjects exhibiting zero fixation disparity than in those having a BI AP, but does not appear to be related to differences in accommodation. A slightly reduced vergence response increases subject comfort during the task. Copyright © 2011 American Optometric

  1. Computation as an Unbounded Process

    Czech Academy of Sciences Publication Activity Database

    van Leeuwen, J.; Wiedermann, Jiří

    2012-01-01

    Roč. 429, 20 April (2012), s. 202-212 ISSN 0304-3975 R&D Projects: GA ČR GAP202/10/1333 Institutional research plan: CEZ:AV0Z10300504 Keywords : arithmetical hierarchy * hypercomputation * mind change complexity * nondeterminism * relativistic computation * unbounded computation Subject RIV: IN - Informatics, Computer Science Impact factor: 0.489, year: 2012

  2. Cartoon computation: quantum-like computing without quantum mechanics

    International Nuclear Information System (INIS)

    Aerts, Diederik; Czachor, Marek

    2007-01-01

    We present a computational framework based on geometric structures. No quantum mechanics is involved, and yet the algorithms perform tasks analogous to quantum computation. Tensor products and entangled states are not needed; they are replaced by sets of basic shapes. To test the formalism we solve in geometric terms the Deutsch-Jozsa problem, historically the first example that demonstrated the potential power of quantum computation. Each step of the algorithm has a clear geometric interpretation and allows for a cartoon representation. (fast track communication)
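    For readers who want a concrete reference point for the Deutsch-Jozsa problem mentioned above, the sketch below is a plain state-vector simulation of the standard quantum algorithm; it is not the geometric, cartoon-style formalism the paper proposes, and the oracle functions are only illustrative.

```python
# Plain state-vector simulation of the Deutsch-Jozsa algorithm (for reference only;
# the paper solves the problem with a geometric, non-quantum formalism instead).
import numpy as np

def deutsch_jozsa(f, n):
    """Return True if f: {0,..,2^n-1} -> {0,1} is judged constant, False if balanced."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    H_all = np.array([[1.0]])
    for _ in range(n + 1):                      # Hadamard on n inputs + 1 ancilla
        H_all = np.kron(H_all, H)
    H_in = np.array([[1.0]])
    for _ in range(n):                          # Hadamard on inputs only
        H_in = np.kron(H_in, H)
    H_in = np.kron(H_in, np.eye(2))

    dim = 2 ** (n + 1)
    U_f = np.zeros((dim, dim))                  # oracle |x>|y> -> |x>|y XOR f(x)>
    for x in range(2 ** n):
        for y in (0, 1):
            U_f[2 * x + (y ^ f(x)), 2 * x + y] = 1.0

    psi = np.zeros(dim)
    psi[1] = 1.0                                # initial state |0...0>|1>
    psi = H_in @ (U_f @ (H_all @ psi))
    p_zero = psi[0] ** 2 + psi[1] ** 2          # prob. input register reads |0...0>
    return p_zero > 0.5

n = 3
print(deutsch_jozsa(lambda x: 1, n))                       # constant -> True
print(deutsch_jozsa(lambda x: bin(x).count("1") % 2, n))   # balanced -> False
```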

  3. Short-term corneal changes with gas-permeable contact lens wear in keratoconus subjects: a comparison of two fitting approaches.

    Science.gov (United States)

    Romero-Jiménez, Miguel; Santodomingo-Rubido, Jacinto; Flores-Rodríguez, Patricia; González-Méijome, Jose-Manuel

    2015-01-01

    To evaluate changes in anterior corneal topography and higher-order aberrations (HOA) after 14-days of rigid gas-permeable (RGP) contact lens (CL) wear in keratoconus subjects comparing two different fitting approaches. Thirty-one keratoconus subjects (50 eyes) without previous history of CL wear were recruited for the study. Subjects were randomly fitted to either an apical-touch or three-point-touch fitting approach. The lens' back optic zone radius (BOZR) was 0.4mm and 0.1mm flatter than the first definite apical clearance lens, respectively. Differences between the baseline and post-CL wear for steepest, flattest and average corneal power (ACP) readings, central corneal astigmatism (CCA), maximum tangential curvature (KTag), anterior corneal surface asphericity, anterior corneal surface HOA and thinnest corneal thickness measured with Pentacam were compared. A statistically significant flattening was found over time on the flattest and steepest simulated keratometry and ACP in apical-touch group (all p<0.01). A statistically significant reduction in KTag was found in both groups after contact lens wear (all p<0.05). Significant reduction was found over time in CCA (p=0.001) and anterior corneal asphericity in both groups (p<0.001). Thickness at the thinnest corneal point increased significantly after CL wear (p<0.0001). Coma-like and total HOA root mean square (RMS) error were significantly reduced following CL wearing in both fitting approaches (all p<0.05). Short-term rigid gas-permeable CL wear flattens the anterior cornea, increases the thinnest corneal thickness and reduces anterior surface HOA in keratoconus subjects. Apical-touch was associated with greater corneal flattening in comparison to three-point-touch lens wear. Copyright © 2014 Spanish General Council of Optometry. Published by Elsevier Espana. All rights reserved.

  4. Transcending Library Catalogs: A Comparative Study of Controlled Terms in Library of Congress Subject Headings and User-Generated Tags in LibraryThing for Transgender Books

    Science.gov (United States)

    Adler, Melissa

    2009-01-01

    Perhaps the greatest power of folksonomies, especially when set against controlled vocabularies like the Library of Congress Subject Headings, lies in their capacity to empower user communities to name their own resources in their own terms. This article analyzes the potential and limitations of both folksonomies and controlled vocabularies for…

  5. Morphological analysis of the proximal femur by computed tomography in Japanese subjects

    International Nuclear Information System (INIS)

    Hagiwara, Masashi

    1995-01-01

    In order to evaluate the morphological features of the proximal femur in the Japanese, 100 femora of normal Japanese subjects (normal group) and 60 femora of 43 Japanese patients with secondary osteoarthrosis of the hip (OA group) were analyzed using CT images. The scans for the dried bones (normal group) were done at a setting of 80 kV and 20 mA, for 2 sec duration. The scans were reconstructed using the soft tissue algorithm built into the GE-9800 scanner. The patient scans (OA group) were done at 120 kV and 170 mA also for 2 sec duration, and reconstructed using the same bone algorithm. The results were as follows: Thinning of the femoral cortex occurred in normal females over 60 years of age. The canal flare index at the proximal part of the femoral diaphysis was negatively correlated with the canal diameter at the isthmus. The index at the upper part was greater than that at the lower part. The two groups showed no statistical difference in this index. In the metaphysis, the canal flare index at the anterior portion was twice that at the posterior portion. In absolute terms, the OA group had a reduced flare or curve along the medial portion. In cross-section, the canal shape of the diaphysis was more elliptical in the OA group than in the normal group. The longitudinal axis of the canal was directed more sagittally in the OA group than in the normal group. (author)

  6. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  7. Parapsychology and the neurosciences: a computer-based content analysis of abstracts in the database "MEDLINE" from 1975 to 1995.

    Science.gov (United States)

    Fassbender, P

    1997-04-01

    A computer-based content analysis of 109 abstracts retrieved by the subject heading "parapsychology" from the database MEDLINE for the years 1975-1995 is presented. Data were analyzed by four categories of terms denoting (1) research methods, (2) neurosciences, (3) humanities/psychodynamics, and (4) parapsychology. Results indicated a growing interest in neuroscientific and neuropsychological explanations and theories.
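    A minimal sketch of this kind of category-based term counting is given below; the category word lists are invented for the example and do not reproduce the coding scheme of the study.

```python
# Toy illustration of category-based term counting in a content analysis;
# the category word lists here are invented for the example.
from collections import Counter
import re

categories = {
    "research_methods": {"randomized", "double-blind", "meta-analysis", "survey"},
    "neurosciences": {"eeg", "cortex", "neuropsychological", "temporal", "lobe"},
    "humanities_psychodynamics": {"psychoanalysis", "unconscious", "belief"},
    "parapsychology": {"psi", "telepathy", "precognition", "ganzfeld"},
}

def categorize(abstract: str) -> Counter:
    tokens = re.findall(r"[a-z\-]+", abstract.lower())
    counts = Counter()
    for cat, terms in categories.items():
        counts[cat] = sum(1 for t in tokens if t in terms)
    return counts

example = "A double-blind ganzfeld study with EEG recordings over the temporal lobe."
print(categorize(example))
```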

  8. Design of a modular digital computer system, DRL 4. [for meeting future requirements of spaceborne computers

    Science.gov (United States)

    1972-01-01

    The design is reported of an advanced modular computer system designated the Automatically Reconfigurable Modular Multiprocessor System, which anticipates requirements for higher computing capacity and reliability for future spaceborne computers. Subjects discussed include: an overview of the architecture, mission analysis, synchronous and nonsynchronous scheduling control, reliability, and data transmission.

  9. Prediction of subjective ratings of emotional pictures by EEG features

    Science.gov (United States)

    McFarland, Dennis J.; Parvaz, Muhammad A.; Sarnacki, William A.; Goldstein, Rita Z.; Wolpaw, Jonathan R.

    2017-02-01

    Objective. Emotion dysregulation is an important aspect of many psychiatric disorders. Brain-computer interface (BCI) technology could be a powerful new approach to facilitating therapeutic self-regulation of emotions. One possible BCI method would be to provide stimulus-specific feedback based on subject-specific electroencephalographic (EEG) responses to emotion-eliciting stimuli. Approach. To assess the feasibility of this approach, we studied the relationships between emotional valence/arousal and three EEG features: amplitude of alpha activity over frontal cortex; amplitude of theta activity over frontal midline cortex; and the late positive potential over central and posterior mid-line areas. For each feature, we evaluated its ability to predict emotional valence/arousal on both an individual and a group basis. Twenty healthy participants (9 men, 11 women; ages 22-68) rated each of 192 pictures from the IAPS collection in terms of valence and arousal twice (96 pictures on each of 4 d over 2 weeks). EEG was collected simultaneously and used to develop models based on canonical correlation to predict subject-specific single-trial ratings. Separate models were evaluated for the three EEG features: frontal alpha activity; frontal midline theta; and the late positive potential. In each case, these features were used to simultaneously predict both the normed ratings and the subject-specific ratings. Main results. Models using each of the three EEG features with data from individual subjects were generally successful at predicting subjective ratings on training data, but generalization to test data was less successful. Sparse models performed better than models without regularization. Significance. The results suggest that the frontal midline theta is a better candidate than frontal alpha activity or the late positive potential for use in a BCI-based paradigm designed to modify emotional reactions.
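    As a rough illustration of the modelling approach described (canonical correlation between EEG features and single-trial ratings), the sketch below runs scikit-learn's CCA on synthetic data; the feature dimensionality and train/test split are assumptions, and nothing here reproduces the study's models.

```python
# Sketch of predicting single-trial valence/arousal ratings from EEG features with
# canonical correlation analysis. Random data stand in for real frontal-theta
# features and IAPS ratings; this does not reproduce the study.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_trials, n_features = 192, 8                        # e.g. band power at 8 frontal sites (assumed)
X = rng.normal(size=(n_trials, n_features))          # EEG features per trial
true_w = rng.normal(size=(n_features, 2))
Y = X @ true_w + rng.normal(scale=2.0, size=(n_trials, 2))   # [valence, arousal]

train, test = slice(0, 150), slice(150, None)
cca = CCA(n_components=2)
cca.fit(X[train], Y[train])
Y_hat = cca.predict(X[test])                         # generalization check on held-out trials

for k, name in enumerate(["valence", "arousal"]):
    r = np.corrcoef(Y_hat[:, k], Y[test][:, k])[0, 1]
    print(f"held-out correlation ({name}): {r:.2f}")
```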

  10. [Long-term outcome analysis of subjective and objective parameters after breast reduction in 159 cases: Patients judge differently from plastic surgeons].

    Science.gov (United States)

    Osinga, Rik; Babst, Doris; Bodmer, Elvira S; Link, Bjoern C; Fritsche, Elmar; Hug, Urs

    2017-12-01

    This work assessed both subjective and objective postoperative parameters after breast reduction surgery and compared between patients and plastic surgeons. After an average postoperative observation period of 6.7 ± 2.7 (2 - 13) years, 159 out of 259 patients (61 %) were examined. The mean age at the time of surgery was 37 ± 14 (15 - 74) years. The postoperative anatomy of the breast and other anthropometric parameters were measured in cm with the patient in an upright position. The visual analogue scale (VAS) values for symmetry, size, shape, type of scar and overall satisfaction both from the patient's and from four plastic surgeons' perspectives were assessed and compared. Patients rated the postoperative result significantly better than surgeons. Good subjective ratings by patients for shape, symmetry and sensitivity correlated with high scores for overall assessment. Shape had the strongest influence on overall satisfaction (regression coefficient 0.357; p reduction surgery, long-term outcome is rated significantly better by patients than by plastic surgeons. Good subjective ratings by patients for shape, symmetry and sensitivity correlated with high scores for overall assessment. Shape had the strongest influence on overall satisfaction, followed by symmetry and sensitivity of the breast. Postoperative size of the breast, resection weight, type of scar, age or BMI was not of significant influence. Symmetry was the only assessed subjective parameter of this study that could be objectified by postoperative measurements. Georg Thieme Verlag KG Stuttgart · New York.

  11. TEACHERS’ COMPUTER SELF-EFFICACY AND THEIR USE OF EDUCATIONAL TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Vehbi TUREL

    2014-10-01

    Full Text Available This study examined the use of educational technology by primary and subject teachers (i.e., secondary and high school teachers) in a small town in the eastern part of Turkey in the spring of 2012. The study examined the primary, secondary and high school teachers' (1) personal and computer-related (demographic) characteristics, (2) computer self-efficacy perceptions, (3) computer-using level in certain software, (4) frequency of computer use for teaching, administrative and communication objectives, and (5) use of educational technology preferences for preparation and teaching purposes. In this study, all primary, secondary and high school teachers in the small town were given the questionnaires to complete. 158 teachers (n=158) completed and returned them. The study was mostly quantitative and partly qualitative. The quantitative results were analysed with SPSS (i.e., mean, Std. Deviation, frequency, percentage, ANOVA). The qualitative data were analysed by examining the participants' responses gathered from the open-ended questions and focussing on the shared themes among the responses. The results reveal that the teachers think that they have good computer self-efficacy perceptions, their level in certain programs is good, and they often use computers for a wide range of purposes. There are also statistical differences between (1) their computer self-efficacy perceptions, (2) frequency of computer use for certain purposes, and (3) computer level in certain programs in terms of different independent variables.

  12. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  13. Validation of a computer modelled forensic facial reconstruction technique using CT data from live subjects: a pilot study.

    Science.gov (United States)

    Short, Laura J; Khambay, Balvinder; Ayoub, Ashraf; Erolin, Caroline; Rynn, Chris; Wilkinson, Caroline

    2014-04-01

    Human forensic facial soft tissue reconstructions are used when post-mortem deterioration makes identification difficult by usual means. The aim is to trigger recognition of the in vivo countenance of the individual by a friend or family member. A further use is in the field of archaeology. There are a number of different methods that can be applied to complete the facial reconstruction, ranging from two dimensional drawings, three dimensional clay models and now, with the advances of three dimensional technology, three dimensional computerised modelling. Studies carried out to assess the accuracy of facial reconstructions have produced variable results over the years. Advances in three dimensional imaging techniques in the field of oral and maxillofacial surgery, particularly cone beam computed tomography (CBCT), now provides an opportunity to utilise the data of live subjects and assess the accuracy of the three dimensional computerised facial reconstruction technique. The aim of this study was to assess the accuracy of a computer modelled facial reconstruction technique using CBCT data from live subjects. This retrospective pilot study was carried out at the Glasgow Dental Hospital Orthodontic Department and the Centre of Anatomy and Human Identification, Dundee University School of Life Sciences. Ten patients (5 male and 5 female; mean age 23 years) with mild skeletal discrepancies with pre-surgical cone beam CT data (CBCT) were included in this study. The actual and forensic reconstruction soft tissues were analysed using 3D software to look at differences between landmarks, linear and angular measurements and surface meshes. There were no statistical differences for 18 out of the 23 linear and 7 out of 8 angular measurements between the reconstruction and the target (p<0.05). The use of Procrustes superimposition has highlighted potential problems with soft tissue depth and anatomical landmarks' position. Surface mesh analysis showed that this virtual

  14. Effect of sibutramine on cardiovascular outcomes in overweight and obese subjects

    DEFF Research Database (Denmark)

    James, W Philip T; Caterson, Ian D; Coutinho, Walmir

    2010-01-01

    The long-term effects of sibutramine treatment on the rates of cardiovascular events and cardiovascular death among subjects at high cardiovascular risk have not been established.

  15. Cone-Beam Computed Tomography Analysis of the Nasopharyngeal Airway in Nonsyndromic Cleft Lip and Palate Subjects.

    Science.gov (United States)

    Al-Fahdawi, Mahmood Abd; Farid, Mary Medhat; El-Fotouh, Mona Abou; El-Kassaby, Marwa Abdelwahab

    2017-03-01

      To assess the nasopharyngeal airway volume, cross-sectional area, and depth in previously repaired nonsyndromic unilateral cleft lip and palate versus bilateral cleft lip and palate patients compared with noncleft controls using cone-beam computed tomography with the ultimate goal of finding whether cleft lip and palate patients are more liable to nasopharyngeal airway obstruction.   A retrospective analysis comparing bilateral cleft lip and palate, unilateral cleft lip and palate, and control subjects. Significance at P ≤ .05.   Cleft Care Center and the outpatient clinic that are both affiliated with our faculty.   Cone-beam computed tomography data were selected of 58 individuals aged 9 to 12 years: 14 with bilateral cleft lip and palate and 20 with unilateral cleft lip and palate as well as 24 age- and gender-matched noncleft controls.   Volume, depth, and cross-sectional area of nasopharyngeal airway were measured.   Patients with bilateral cleft lip and palate showed significantly larger nasopharyngeal airway volume than controls and patients with unilateral cleft lip and palate (P cleft lip and palate showed significantly larger cross-sectional area than those with unilateral cleft lip and palate (P .05). Patients with bilateral cleft lip and palate showed significantly larger depth than controls and those with unilateral cleft lip and palate (P cleft lip and palate showed insignificant nasopharyngeal airway volume, cross-sectional area, and depth compared with controls (P > .05).   Unilateral and bilateral cleft lip and palate patients did not show significantly less volume, cross-sectional area, or depth of nasopharyngeal airway than controls. From the results of this study we conclude that unilateral and bilateral cleft lip and palate patients at the studied age and stage of repaired clefts are not more prone to nasopharyngeal airway obstruction than controls.

  16. Evaluation of selected trace metals in some hypertensive subjects in ...

    African Journals Online (AJOL)

    Both patients and control subjects were classified based on the Seventh Report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure (JNC 7). The weight, height and blood pressure of all subjects were measured and their body mass indices (BMI) computed. The mean ...
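    The body mass index computation mentioned above, together with a rough JNC 7-style blood pressure classification, can be sketched as follows; the category thresholds are the commonly cited JNC 7 cut-offs and should be checked against the report before reuse.

```python
# BMI computation plus a rough JNC 7 blood-pressure classification.
# Thresholds follow commonly cited JNC 7 categories; verify before reuse.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def jnc7_category(systolic: int, diastolic: int) -> str:
    if systolic >= 160 or diastolic >= 100:
        return "stage 2 hypertension"
    if systolic >= 140 or diastolic >= 90:
        return "stage 1 hypertension"
    if systolic >= 120 or diastolic >= 80:
        return "prehypertension"
    return "normal"

print(round(bmi(82, 1.70), 1))          # 28.4
print(jnc7_category(150, 88))           # stage 1 hypertension
```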

  17. Seeing the Wood for the Trees: Enhancing Metadata Subject Elements with Weights

    Directory of Open Access Journals (Sweden)

    Hong Zhang

    2011-06-01

    Full Text Available Subject indexing has been conducted in a dichotomous way in terms of what the information object is primarily about/of or not, corresponding to the presence or absence of a particular subject term, respectively. With more subject terms brought into information systems via social tagging, manual cataloging, or automated indexing, many more partially relevant results can be retrieved. Using examples from digital image collections and online library catalog systems, we explore the problem and advocate for adding a weighting mechanism to subject indexing and tagging to make web search and navigation more effective and efficient. We argue that the weighting of subject terms is more important than ever in today’s world of growing collections, more federated searching, and expansion of social tagging. Such a weighting mechanism needs to be considered and applied not only by indexers, catalogers, and taggers, but also needs to be incorporated into system functionality and metadata schemas.
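    A minimal sketch of the proposed weighting mechanism is shown below: each subject term carries a weight expressing how central it is to the item, and retrieval ranks records by accumulated weight rather than by mere term presence. The records, terms and weights are invented for illustration.

```python
# Minimal sketch of weighted subject terms: ranking by accumulated term weight
# instead of by dichotomous presence/absence. All data are invented examples.
records = {
    "img_001": {"bridges": 0.9, "rivers": 0.6, "sunset": 0.2},
    "img_002": {"sunset": 0.9, "clouds": 0.5},
    "img_003": {"bridges": 0.3, "construction": 0.8},
}

def rank(query_terms):
    scores = {
        rec: sum(weights.get(t, 0.0) for t in query_terms)
        for rec, weights in records.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank({"bridges", "sunset"}))
# img_001 outranks img_003 even though both carry the term "bridges",
# because the term is more central (more heavily weighted) for img_001
```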

  18. Heterogeneous compute in computer vision: OpenCL in OpenCV

    Science.gov (United States)

    Gasparakis, Harris

    2014-02-01

    We explore the relevance of Heterogeneous System Architecture (HSA) in Computer Vision, both as a long term vision, and as a near term emerging reality via the recently ratified OpenCL 2.0 Khronos standard. After a brief review of OpenCL 1.2 and 2.0, including HSA features such as Shared Virtual Memory (SVM) and platform atomics, we identify what genres of Computer Vision workloads stand to benefit by leveraging those features, and we suggest a new mental framework that replaces GPU compute with hybrid HSA APU compute. As a case in point, we discuss, in some detail, popular object recognition algorithms (part-based models), emphasizing the interplay and concurrent collaboration between the GPU and CPU. We conclude by describing how OpenCL has been incorporated in OpenCV, a popular open source computer vision library, emphasizing recent work on the Transparent API, to appear in OpenCV 3.0, which unifies the native CPU and OpenCL execution paths under a single API, allowing the same code to execute either on CPU or on a OpenCL enabled device, without even recompiling.
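    As a small illustration of the Transparent API idea described above, the sketch below uses OpenCV's Python bindings (assuming OpenCV 3.0 or later with an OpenCL runtime available); wrapping the image in a cv2.UMat lets the same calls run on the CPU or be dispatched to an OpenCL device. The file name is a placeholder.

```python
# Sketch of OpenCV's Transparent API: the same calls run on the CPU or, when
# available, on an OpenCL device because the data lives in a cv2.UMat.
# Assumes OpenCV 3.0+ and that "frame.png" exists.
import cv2

print("OpenCL available:", cv2.ocl.haveOpenCL())
cv2.ocl.setUseOpenCL(True)               # allow the T-API to dispatch to OpenCL

img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)   # ordinary NumPy-backed image
u_img = cv2.UMat(img)                    # upload: subsequent calls may run on the GPU

u_blur = cv2.GaussianBlur(u_img, (5, 5), 1.5)
u_edges = cv2.Canny(u_blur, 50, 150)

edges = u_edges.get()                    # download back to a NumPy array
print(edges.shape)
```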

  19. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

    Full Text Available Cloud computing represents a specific form of networking in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs and by the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of a computer program by users. Since cloud computing is a virtualized network, the issue of normal use of a computer program requires putting all aspects of permitted copying into the context of the specific computing environment and the specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act undertaken using the program. In other words, copyright in cloud computing applies at full scale, and so does the freedom of contract (in the case of this particular restriction).

  20. Long-Term Monitoring of Physical Behavior Reveals Different Cardiac Responses to Physical Activity among Subjects with and without Chronic Neck Pain

    Science.gov (United States)

    Hallman, David M.; Mathiassen, Svend Erik; Lyskov, Eugene

    2015-01-01

    Background. We determined the extent to which heart rate variability (HRV) responses to daily physical activity differ between subjects with and without chronic neck pain. Method. Twenty-nine subjects (13 women) with chronic neck pain and 27 age- and gender-matched healthy controls participated. Physical activity (accelerometry), HRV (heart rate monitor), and spatial location (Global Positioning System (GPS)) were recorded for 74 hours. GPS data were combined with a diary to identify periods of work and of leisure at home and elsewhere. Time- and frequency-domain HRV indices were calculated and stratified by period and activity type (lying/sitting, standing, or walking). ANCOVAs with multiple adjustments were used to disclose possible group differences in HRV. Results. The pain group showed a reduced HRV response to physical activity compared with controls (p = .001), according to the sympathetic-baroreceptor HRV index (LF/HF, ratio between low- and high-frequency power), even after adjustment for leisure time physical activity, work stress, sleep quality, mental health, and aerobic capacity (p = .02). The parasympathetic response to physical activity did not differ between groups. Conclusions. Relying on long-term monitoring of physical behavior and heart rate variability, we found an aberrant sympathetic-baroreceptor response to daily physical activity among subjects with chronic neck pain. PMID:26557711
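    The LF/HF index used here as the sympathetic-baroreceptor marker can be sketched roughly as below: resample the RR-interval series evenly, estimate its power spectrum, and integrate the conventional LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands. The RR series is synthetic and the preprocessing is simplified relative to the study.

```python
# Sketch of a frequency-domain LF/HF computation from RR intervals.
# Band limits follow common HRV conventions; the RR series here is synthetic.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid
from scipy.interpolate import interp1d

rng = np.random.default_rng(1)
rr = 0.8 + 0.05 * rng.standard_normal(600)      # RR intervals in seconds (synthetic)
t = np.cumsum(rr)                               # beat times

fs = 4.0                                        # resample the RR series evenly at 4 Hz
t_even = np.arange(t[0], t[-1], 1.0 / fs)
rr_even = interp1d(t, rr, kind="cubic")(t_even)

f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

def band_power(f_lo, f_hi):
    band = (f >= f_lo) & (f < f_hi)
    return trapezoid(pxx[band], f[band])

lf = band_power(0.04, 0.15)                     # low-frequency power
hf = band_power(0.15, 0.40)                     # high-frequency power
print(f"LF/HF ratio: {lf / hf:.2f}")
```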

  1. Long-Term Monitoring of Physical Behavior Reveals Different Cardiac Responses to Physical Activity among Subjects with and without Chronic Neck Pain

    Directory of Open Access Journals (Sweden)

    David M. Hallman

    2015-01-01

    Full Text Available Background. We determined the extent to which heart rate variability (HRV) responses to daily physical activity differ between subjects with and without chronic neck pain. Method. Twenty-nine subjects (13 women) with chronic neck pain and 27 age- and gender-matched healthy controls participated. Physical activity (accelerometry), HRV (heart rate monitor), and spatial location (Global Positioning System (GPS)) were recorded for 74 hours. GPS data were combined with a diary to identify periods of work and of leisure at home and elsewhere. Time- and frequency-domain HRV indices were calculated and stratified by period and activity type (lying/sitting, standing, or walking). ANCOVAs with multiple adjustments were used to disclose possible group differences in HRV. Results. The pain group showed a reduced HRV response to physical activity compared with controls (p=.001), according to the sympathetic-baroreceptor HRV index (LF/HF, ratio between low- and high-frequency power), even after adjustment for leisure time physical activity, work stress, sleep quality, mental health, and aerobic capacity (p=.02). The parasympathetic response to physical activity did not differ between groups. Conclusions. Relying on long-term monitoring of physical behavior and heart rate variability, we found an aberrant sympathetic-baroreceptor response to daily physical activity among subjects with chronic neck pain.

  2. HIGHER EDUCATIONAL INSTITUTION AS A SUBJECT OF ADAPTATION OF RURAL STUDENTS TO THE TERMS OF THE CITY

    Directory of Open Access Journals (Sweden)

    Alyona Aleksandrovna Antipova

    2014-11-01

    Full Text Available The article is devoted to the difficulties of adaptation of rural students to the various spheres of life in the modern city. These difficulties are considered as a field of activity for the higher educational institution, which acts as the subject of adaptation of students coming to study from rural areas to the terms of the city. The authors' point of view on this issue is substantiated by the analysis of data from several sociological surveys conducted in various regions of the Russian Federation. The article also draws on the experience of adaptation assistance at the Mordovia State University named after N. P. Ogarev in the city of Saransk, the largest university in the Republic of Mordovia, which accommodates a large number of rural youth. The relevance and scientific novelty of the research consist in identifying the areas of adaptation support that the higher educational institution can provide to students from rural areas.

  3. Long-term Risedronate Treatment Normalizes Mineralization and Continues to Preserve Trabecular Architecture: Sequential Triple Biopsy Studies with Micro-Computed Tomography

    International Nuclear Information System (INIS)

    Borah, B.; Dufresne, T.; Ritman, E.; Jorgensen, S.; Liu, S.; Chmielewski, P.; Phipps, R.; Zhou, X.; Sibonga, J.; Turner, R.

    2006-01-01

    The objective of the study was to assess the time course of changes in bone mineralization and architecture using sequential triple biopsies from women with postmenopausal osteoporosis (PMO) who received long-term treatment with risedronate. Transiliac biopsies were obtained from the same subjects (n = 7) at baseline and after 3 and 5 years of treatment with 5 mg daily risedronate. Mineralization was measured using 3-dimensional (3D) micro-computed tomography (CT) with synchrotron radiation and was compared to levels in healthy premenopausal women (n = 12). Compared to the untreated PMO women at baseline, the premenopausal women had higher average mineralization (Avg-MIN) and peak mineralization (Peak-MIN) by 5.8% (P = 0.003) and 8.0% (P = 0.003), respectively, and lower ratio of low to high-mineralized bone volume (BMR-V) and surface area (BMR-S) by 73.3% (P = 0.005) and 61.7% (P 0.003), respectively. Relative to baseline, 3 years of risedronate treatment significantly increased Avg-MIN (4.9 ± 1.1%, P = 0.016) and Peak-MIN (6.2 ± 1.5%, P = 0.016), and significantly decreased BMR-V (-68.4 ± 7.3%, P = 0.016) and BMR-S (-50.2 ± 5.7%, P = 0.016) in the PMO women. The changes were maintained at the same level when treatment was continued up to 5 years. These results are consistent with the significant reduction of turnover observed after 3 years of treatment and which was similarly maintained through 5 years of treatment. Risedronate restored the degree of mineralization and the ratios of low- to high-mineralized bone to premenopausal levels after 3 years of treatment, suggesting that treatment reduced bone turnover in PMO women to healthy premenopausal levels. Conventional micro-CT analysis further demonstrated that bone volume (BV/TV) and trabecular architecture did not change from baseline up to 5 years of treatment, suggesting that risedronate provided long-term preservation of trabecular architecture in the PMO women. Overall, risedronate provided sustained

  4. Computational Intelligence Agent-Oriented Modelling

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    2006-01-01

    Roč. 5, č. 2 (2006), s. 430-433 ISSN 1109-2777 R&D Projects: GA MŠk 1M0567 Institutional research plan: CEZ:AV0Z10300504 Keywords : multi-agent systems * adaptive agents * computational intelligence Subject RIV: IN - Informatics, Computer Science

  5. The Problem of Subject Access to Visual Materials

    Directory of Open Access Journals (Sweden)

    Heather P. Jespersen

    2004-09-01

    Full Text Available This article discusses the problem of giving subject access to works of art. We survey both concept-based and content-based access by computers and by indexers/catalogers respectively, as well as issues of interoperability, database and indexer consistency, and cataloging standards. The authors, both of whom are trained art historians, question attempts to mystify fine art subject matter by the creation of clever library science systems that are executed by the naive. Only when trained art historians and knowledgeable catalogers are finally responsible for providing subject access to works of art, will true interoperability and consistency happen.

  6. An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.

    Science.gov (United States)

    Szeto, Grace P; Lee, Raymond

    2002-04-01

    To evaluate and compare the postures and movements of the cervical and upper thoracic spine, the typing performance, and workstation ergonomic factors when using a desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students between ages 20 and 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task by using each of the 3 computers. Measurements during the typing task were taken at set intervals. Cervical and thoracic spines adopted a more flexed posture in using the smaller-sized computers. There were significantly greater neck movements in using desktop computers when compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as the computer size decreased. Typing performance and subjective rating of difficulty in using the keyboards were also significantly different among the 3 types of computers. Computer users need to consider the posture of the spine and potential risk of developing musculoskeletal discomfort in choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation

  7. The Long-Term Consequences of Relationship Formation for Subjective Well-Being

    Science.gov (United States)

    Soons, Judith P. M.; Liefbroer, Aart C.; Kalmijn, Matthijs

    2009-01-01

    This study examines how relationship transitions affect subjective well-being (SWB) and how this effect changes over time. We used prospective data containing information about 18 years of young adults' lives (PSIN, N = 5,514). SWB was measured with the Satisfaction with Life Scale. Within-person multilevel regression analyses showed that dating,…

  8. Youth Homelessness and Individualised Subjectivity

    Science.gov (United States)

    Farrugia, David

    2011-01-01

    This article aims to contribute to understandings of youth homelessness and subjectivity by analysing identity construction in terms of young people's negotiation of the structural and institutional environment of youth homelessness. I suggest that while existing literature on this topic concentrates mainly on micro-social encounters, the…

  9. Pattern of Ocular Diseases among Computer users in Enugu, Nigeria

    African Journals Online (AJOL)

    7 subjects (1.3%) had monocular blindness with VA<3/60. 37 (3.3%) subjects had low vision with VA < 6/18-3/60. Conclusion: Most of the subjects were young people. Ocular disorders were encountered in computer users. Ocular health status of computer users can be improved through periodic ocular examination and ...

  10. 1995 CERN school of computing. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Vandoni, C E [ed.

    1995-10-25

    These proceedings contain a written account of the majority of the lectures given at the 1995 CERN School of Computing. The Scientific Programme was articulated on 8 main themes: Human Computer Interfaces; Collaborative Software Engineering; Information Super Highways; Trends in Computer Architecture/Industry; Parallel Architectures (MPP); Mathematical Computing; Data Acquisition Systems; World-Wide Web for Physics. A number of lectures dealt with general aspects of computing, in particular in the area of Human Computer Interfaces (computer graphics, user interface tools and virtual reality). Applications in HEP of computer graphics (event display) were the subject of two lectures. The main theme of Mathematical Computing covered Mathematica and the usage of statistics packages. The important subject of Data Acquisition Systems was covered by lectures on switching techniques and simulation and modelling tools. A series of lectures dealt with the Information Super Highways and World-Wide Web Technology and its applications to High Energy Physics. Different aspects of Object Oriented Information Engineering Methodology and Object Oriented Programming in HEP were dealt with in detail, also in connection with data acquisition systems. On the theme 'Trends in Computer Architecture and Industry' lectures were given on ATM Switching, FORTRAN90 and High Performance FORTRAN. Computer Parallel Architectures (MPP) lectures dealt with very large scale open systems, the history and future of computer system architecture, the message passing paradigm, and features of PVM and MPI. (orig.).

  11. 1995 CERN school of computing. Proceedings

    International Nuclear Information System (INIS)

    Vandoni, C.E.

    1995-01-01

    These proceedings contain a written account of the majority of the lectures given at the 1995 CERN School of Computing. The Scientific Programme was articulated on 8 main themes: Human Computer Interfaces; Collaborative Software Engineering; Information Super Highways; Trends in Computer Architecture/Industry; Parallel Architectures (MPP); Mathematical Computing; Data Acquisition Systems; World-Wide Web for Physics. A number of lectures dealt with general aspects of computing, in particular in the area of Human Computer Interfaces (computer graphics, user interface tools and virtual reality). Applications in HEP of computer graphics (event display) were the subject of two lectures. The main theme of Mathematical Computing covered Mathematica and the usage of statistics packages. The important subject of Data Acquisition Systems was covered by lectures on switching techniques and simulation and modelling tools. A series of lectures dealt with the Information Super Highways and World-Wide Web Technology and its applications to High Energy Physics. Different aspects of Object Oriented Information Engineering Methodology and Object Oriented Programming in HEP were dealt with in detail, also in connection with data acquisition systems. On the theme 'Trends in Computer Architecture and Industry' lectures were given on ATM Switching, FORTRAN90 and High Performance FORTRAN. Computer Parallel Architectures (MPP) lectures dealt with very large scale open systems, the history and future of computer system architecture, the message passing paradigm, and features of PVM and MPI. (orig.)

  12. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Directory of Open Access Journals (Sweden)

    Jukka-Pekka eKauppi

    2014-01-01

    Full Text Available In the inter-subject correlation (ISC) based analysis of the functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC based analysis techniques among the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically and with an automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox, the implementation of the toolbox and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report the computation time experiments both using a single desktop computer and two grid environments demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  13. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of the functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC based analysis techniques among the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically and with an automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox, the implementation of the toolbox and demonstrate the possible use of the toolbox by summarizing selected example applications. We also report the computation time experiments both using a single desktop computer and two grid environments demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
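    The core ISC computation the toolbox builds on can be sketched in a few lines: for every voxel, average the pairwise Pearson correlations of the subjects' time series. The sketch below uses synthetic data; the toolbox itself adds the statistical inference, time-window and phase-synchronization analyses, and cluster parallelization.

```python
# Minimal sketch of the core ISC computation: average pairwise Pearson correlation
# of the subjects' time series at every voxel. Synthetic data for illustration.
import numpy as np

def isc(data):
    """data: array of shape (n_subjects, n_timepoints, n_voxels) -> ISC per voxel."""
    n_subj = data.shape[0]
    # z-score each subject's time series voxel-wise
    z = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
    pair_corrs = []
    for i in range(n_subj):
        for j in range(i + 1, n_subj):
            pair_corrs.append((z[i] * z[j]).mean(axis=0))   # Pearson r per voxel
    return np.mean(pair_corrs, axis=0)

rng = np.random.default_rng(0)
shared = rng.standard_normal((200, 50))                     # stimulus-driven signal
data = shared + rng.standard_normal((10, 200, 50))          # 10 subjects + noise
print(isc(data).mean())                                     # clearly above zero
```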

  14. The computational form of craving is a selective multiplication of economic value.

    Science.gov (United States)

    Konova, Anna B; Louie, Kenway; Glimcher, Paul W

    2018-04-17

    Craving is thought to be a specific desire state that biases choice toward the desired object, be it chocolate or drugs. A vast majority of people report having experienced craving of some kind. In its pathological form craving contributes to health outcomes in addiction and obesity. Yet despite its ubiquity and clinical relevance we still lack a basic neurocomputational understanding of craving. Here, using an instantaneous measure of subjective valuation and selective cue exposure, we identify a behavioral signature of a food craving-like state and advance a computational framework for understanding how this state might transform valuation to bias choice. We find desire induced by exposure to a specific high-calorie, high-fat/sugar snack good is expressed in subjects' momentary willingness to pay for this good. This effect is selective but not exclusive to the exposed good; rather, we find it generalizes to nonexposed goods in proportion to their subjective attribute similarity to the exposed ones. A second manipulation of reward size (number of snack units available for purchase) further suggested that a multiplicative gain mechanism supports the transformation of valuation during laboratory craving. These findings help explain how real-world food craving can result in behaviors inconsistent with preferences expressed in the absence of craving and open a path for the computational modeling of craving-like phenomena using a simple and repeatable experimental tool for assessing subjective states in economic terms. Copyright © 2018 the Author(s). Published by PNAS.
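    A toy rendering of the multiplicative-gain account described above: exposure multiplies the willingness to pay for the craved good, and the boost spills over to other goods in proportion to their attribute similarity to it. All numbers are invented; this is not the fitted model from the paper.

```python
# Toy sketch of craving as a multiplicative gain on value that generalizes by
# attribute similarity. Numbers are invented for illustration only.
baseline_wtp = {"exposed snack": 2.00, "similar snack": 1.50, "healthy fruit": 1.20}
similarity = {"exposed snack": 1.00, "similar snack": 0.70, "healthy fruit": 0.10}

gain = 0.5   # craving-induced multiplicative gain (assumed)

craved_wtp = {
    good: wtp * (1.0 + gain * similarity[good])
    for good, wtp in baseline_wtp.items()
}
print(craved_wtp)
# the exposed snack gets the largest boost; dissimilar goods are nearly unchanged
```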

  15. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial confinement of a plasma and of charged particle beam propagation is described. Typical uses of computers for simulation and control of laboratory and cosmic experiments with a plasma, and for data accumulation in these experiments, are considered. Basic computational methods applied in plasma physics are discussed. Future trends of computer utilization in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the need for more powerful computer applications.

  16. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
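    The view of randomness as statistical inference can be sketched as a log-likelihood ratio: the evidence that a sequence was produced by a fair random process rather than by a simple "regular" process, here modelled (as an assumption of the example, not the paper's model) by a repetition-biased Markov chain.

```python
# Sketch of subjective randomness as statistical inference: evidence for a fair
# random generator versus a simple repetition-biased "regular" generator (assumed).
import math

def log_lik_random(seq):
    return len(seq) * math.log(0.5)            # every symbol equally likely

def log_lik_regular(seq, p_repeat=0.8):
    ll = math.log(0.5)                         # first symbol
    for prev, cur in zip(seq, seq[1:]):
        ll += math.log(p_repeat if cur == prev else 1.0 - p_repeat)
    return ll

def randomness(seq):
    """Positive values: looks more like the output of a random process."""
    return log_lik_random(seq) - log_lik_regular(seq)

print(randomness("HHHHHHHH"))   # strongly negative: eight heads look non-random
print(randomness("HTTHTHHT"))   # positive: irregular alternation looks random
```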

  17. Computing for particle physics. Report of the HEPAP subpanel on computer needs for the next decade

    International Nuclear Information System (INIS)

    1985-08-01

    The increasing importance of computation to the future progress in high energy physics is documented. Experimental computing demands are analyzed for the near future (four to ten years). The computer industry's plans for the near term and long term are surveyed as they relate to the solution of high energy physics computing problems. This survey includes large processors and the future role of alternatives to commercial mainframes. The needs for low speed and high speed networking are assessed, and the need for an integrated network for high energy physics is evaluated. Software requirements are analyzed. The role to be played by multiple processor systems is examined. The computing needs associated with elementary particle theory are briefly summarized. Computing needs associated with the Superconducting Super Collider are analyzed. Recommendations are offered for expanding computing capabilities in high energy physics and for networking between the laboratories

  18. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand' as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
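    A back-of-the-envelope version of the cost comparison alluded to can be sketched as below; all prices, lifetimes and capacities are invented placeholders rather than figures from the paper.

```python
# Toy comparison of on-demand cloud CPU-hours versus an amortized dedicated node.
# All prices, lifetimes and capacities are invented placeholders.
def cloud_cost(cpu_hours, hourly_rate=0.10):
    return cpu_hours * hourly_rate            # pay only for what is used

def dedicated_cost_per_year(node_price=3000.0, lifetime_years=3, ops_per_year=500.0):
    return node_price / lifetime_years + ops_per_year   # fixed, whatever the load

capacity_hours_per_year = 8 * 8760            # an 8-core node, always on (assumed)
for utilization in (0.1, 0.5, 0.9):
    used = utilization * capacity_hours_per_year
    print(f"utilization {utilization:.0%}: cloud ${cloud_cost(used):,.0f} "
          f"vs dedicated ${dedicated_cost_per_year():,.0f} per year")
# at high, steady utilization the dedicated node wins; for occasional bursts the cloud does
```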

  19. Computer Programming Education with Miranda

    NARCIS (Netherlands)

    Joosten, S.M.M.; van den Berg, Klaas

    During the past four years, an experiment has been carried out with an introductory course in computer programming, based on functional programming. This article describes the background of this approach, the aim of the computer programming course, the outline and subject matter of the course parts

  20. The principles of computer hardware

    CERN Document Server

    Clements, Alan

    2000-01-01

    Principles of Computer Hardware, now in its third edition, provides a first course in computer architecture or computer organization for undergraduates. The book covers the core topics of such a course, including Boolean algebra and logic design; number bases and binary arithmetic; the CPU; assembly language; memory systems; and input/output methods and devices. It then goes on to cover the related topics of computer peripherals such as printers; the hardware aspects of the operating system; and data communications, and hence provides a broader overview of the subject. Its readable, tutorial-based approach makes it an accessible introduction to the subject. The book has extensive in-depth coverage of two microprocessors, one of which (the 68000) is widely used in education. All chapters in the new edition have been updated. Major updates include: powerful software simulations of digital systems to accompany the chapters on digital design; a tutorial-based introduction to assembly language, including many exam...

  1. Text and Subject Position after Althusser

    Directory of Open Access Journals (Sweden)

    Antony Easthope

    1994-01-01

    Full Text Available Althusser's achievement is that he redefined Marxism. He reconceptualizes history and totality in terms of different times, construes knowledge as the outcome of a process of construction, and interprets subjectivity as an effect of ideology and unconscious processes. Unfortunately, Althusser's functionalist view of ideology claims that the subject recognizes itself as a subject because it duplicates— reflects—an absolute subject. However, Lacan's notion of the mirror stage remedies this fault. Lacan's subject always misrecognizes itself in a process of contradiction that threatens the stability of any given social order. Moreover, unlike Foucault's subject, which is limited in that subjectivity is folded back into a vaguely expanded notion of "power," this revised Althusserian subject allows careful reading of texts. The critic does not simply read against the grain; he or she exposes the multiple points of identification offered the reader. For example, Wordsworth's "The Solitary Reaper" installs the reader in multiple positions: a devotee of high culture and the national canon, a lover of the verbal signifier and its play, a consumer of confessional discourse, and a masculine "I" desiring a laboring, singing woman.

  2. Average bit error probability of binary coherent signaling over generalized fading channels subject to additive generalized gaussian noise

    KAUST Repository

    Soury, Hamza

    2012-06-01

    This letter considers the average bit error probability of binary coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed form expression in terms of the Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading and Nakagami-m fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer based simulations for a variety of fading and additive noise parameters. © 2012 IEEE.
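    The computer-based simulations mentioned can be approximated by a simple Monte Carlo experiment: BPSK over Nakagami-m fading with additive Gaussian or Laplacian noise. The sketch below only estimates the error rate empirically; it does not evaluate the paper's closed-form Fox H-function expression, and the SNR convention is an assumption of the example.

```python
# Monte Carlo bit error rate for BPSK over Nakagami-m fading with additive
# Gaussian or Laplacian noise (empirical check only, not the closed-form result).
import numpy as np

def ber_bpsk(snr_db, m=2.0, noise="gaussian", n=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n)
    s = 2 * bits - 1                                         # antipodal symbols +/-1
    h = np.sqrt(rng.gamma(shape=m, scale=1.0 / m, size=n))   # Nakagami-m amplitude, E[h^2]=1
    sigma2 = 1.0 / (2 * snr)                                 # noise variance for unit symbol energy
    if noise == "gaussian":
        w = rng.normal(0.0, np.sqrt(sigma2), n)
    else:                                                    # Laplacian with the same variance
        w = rng.laplace(0.0, np.sqrt(sigma2 / 2), n)
    r = h * s + w
    return np.mean((r > 0).astype(int) != bits)

for snr_db in (0, 5, 10, 15):
    print(snr_db, ber_bpsk(snr_db), ber_bpsk(snr_db, noise="laplacian"))
```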

  3. Non-Causal Computation

    Directory of Open Access Journals (Sweden)

    Ämin Baumeler

    2017-07-01

    Full Text Available Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.

  4. Abstract quantum computing machines and quantum computational logics

    Science.gov (United States)

    Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto

    2016-06-01

    Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.

  5. The effect of subject measurement error on joint kinematics in the conventional gait model: Insights from the open-source pyCGM tool using high performance computing methods.

    Science.gov (United States)

    Schwartz, Mathew; Dixon, Philippe C

    2018-01-01

    The conventional gait model (CGM) is a widely used biomechanical model which has been validated over many years. The CGM relies on retro-reflective markers placed along anatomical landmarks, a static calibration pose, and subject measurements as inputs for joint angle calculations. While past literature has shown the possible errors caused by improper marker placement, studies on the effects of inaccurate subject measurements are lacking. Moreover, as many laboratories rely on the commercial version of the CGM, released as the Plug-in Gait (Vicon Motion Systems Ltd, Oxford, UK), integrating improvements into the CGM code is not easily accomplished. This paper introduces a Python implementation for the CGM, referred to as pyCGM, which is an open-source, easily modifiable, cross-platform, and high performance computational implementation. The aims of pyCGM are to (1) reproduce joint kinematic outputs from the Vicon CGM and (2) be implemented in a parallel approach to allow integration on a high performance computer. The aims of this paper are to (1) demonstrate that pyCGM can systematically and efficiently examine the effect of subject measurements on joint angles and (2) be updated to include new calculation methods suggested in the literature. The results show that the calculated joint angles from pyCGM agree with Vicon CGM outputs, with a maximum lower body joint angle difference of less than 10^-5 degrees. Through the hierarchical system, the ankle joint is the most vulnerable to subject measurement error. Leg length has the greatest effect on all joints as a percentage of measurement error. When compared to the errors previously found through inter-laboratory measurements, the impact of subject measurements is minimal, and researchers should rather focus on marker placement. Finally, we showed that code modifications can be performed to include improved hip, knee, and ankle joint centre estimations suggested in the existing literature. The pyCGM code is provided
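
    The subject-measurement sensitivity analysis described above can be sketched generically as a parameter sweep. The snippet below does not use the real pyCGM API; compute_joint_angles is a hypothetical stand-in for any conventional-gait-model solver, and the perturbation range is an assumption.

        # Generic sensitivity sweep over a subject measurement (e.g. leg length),
        # in the spirit of the study above. `compute_joint_angles` is a hypothetical
        # placeholder for a conventional-gait-model implementation, not pyCGM's API.
        import numpy as np

        def compute_joint_angles(markers, measurements):
            """Hypothetical CGM solver: returns a dict of joint-angle arrays (degrees)."""
            raise NotImplementedError("replace with an actual CGM implementation")

        def sensitivity_sweep(markers, measurements, key, rel_errors):
            """Recompute joint angles while perturbing one subject measurement."""
            baseline = compute_joint_angles(markers, measurements)
            results = {}
            for rel in rel_errors:
                perturbed = dict(measurements)
                perturbed[key] = measurements[key] * (1.0 + rel)
                angles = compute_joint_angles(markers, perturbed)
                # Report the largest absolute deviation from baseline per joint.
                results[rel] = {j: np.max(np.abs(angles[j] - baseline[j])) for j in angles}
            return results

        # Example call (hypothetical measurement name, +/- 1%, 5%, 10% error):
        # sensitivity_sweep(markers, measurements, "LeftLegLength",
        #                   rel_errors=[-0.10, -0.05, -0.01, 0.01, 0.05, 0.10])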

  6. Personal values, subjective well-being and destination-loyalty intention of international students.

    Science.gov (United States)

    Jamaludin, N L; Sam, D L; Sandal, G M; Adam, A A

    2016-01-01

    What are the factors that predict international students' destination-loyalty intention? This is the main question this paper addresses, using an online survey among 396 international students (short-term, N = 182; long-term, N = 214) at a Norwegian university. Structural equation modeling (AMOS) was conducted to examine relationships among personal values, subjective well-being and destination-loyalty intentions. The results showed that: (1) universalism was positively related to subjective well-being for short-term students; and (2) subjective well-being was positively related to destination-loyalty intention for all groups. We found that relatively stable and happy individuals might be important for ensuring destination-loyalty intentions. Results also indicated that personal values that emphasize justice and equity are important for short-term international students' well-being.

  7. Experiential Learning of Electronics Subject Matter in Middle School Robotics Courses

    Science.gov (United States)

    Rihtaršic, David; Avsec, Stanislav; Kocijancic, Slavko

    2016-01-01

    The purpose of this paper is to investigate whether the experiential learning of electronics subject matter is effective in the middle school open learning of robotics. Electronics is often ignored in robotics courses. Since robotics courses are typically comprised of computer-related subjects, and mechanical and electrical engineering, these…

  8. The Measurement of Relevance Amount of Documents That By Using of Google cross-language retrieval About Agriculture Subject Area are Retrieved

    Directory of Open Access Journals (Sweden)

    Fatemeh Jamshidi Ghahfarokhi

    2014-02-01

    Full Text Available In this study, the relevance of documents retrieved with Google's cross-language retrieval tools was investigated for the agriculture subject area. For this purpose, Persian journal articles with English abstracts were used to extract Persian phrases and subject terms together with their English equivalents. Thirty phrases and subject terms from the agriculture area were extracted in three classes: first, subject phrases that are used only in agriculture; second, agriculture subject terms that are also used in other fields; and third, agriculture subject terms that are regarded as general-language terms outside this field. Documents were then searched with these phrases and terms, and the relevance of the search results was examined. The results showed that Google's cross-language retrieval tools do not succeed in the cross-language retrieval of relevant documents in the agriculture subject area for two of the classes: agriculture subject terms that are also used in other fields, and agriculture subject terms that are regarded as general-language terms outside agriculture. For subject phrases and terms used only in the agriculture field, the performance of Google's cross-language retrieval tools is rather more satisfactory.

  9. Circulating sclerostin is elevated in short-term and reduced in long-term SCI.

    Science.gov (United States)

    Battaglino, Ricardo A; Sudhakar, Supreetha; Lazzari, Antonio A; Garshick, Eric; Zafonte, Ross; Morse, Leslie R

    2012-09-01

    Spinal cord injury (SCI) causes profound bone loss due to muscle paralysis resulting in the inability to walk. Sclerostin, a Wnt signaling pathway antagonist produced by osteocytes, is a potent inhibitor of bone formation. Short-term studies in rodent models have demonstrated increased sclerostin in response to mechanical unloading that is reversed with reloading. Although sclerostin inhibition has been proposed as a potential therapy for bone loss, it is not known if sclerostin levels vary with duration of SCI in humans. We analyzed circulating sclerostin in 155 men with varying degrees of SCI who were 1 year or more post-injury. We report that sclerostin levels are greatest in subjects with short-term SCI (≤5 years post-injury) and decrease significantly over the first 5 years post-injury. There was no association between sclerostin and injury duration in subjects with long-term SCI (>5 years post-injury). In subjects with long-term SCI, sclerostin levels were positively associated with lower extremity bone density and bone mineral content. These data suggest that sclerostin levels are initially increased after SCI in response to mechanical unloading. This response is time-limited and as bone loss progresses, circulating sclerostin is lowest in subjects with severe osteoporosis. These findings support a dual role for sclerostin after SCI: a therapeutic target in acute SCI, and a biomarker of osteoporosis severity in chronic SCI. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a standard way to evaluate and predict the behavior of structural assemblies subjected to severe conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in such conditions: besides corrosive and thermal aggression, it is exposed to long-term irradiation, with implicit consequences for the evolution of material properties. This inevitably leads to scatter in the time-dependent material properties, whose dynamic evolution is subject to a high degree of uncertainty. These are the reasons for developing, alongside deterministic evaluations with computer codes, probabilistic and statistical methods for predicting the structural component response. This work opens the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting from the deterministic analysis performed with the CANTUP computer code, a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose, the structure of the deterministic CANTUP computer code has been reviewed. The code has been adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a module was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation. All the values of these properties obtained for all the values for
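
    The probabilistic extension described here amounts to sampling a material property and running the deterministic model for each sample. The sketch below is a minimal illustration, not the CANTUP code: a simply supported beam deflection formula stands in for the deterministic solver, and the load, span and moment of inertia are assumed values; only the 5% standard deviation on Young's modulus is taken from the record.

        # Minimal illustration of the probabilistic approach described above:
        # sample Young's modulus E ~ N(E0, 0.05*E0) and propagate each sample
        # through a deterministic model. A simply supported beam with a central
        # point load stands in for the real pressure-tube model (assumption).
        import numpy as np

        def deflection(E, P=1.0e4, L=6.0, I=2.0e-5):
            """Midspan deflection of a simply supported beam: delta = P*L**3 / (48*E*I)."""
            return P * L**3 / (48.0 * E * I)

        def monte_carlo_deflection(E0=2.0e11, rel_std=0.05, n=10_000, seed=1):
            rng = np.random.default_rng(seed)
            E_samples = rng.normal(loc=E0, scale=rel_std * E0, size=n)
            d = deflection(E_samples)
            return d.mean(), d.std()

        mean_d, std_d = monte_carlo_deflection()
        print(f"deflection: mean = {mean_d:.4e} m, std = {std_d:.4e} m")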

  11. Social Play at the Computer: Preschoolers Scaffold and Support Peers' Computer Competence.

    Science.gov (United States)

    Freeman, Nancy K.; Somerindyke, Jennifer

    2001-01-01

    Describes preschoolers' collaboration during free play in a computer lab, focusing on the computer's contribution to active, peer-mediated learning. Discusses these observations in terms of Parten's insights on children's social play and Vygotsky's socio-cultural learning theory, noting that the children scaffolded each other's growing computer…

  12. Computer-based visual communication in aphasia.

    Science.gov (United States)

    Steele, R D; Weinrich, M; Wertz, R T; Kleczewska, M K; Carlson, G S

    1989-01-01

    The authors describe their recently developed Computer-aided VIsual Communication (C-VIC) system, and report results of single-subject experimental designs probing its use with five chronic, severely impaired aphasic individuals. Studies replicate earlier results obtained with a non-computerized system, demonstrate patient competence with the computer implementation, extend the system's utility, and identify promising areas of application. Results of the single-subject experimental designs clarify patients' learning, generalization, and retention patterns, and highlight areas of performance difficulties. Future directions for the project are indicated.

  13. Tiagabine improves hippocampal long-term depression in rat pups subjected to prenatal inflammation.

    Directory of Open Access Journals (Sweden)

    Aline Rideau Batista Novais

    Full Text Available Maternal inflammation during pregnancy is associated with the later development of cognitive and behavioral impairment in the offspring, reminiscent of the traits of schizophrenia or autism spectrum disorders. Hippocampal long-term potentiation and long-term depression of glutamatergic synapses are respectively involved in memory formation and consolidation. In male rats, maternal inflammation with lipopolysaccharide (LPS) led to a premature loss of long-term depression, occurring between 12 and 25 postnatal days instead of after the first postnatal month, and aberrant occurrence of long-term potentiation. We hypothesized this would be related to GABAergic system impairment. Sprague Dawley rats received either LPS or isotonic saline i.p. on gestational day 19. Male offspring's hippocampus was studied between 12 and 25 postnatal days. Morphological and functional analyses demonstrated that prenatal LPS triggered a deficit of hippocampal GABAergic interneurons, associated with presynaptic GABAergic transmission deficiency in male offspring. Increasing ambient GABA by impairing GABA reuptake with tiagabine did not interact with the low frequency-induced long-term depression in control animals but fully prevented its impairment in male offspring of LPS-challenged dams. Tiagabine furthermore prevented the aberrant occurrence of paired-pulse triggered long-term potentiation in these rats. Deficiency in GABA seems to be central to the dysregulation of synaptic plasticity observed in juvenile in utero LPS-challenged rats. Modulating GABAergic tone may be a possible therapeutic strategy at this developmental stage.

  14. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  15. Memory and selective attention in multiple sclerosis: cross-sectional computer-based assessment in a large outpatient sample.

    Science.gov (United States)

    Adler, Georg; Lembach, Yvonne

    2015-08-01

    Cognitive impairments may have a severe impact on everyday functioning and quality of life of patients with multiple sclerosis (MS). However, there are some methodological problems in the assessment and only a few studies allow a representative estimate of the prevalence and severity of cognitive impairments in MS patients. We applied a computer-based method, the memory and attention test (MAT), in 531 outpatients with MS, who were assessed at nine neurological practices or specialized outpatient clinics. The findings were compared with those obtained in an age-, sex- and education-matched control group of 84 healthy subjects. Episodic short-term memory was substantially decreased in the MS patients. About 20% of them scored more than two standard deviations below the mean of the control group. The episodic short-term memory score was negatively correlated with the EDSS score. Minor but also significant impairments in the MS patients were found for verbal short-term memory, episodic working memory and selective attention. The computer-based MAT was found to be useful for a routine assessment of cognition in MS outpatients.

  16. Long-term associative learning predicts verbal short-term memory performance.

    Science.gov (United States)

    Jones, Gary; Macken, Bill

    2018-02-01

    Studies using tests such as digit span and nonword repetition have implicated short-term memory across a range of developmental domains. Such tests ostensibly assess specialized processes for the short-term manipulation and maintenance of information that are often argued to enable long-term learning. However, there is considerable evidence for an influence of long-term linguistic learning on performance in short-term memory tasks that brings into question the role of a specialized short-term memory system separate from long-term knowledge. Using natural language corpora, we show experimentally and computationally that performance on three widely used measures of short-term memory (digit span, nonword repetition, and sentence recall) can be predicted from simple associative learning operating on the linguistic environment to which a typical child may have been exposed. The findings support the broad view that short-term verbal memory performance reflects the application of long-term language knowledge to the experimental setting.

  17. Computers in engineering. 1988

    International Nuclear Information System (INIS)

    Tipnis, V.A.; Patton, E.M.

    1988-01-01

    These proceedings discuss the following subjects: knowledge-based systems; computers in design; uses of artificial intelligence; engineering optimization and expert systems for accelerators; and parallel processing in design

  18. Mathematical challenges from theoretical/computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.

  19. Energy data base: subject thesaurus

    International Nuclear Information System (INIS)

    Redford, J.S.

    1981-10-01

    The technical staff of the DOE Technical Information Center, during its subject indexing activities, develops and structures a vocabulary that allows consistent machine storage and retrieval of information necessary to the accomplishment of the DOE mission. This thesaurus incorporates that structured vocabulary. The terminology of this thesaurus is used for the subject control of information announced in DOE Energy Research Abstracts, Energy Abstracts for Policy Analysis, and various update journals and bulletins in specialized areas. This terminology also facilitates subject searching of the DOE Energy Data Base on the DOE/RECON on-line retrieval system and on other commercial retrieval systems. The rapid expansion of the DOE's activities will result in a concomitant thesaurus expansion as information relating to new activities is indexed. Only the terms used in the indexing of documents at the Technical Information Center to date are included

  20. INFLUENCE OF SUBJECT AND OBJECTIVES FACTORS OF INFORMATIZATION ON THE PEDAGOGICAL DESIGN EFFICIENCY

    Directory of Open Access Journals (Sweden)

    Tamara O. Pushkareva

    2018-02-01

    Full Text Available In this article, a preliminary terminological analysis of the conceptual apparatus of computer-oriented informatization of pedagogical design of educational activity within general secondary education is carried out. The influence of the subject and object factors of the informatization of the educational process on the efficiency of pedagogical design is investigated. The existence of a dual problem in the educational sphere is established, related both to the forming of an effective informational and educational product and to the technologically acceptable component of well-organized computer-oriented support for students. An analysis is carried out of the priorities in the perception of different kinds of information by the subjects of the educational system, and it is found that in the perception of computer information by subjects of different status or age there are differences related to their individual needs, desires and inclinations. The means of informative support of the educational process are considered, and recommendations on their use are given.

  1. Computation for LHC experiments: a worldwide computing grid

    International Nuclear Information System (INIS)

    Fairouz, Malek

    2010-01-01

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. The processing of all the consequent experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 octets per second and a recording capacity of a few tens of 10^15 octets each year. In order to meet this challenge, a computing network involving the dispatching and sharing of tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of 4 tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching it to the 11 Tier 1 centres. A Tier 1 centre is typically a national centre; it is responsible for making a copy of the raw data and for processing it in order to recover relevant data with a physical meaning, and for transferring the results to the 150 Tier 2 centres. A Tier 2 centre is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of simulations. Tier 3 sites are at the level of the laboratories; they provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)

  2. Change in brain activity through virtual reality-based brain-machine communication in a chronic tetraplegic subject with muscular dystrophy.

    Science.gov (United States)

    Hashimoto, Yasunari; Ushiba, Junichi; Kimura, Akio; Liu, Meigen; Tomita, Yutaka

    2010-09-16

    For severely paralyzed people, a brain-computer interface (BCI) provides a way of re-establishing communication. Although subjects with muscular dystrophy (MD) appear to be potential BCI users, the actual long-term effects of BCI use on brain activities in MD subjects have yet to be clarified. To investigate these effects, we followed BCI use by a chronic tetraplegic subject with MD over 5 months. The topographic changes in an electroencephalogram (EEG) after long-term use of the virtual reality (VR)-based BCI were also assessed. Our originally developed BCI system was used to classify an EEG recorded over the sensorimotor cortex in real time and estimate the user's motor intention (MI) in 3 different limb movements: feet, left hand, and right hand. An avatar in the internet-based VR was controlled in accordance with the results of the EEG classification by the BCI. The subject was trained to control his avatar via the BCI by strolling in the VR for 1 hour a day and then continued the same training twice a month at his home. After the training, the error rate of the EEG classification decreased from 40% to 28%. The subject successfully walked around in the VR using only his MI and chatted with other users through a voice-chat function embedded in the internet-based VR. With this improvement in BCI control, event-related desynchronization (ERD) following MI was significantly enhanced (p < 0.01) for feet MI (from -29% to -55%), left-hand MI (from -23% to -42%), and right-hand MI (from -22% to -51%). These results show that our subject with severe MD was able to learn to control his EEG signal and communicate with other users through use of VR navigation and suggest that an internet-based VR has the potential to provide paralyzed people with the opportunity for easy communication.
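
    The ERD values quoted above follow the usual band-power definition, ERD% = (P_task - P_baseline) / P_baseline × 100, so -55% means a 55% drop in power during motor imagery. The sketch below is a generic illustration of that computation for a single EEG channel, not the authors' BCI code; the sampling rate and the 8-13 Hz mu band are assumptions.

        # Generic event-related desynchronization (ERD) computation:
        # ERD% = (P_task - P_baseline) / P_baseline * 100, using band power in the
        # mu band (8-13 Hz, an assumption) for a single EEG channel.
        import numpy as np
        from scipy.signal import welch

        def band_power(signal, fs, band=(8.0, 13.0)):
            freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * int(fs)))
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return np.trapz(psd[mask], freqs[mask])

        def erd_percent(baseline, task, fs=256.0):
            p_base = band_power(baseline, fs)
            p_task = band_power(task, fs)
            return (p_task - p_base) / p_base * 100.0

        # A value around -55 would correspond to the enhanced ERD for feet MI
        # reported after training.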

  3. Change in brain activity through virtual reality-based brain-machine communication in a chronic tetraplegic subject with muscular dystrophy

    Directory of Open Access Journals (Sweden)

    Liu Meigen

    2010-09-01

    Full Text Available Abstract Background For severely paralyzed people, a brain-computer interface (BCI) provides a way of re-establishing communication. Although subjects with muscular dystrophy (MD) appear to be potential BCI users, the actual long-term effects of BCI use on brain activities in MD subjects have yet to be clarified. To investigate these effects, we followed BCI use by a chronic tetraplegic subject with MD over 5 months. The topographic changes in an electroencephalogram (EEG) after long-term use of the virtual reality (VR)-based BCI were also assessed. Our originally developed BCI system was used to classify an EEG recorded over the sensorimotor cortex in real time and estimate the user's motor intention (MI) in 3 different limb movements: feet, left hand, and right hand. An avatar in the internet-based VR was controlled in accordance with the results of the EEG classification by the BCI. The subject was trained to control his avatar via the BCI by strolling in the VR for 1 hour a day and then continued the same training twice a month at his home. Results After the training, the error rate of the EEG classification decreased from 40% to 28%. The subject successfully walked around in the VR using only his MI and chatted with other users through a voice-chat function embedded in the internet-based VR. With this improvement in BCI control, event-related desynchronization (ERD) following MI was significantly enhanced (p < 0.01). Conclusions These results show that our subject with severe MD was able to learn to control his EEG signal and communicate with other users through use of VR navigation and suggest that an internet-based VR has the potential to provide paralyzed people with the opportunity for easy communication.

  4. Retrieval of long and short lists from long term memory: a functional magnetic resonance imaging study with human subjects.

    Science.gov (United States)

    Zysset, S; Müller, K; Lehmann, C; Thöne-Otto, A I; von Cramon, D Y

    2001-11-13

    Previous studies have shown that reaction time in an item-recognition task with both short and long lists is a quadratic function of list length. This suggests that either different memory retrieval processes are implied for short and long lists or an adaptive process is involved. An event-related functional magnetic resonance imaging study with nine subjects and list lengths varying between 3 and 18 words was conducted to identify the underlying neuronal structures of retrieval from long and short lists. For the retrieval and processing of word-lists a single fronto-parietal network, including premotor, left prefrontal, left precuneal and left parietal regions, was activated. With increasing list length, no additional regions became involved in retrieving information from long-term memory, suggesting that not necessarily different, but highly adaptive retrieval processes are involved.

  5. 33 CFR 2.34 - Waters subject to tidal influence; waters subject to the ebb and flow of the tide; mean high water.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Waters subject to tidal influence; waters subject to the ebb and flow of the tide; mean high water. 2.34 Section 2.34 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY GENERAL JURISDICTION Jurisdictional Terms § 2...

  6. Designing Infographics to support teaching complex science subject: A comparison between static and animated Infographics

    Science.gov (United States)

    Hassan, Hesham Galal

    This thesis explores the proper principles and rules for creating excellent infographics that communicate information successfully and effectively. Not only does this thesis examine the creation of Infographics, it also tries to answer which format, Static or Animated Infographics, is the most effective when used as a teaching-aid framework for complex science subjects, and if compelling Infographics in the preferred format facilitate the learning experience. The methodology includes the creation of infographic using two formats (Static and Animated) of a fairly complex science subject (Phases Of The Moon), which were then tested for their efficacy as a whole, and the two formats were compared in terms of information comprehension and retention. My hypothesis predicts that the creation of an infographic using the animated format would be more effective in communicating a complex science subject (Phases Of The Moon), specifically when using 3D computer animation to visualize the topic. This would also help different types of learners to easily comprehend science subjects. Most of the animated infographics produced nowadays are created for marketing and business purposes and do not implement the analytical design principles required for creating excellent information design. I believe that science learners are still in need of more variety in their methods of learning information, and that infographics can be of great assistance. The results of this thesis study suggests that using properly designed infographics would be of great help in teaching complex science subjects that involve spatial and temporal data. This could facilitate learning science subjects and consequently impact the interest of young learners in STEM.

  7. Computer and internet access for long-term care residents: perceived benefits and barriers.

    Science.gov (United States)

    Tak, Sunghee H; Beck, Cornelia; McMahon, Ed

    2007-05-01

    In this study, the authors examined residents' computer and Internet access, as well as benefits and barriers to access in nursing homes. Administrators of 64 nursing homes in a national chain completed surveys. Fourteen percent of the nursing homes provided computers for residents to use, and 11% had Internet access. Some residents owned personal computers in their rooms. Administrators perceived the benefits of computer and Internet use for residents as facilitating direct communication with family and providing mental exercise, education, and enjoyment. Perceived barriers included cost and space for computer equipment and residents' cognitive and physical impairments. Implications of residents' computer activities were discussed for nursing care. Further research is warranted to examine therapeutic effects of computerized activities and their cost effectiveness.

  8. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT...... that have spread vertiginously since Mark Weiser coined the term ‘pervasive’, e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser’s original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown’s (1997) terms, ‘invisible...... the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one...

  9. Medical Image Processing for Fully Integrated Subject Specific Whole Brain Mesh Generation

    Directory of Open Access Journals (Sweden)

    Chih-Yang Hsu

    2015-05-01

    Full Text Available Currently, anatomically consistent segmentation of vascular trees acquired with magnetic resonance imaging requires the use of multiple image processing steps, which, in turn, depend on manual intervention. In effect, segmentation of vascular trees from medical images is time consuming and error prone due to the tortuous geometry and weak signal in small blood vessels. To overcome errors and accelerate the image processing time, we introduce an automatic image processing pipeline for constructing subject-specific computational meshes for the entire cerebral vasculature, including segmentation of ancillary structures: the grey and white matter, cerebrospinal fluid space, skull, and scalp. To demonstrate the validity of the new pipeline, we segmented the entire intracranial compartment, with special attention to the angioarchitecture, from magnetic resonance imaging acquired for two healthy volunteers. The raw images were processed through our pipeline for automatic segmentation and mesh generation. Due to the partial volume effect and finite resolution, the computational meshes intersect with each other at respective interfaces. To eliminate anatomically inconsistent overlap, we utilized morphological operations to separate the structures with physiologically sound gap spaces. The resulting meshes exhibit anatomically correct spatial extent and relative positions without intersections. For validation, we computed critical biometrics of the angioarchitecture, the cortical surfaces, ventricular system, and cerebrospinal fluid (CSF) spaces and compared them against literature values. Volumes and surface areas of the computational mesh were found to be in physiological ranges. In conclusion, we present an automatic image processing pipeline to automate the segmentation of the main intracranial compartments including subject-specific vascular trees. These computational meshes can be used in 3D immersive visualization for diagnosis, surgery planning with haptics
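
    The gap spaces described above can be produced with standard morphological operations on the binary masks of adjacent structures. The sketch below is a generic illustration with scipy.ndimage, not the authors' pipeline; the one-voxel gap width is an assumption.

        # Generic illustration of separating two overlapping binary segmentations
        # so that adjacent structures keep a small empty gap at their interface.
        # Not the authors' pipeline; the one-voxel gap width is an assumption.
        import numpy as np
        from scipy import ndimage

        def separate_with_gap(mask_a, mask_b, gap_voxels=1):
            """Carve each mask away from a dilated copy of the other, so the two
            labels no longer intersect and are separated by a few empty voxels."""
            structure = ndimage.generate_binary_structure(3, 1)
            grown_b = ndimage.binary_dilation(mask_b, structure, iterations=gap_voxels)
            grown_a = ndimage.binary_dilation(mask_a, structure, iterations=gap_voxels)
            return mask_a & ~grown_b, mask_b & ~grown_a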

  10. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    OpenAIRE

    Soojin Park; Mansoo Hwang; Sangeun Lee; Young B. Park

    2015-01-01

    Cloud computing has emerged as more than just a piece of technology, it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing where computing environments and resources are not as subjects to own but as subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo si...

  11. Entanglement temperature with Gauss–Bonnet term

    Directory of Open Access Journals (Sweden)

    Shesansu Sekhar Pal

    2015-09-01

    Full Text Available We compute the entanglement temperature using the first-law-like relation of thermodynamics, ΔE = T_ent ΔS_EE, up to the Gauss–Bonnet term in the Jacobson–Myers entropy functional in any arbitrary spacetime dimension. The computation is done when the entangling region is the geometry of a slab. We also show that such a Gauss–Bonnet term, which becomes a total derivative when the co-dimension-two hypersurface is four dimensional, does not contribute to the finite term in the entanglement entropy. We observe that the Weyl-squared term does not contribute to the entanglement entropy. It is important to note that the calculations are performed when the entangling region is very small and the energy is calculated using the normal Hamiltonian.
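
    Written out, the first-law-like relation used above reads as follows; the inverse scaling of the entanglement temperature with the slab width is the behaviour generally reported for small entangling regions and is stated here as an assumption, with the proportionality constant depending on the spacetime dimension and the higher-derivative couplings.

        % First-law-like relation for a small entangling region (slab of width \ell);
        % the 1/\ell scaling is the commonly reported small-width behaviour (assumption).
        \begin{align}
          \Delta E = T_{\mathrm{ent}}\,\Delta S_{EE},
          \qquad
          T_{\mathrm{ent}} \propto \frac{1}{\ell}.
        \end{align}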

  12. Influence of "Halo" and "Demon" Effects in Subjective Grading.

    Science.gov (United States)

    Gibb, Gerald D.

    1983-01-01

    The phenomenon of "halo" effects in subjective grading was investigated. Two groups of three raters evaluated 20 term papers in introductory psychology. Term paper grades correlated significantly with course grades when information about previous academic performance was made available. When this information was not available, the…

  13. Spectral model for long-term computation of thermodynamics and potential evaporation in shallow wetlands

    Science.gov (United States)

    de la Fuente, Alberto; Meruane, Carolina

    2017-09-01

    Altiplanic wetlands are unique ecosystems located in the elevated plateaus of Chile, Argentina, Peru, and Bolivia. These ecosystems are under threat due to changes in land use, groundwater extractions, and climate change that will modify the water balance through changes in precipitation and evaporation rates. Long-term prediction of the fate of aquatic ecosystems imposes computational constraints that make finding a solution impossible in some cases. In this article, we present a spectral model for long-term simulations of the thermodynamics of shallow wetlands in the limit case when the water depth tends to zero. This spectral model solves for water and sediment temperature, as well as heat, momentum, and mass exchanged with the atmosphere. The parameters of the model (water depth, thermal properties of the sediments, and surface albedo) and the atmospheric downscaling were calibrated using the MODIS product of the land surface temperature. Moreover, the performance of the daily evaporation rates predicted by the model was evaluated against daily pan evaporation data measured between 1964 and 2012. The spectral model was able to correctly represent both seasonal fluctuation and climatic trends observed in daily evaporation rates. It is concluded that the spectral model presented in this article is a suitable tool for assessing the global climate change effects on shallow wetlands whose thermodynamics is forced by heat exchanges with the atmosphere and modulated by the heat-reservoir role of the sediments.

  14. Gender disparities in the association between epicardial adipose tissue volume and coronary atherosclerosis: a 3-dimensional cardiac computed tomography imaging study in Japanese subjects.

    Science.gov (United States)

    Dagvasumberel, Munkhbaatar; Shimabukuro, Michio; Nishiuchi, Takeshi; Ueno, Junji; Takao, Shoichiro; Fukuda, Daiju; Hirata, Yoichiro; Kurobe, Hirotsugu; Soeki, Takeshi; Iwase, Takashi; Kusunose, Kenya; Niki, Toshiyuki; Yamaguchi, Koji; Taketani, Yoshio; Yagi, Shusuke; Tomita, Noriko; Yamada, Hirotsugu; Wakatsuki, Tetsuzo; Harada, Masafumi; Kitagawa, Tetsuya; Sata, Masataka

    2012-09-10

    Growing evidence suggests that epicardial adipose tissue (EAT) may contribute to the development of coronary artery disease (CAD). In this study, we explored gender disparities in EAT volume (EATV) and its impact on coronary atherosclerosis. The study population consisted of 90 consecutive subjects (age: 63 ± 12 years; men: 47, women: 43) who underwent 256-slice multi-detector computed tomography (MDCT) coronary angiography. EATV was measured as the sum of cross-sectional epicardial fat area on CT images, from the lower surface of the left pulmonary artery origin to the apex. Subjects were segregated into the CAD group (coronary luminal narrowing > 50%) and non-CAD group. EATV/body surface area (BSA) was higher among men in the CAD group than in the non-CAD group (62 ± 13 vs. 33 ± 10 cm3/m2, p EATV/BSA was the single predictor for >50% coronary luminal narrowing in men (p EATV is strongly associated with coronary atherosclerosis in men.

  15. Implementing and Operating Computer Graphics in the Contemporary Chemistry Education

    Directory of Open Access Journals (Sweden)

    Olga Popovska

    2017-11-01

    Full Text Available Technology plays a crucial role in modern teaching, providing both educators and students with fundamental theoretical insights and supporting the interpretation of experimental data. In the long term it gives students a clear stake in their learning processes. Advancement in education, furthermore, largely depends on providing valuable experiences and tools through digital and computer literacy. Chemistry as a science is no exception to the benefits of the computer. The major part of the computer's revolutionizing role in the chemistry laboratory lies in the use of images, diagrams, molecular models, graphs and specialized chemistry programs. In this sense, the teacher can provide more interactive classes and numerous dynamic teaching methods along with advanced technology. All things considered, the aim of this article is to implement interactive teaching methods for chemistry subjects using chemistry computer graphics. A group of students (n = 30) aged 18–20 was tested using methods such as brainstorming, demonstration, working in pairs, and writing laboratory notebooks. The results showed that demonstration is the most acceptable interactive method (95%). This article is expected to be of high value to teachers and researchers of chemistry implementing interactive methods and operating computer graphics.

  16. Computer Science (CS) in the Compulsory Education Curriculum: Implications for Future Research

    Science.gov (United States)

    Passey, Don

    2017-01-01

    The subject of computer science (CS) and computer science education (CSE) has relatively recently arisen as a subject for inclusion within the compulsory school curriculum. Up to this present time, a major focus of technologies in the school curriculum has in many countries been on applications of existing technologies into subject practice (both…

  17. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  18. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
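
    As a small illustration of the kind of technique the book covers (not an excerpt from the book), the snippet below compares a central finite-difference derivative of sin(x) with the exact derivative cos(x); the step size is an arbitrary choice.

        # Central finite-difference derivative, one of the basic techniques the
        # book covers. Example function and step size are illustrative choices.
        import numpy as np

        def central_difference(f, x, h=1e-5):
            """Approximate f'(x) with the central difference (f(x+h) - f(x-h)) / (2h)."""
            return (f(x + h) - f(x - h)) / (2.0 * h)

        x = np.linspace(0.0, np.pi, 5)
        approx = central_difference(np.sin, x)
        exact = np.cos(x)
        print(np.max(np.abs(approx - exact)))   # small error, truncation term of order h**2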

  19. Does the Temporal Asymmetry of Short-Term Heart Rate Variability Change during Regular Walking? A Pilot Study of Healthy Young Subjects

    Directory of Open Access Journals (Sweden)

    Xinpei Wang

    2018-01-01

    Full Text Available The acceleration and deceleration patterns in heartbeat fluctuations distribute asymmetrically, which is known as heart rate asymmetry (HRA). It is hypothesized that HRA reflects the balancing regulation of the sympathetic and parasympathetic nervous systems. This study was designed to examine whether altered autonomic balance during exercise can lead to HRA changes. Sixteen healthy college students were enrolled, and each student undertook two 5-min ECG measurements: one in a resting seated position and another while walking on a treadmill at a regular speed of 5 km/h. The two measurements were conducted in a randomized order, and a 30-min rest was required between them. RR interval time series were extracted from the 5-min ECG data, and HRA (short-term) was estimated using four established metrics, that is, Porta's index (PI), Guzik's index (GI), slope index (SI), and area index (AI), from both the raw RR interval time series and the time series after wavelet detrending, which removes the low-frequency component of <~0.03 Hz. Our pilot data showed a reduced PI but unchanged GI, SI, and AI during walking compared to the resting seated position based on the raw data. Based on the wavelet-detrended data, reduced PI, SI, and AI were observed, while GI still showed no significant changes. The reduced PI during walking based on both raw and detrended data, which suggests less short-term HRA, may underline the belief that vagal tone is withdrawn during low-intensity exercise. GI may not be sensitive to short-term HRA. The reduced SI and AI based on detrended data suggest that they may capture both short- and long-term HRA features and that the expected change in short-term HRA is amplified after removing the trend that is supposed to be linked to the long-term component. Further studies with more subjects and longer measurements are warranted to validate our observations and to examine these additional hypotheses.
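
    The asymmetry indices named above are computed from successive RR-interval differences. The sketch below uses one commonly cited set of definitions, which should be read as assumptions rather than the exact formulas of this study: with dRR(i) = RR(i+1) - RR(i), Porta's index is the percentage of negative differences among all non-zero differences, and Guzik's index is the share of the squared positive differences in the total squared differences.

        # Porta's index (PI) and Guzik's index (GI) from an RR-interval series,
        # using commonly cited definitions (assumed here, not quoted from the study):
        #   dRR(i) = RR(i+1) - RR(i)
        #   PI = 100 * (number of negative dRR) / (number of non-zero dRR)
        #   GI = 100 * sum(positive dRR ** 2) / sum(non-zero dRR ** 2)
        import numpy as np

        def heart_rate_asymmetry(rr_ms):
            d = np.diff(np.asarray(rr_ms, dtype=float))
            nz = d[d != 0.0]
            pi = 100.0 * np.sum(nz < 0.0) / nz.size
            gi = 100.0 * np.sum(nz[nz > 0.0] ** 2) / np.sum(nz ** 2)
            return pi, gi

        # Under perfect symmetry both indices are close to 50%.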

  20. Subjective Evaluation of Vocal Quality in Nasal Polyposis

    Directory of Open Access Journals (Sweden)

    Ziya Saltürk

    2014-12-01

    Full Text Available Aim: The nose is a resonator organ in the production of voice. The aim of this study was to subjectively evaluate the effects of nasal obstruction caused by nasal polyposis on voice quality. Methods: Thirty-six patients diagnosed with nasal polyposis were included in the study. The 30-item Voice Handicap Index (VHI-30) was used in order to evaluate the subjective status of voice. Nasal endoscopy and computed tomography imaging of the paranasal sinuses were performed for each patient. Lund-Kennedy endoscopy scores and Lund-Mackay computed tomography scores were evaluated. The control group was composed of 20 healthy subjects. Results: The mean voice handicap score was 43.16 (SD 15.53) in the patient group and 2.15 (SD 1.92) in the control group. There was a statistically significant difference between the groups (p=0.001). The mean Lund-Kennedy and Lund-Mackay scores were 8.58 (SD 2.5) and 17 (SD 5.52), respectively. It was found that increased severity of nasal polyposis was the cause of decreased satisfaction with voice quality. Conclusion: Nasal obstruction caused by nasal polyposis affects voice quality adversely, and as the severity of nasal polyposis increases, satisfaction with voice quality decreases.

  1. Submaximal exercise capacity and maximal power output in polio subjects

    NARCIS (Netherlands)

    Nollet, F.; Beelen, A.; Sargeant, A. J.; de Visser, M.; Lankhorst, G. J.; de Jong, B. A.

    2001-01-01

    OBJECTIVES: To compare the submaximal exercise capacity of polio subjects with postpoliomyelitis syndrome (PPS) and without (non-PPS) with that of healthy control subjects, to investigate the relationship of this capacity with maximal short-term power and quadriceps strength, and to evaluate

  2. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

    Science.gov (United States)

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

    2016-03-01

    Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. © The Author(s) 2015.

  3. Short-term Automated Quantification of Radiologic Changes in the Characterization of Idiopathic Pulmonary Fibrosis Versus Nonspecific Interstitial Pneumonia and Prediction of Long-term Survival.

    Science.gov (United States)

    De Giacomi, Federica; Raghunath, Sushravya; Karwoski, Ronald; Bartholmai, Brian J; Moua, Teng

    2018-03-01

    Fibrotic interstitial lung diseases presenting with nonspecific and overlapping radiologic findings may be difficult to diagnose without surgical biopsy. We hypothesized that baseline quantifiable radiologic features and their short-term interval change may be predictive of underlying histologic diagnosis as well as long-term survival in idiopathic pulmonary fibrosis (IPF) presenting without honeycombing versus nonspecific interstitial pneumonia (NSIP). Forty biopsy-confirmed IPF and 20 biopsy-confirmed NSIP patients with available high-resolution chest computed tomography 4 to 24 months apart were studied. CALIPER software was used for the automated characterization and quantification of radiologic findings. IPF subjects were older (66 vs. 48; P<0.0001) with lower diffusion capacity for carbon monoxide and higher volumes of baseline reticulation (193 vs. 83 mL; P<0.0001). Over the interval period, compared with NSIP, IPF patients experienced greater functional decline (forced vital capacity, -6.3% vs. -1.7%; P=0.02) and radiologic progression, as noted by greater increase in reticulation volume (24 vs. 1.74 mL; P=0.048), and decrease in normal (-220 vs. -37.7 mL; P=0.045) and total lung volumes (-198 vs. 58.1 mL; P=0.03). Older age, male gender, higher reticulation volumes at baseline, and greater interval decrease in normal lung volumes were predictive of IPF. Both baseline and short-term changes in quantitative radiologic findings were predictive of mortality. Baseline quantitative radiologic findings and assessment of short-term disease progression may help characterize underlying IPF versus NSIP in those with difficult to differentiate clinicoradiologic presentations. Our study supports the possible utility of assessing serial quantifiable high-resolution chest computed tomographic findings for disease differentiation in these 2 entities.

  4. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  5. Computer information systems framework

    International Nuclear Information System (INIS)

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. New information technology has caused management to expect more from computers. The process of supplying information follows a well-defined procedure. An MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS together with the business hierarchy provides a good framework for many organizations that use computers. (A.B.)

  6. Security Architecture of Cloud Computing

    OpenAIRE

    V.KRISHNA REDDY; Dr. L.S.S.REDDY

    2011-01-01

    Cloud computing offers services over the internet with dynamically scalable resources. Cloud computing services provide benefits to users in terms of cost and ease of use. Cloud computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet data processing and storage needs. The cloud computing environment has various advantages as well as disadvantages o...

  7. Productivity associated with visual status of computer users.

    Science.gov (United States)

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
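
    The reported ratio follows directly from the figures given: a 2.5% productivity gain on a 25,000-dollar annual salary is worth about 0.025 × 25,000 = 625 dollars per year, and 625 / 268 ≈ 2.3, the stated cost-benefit ratio.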

  8. Central tarsal bone fractures in horses not used for racing: Computed tomographic configuration and long-term outcome of lag screw fixation.

    Science.gov (United States)

    Gunst, S; Del Chicca, F; Fürst, A E; Kuemmerle, J M

    2016-09-01

    There are no reports on the configuration of equine central tarsal bone fractures based on cross-sectional imaging, or on the clinical and radiographic long-term outcome after internal fixation. To report clinical, radiographic and computed tomographic findings of equine central tarsal bone fractures and to evaluate the long-term outcome of internal fixation. Retrospective case series. All horses diagnosed with a central tarsal bone fracture at our institution in 2009-2013 were included. Computed tomography and internal fixation using the lag screw technique were performed in all patients. Medical records and diagnostic images were reviewed retrospectively. A clinical and radiographic follow-up examination was performed at least 1 year postoperatively. A central tarsal bone fracture was diagnosed in 6 horses. Five were Warmbloods used for showjumping and one was a Quarter Horse used for reining. All horses had sagittal slab fractures that began dorsally, ran in a plantar or plantaromedial direction and exited the plantar cortex at the plantar or plantaromedial indentation of the central tarsal bone. Marked sclerosis of the central tarsal bone was diagnosed in all patients. At long-term follow-up, 5/6 horses were sound and used as intended, although mild osteophyte formation at the distal intertarsal joint was commonly observed. Central tarsal bone fractures in nonracehorses had a distinct configuration, but radiographically subtle additional fracture lines can occur. A chronic stress-related aetiology seems likely. Internal fixation of these fractures, based on an accurate diagnosis of the individual fracture configuration, resulted in a very good prognosis. © 2015 EVJ Ltd.

  9. The Societal Nature of Subjectivity

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2013-01-01

    The HSR Focus presents a psycho-societal approach to qualitative empirical research in several areas of everyday social life. It is an approach which integrates a theory of subjectivity with an interpretation methodology that draws on hermeneutic experiences from text analysis and psychoanalysis. ... In terms of methodology it revives the themes originally launched in FOS exactly ten years ago: "Subjectivity and Reflectivity in Qualitative Research" (Breuer, Mruck and Roth 2002; Mruck and Breuer 2003). This editorial introduction presents the intellectual background of the psycho-societal methodology, ... reflects on its relevance and critical perspectives in a contemporary landscape of social science, and comments on the way in which an international and interdisciplinary research group has developed this approach to profane empirical research. ...

  10. Characterization of the mechanism of drug-drug interactions from PubMed using MeSH terms.

    Science.gov (United States)

    Lu, Yin; Figler, Bryan; Huang, Hong; Tu, Yi-Cheng; Wang, Ju; Cheng, Feng

    2017-01-01

    Identifying drug-drug interactions (DDIs) is an important topic for the development of safe pharmaceutical drugs and for the optimization of multidrug regimens for complex diseases such as cancer and HIV. There have been about 150,000 publications on DDIs in PubMed, which is a great resource for DDI studies. In this paper, we introduce an automatic computational method for the systematic analysis of the mechanism of DDIs using MeSH (Medical Subject Headings) terms from the PubMed literature. MeSH is a controlled vocabulary thesaurus developed by the National Library of Medicine for indexing and annotating articles. Our method can effectively identify DDI-relevant MeSH terms such as drugs, proteins and phenomena with high accuracy. The connections among these MeSH terms were investigated by using co-occurrence heatmaps and social network analysis. Our approach can be used to visualize relationships of DDI terms, which has the potential to help users better understand DDIs. As the volume of PubMed records increases, our method for automatic analysis of DDIs from the PubMed database will become more accurate.
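
    As an illustration of the co-occurrence analysis the abstract describes, the following is a minimal sketch rather than the authors' code; the input structure (one list of MeSH terms per retrieved article) and the example terms are assumptions made for the example.

```python
from collections import Counter
from itertools import combinations

# Hypothetical input: one list of MeSH terms per retrieved PubMed article.
articles = [
    ["Warfarin", "Aspirin", "Drug Interactions", "Hemorrhage"],
    ["Warfarin", "Cytochrome P-450 CYP2C9", "Drug Interactions"],
    ["Aspirin", "Hemorrhage", "Platelet Aggregation Inhibitors"],
]

# Count how often each pair of MeSH terms is assigned to the same article.
cooccurrence = Counter()
for terms in articles:
    for pair in combinations(sorted(set(terms)), 2):
        cooccurrence[pair] += 1

# The resulting counts can be arranged into a symmetric matrix and plotted
# as a heatmap, or handed to a graph library for network analysis.
for (a, b), n in cooccurrence.most_common(5):
    print(f"{a} -- {b}: {n}")
```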

  11. Long-term prognostic performance of low-dose coronary computed tomography angiography with prospective electrocardiogram triggering

    Energy Technology Data Exchange (ETDEWEB)

    Clerc, Olivier F.; Kaufmann, Basil P.; Possner, Mathias; Liga, Riccardo; Vontobel, Jan; Mikulicic, Fran; Graeni, Christoph; Benz, Dominik C.; Fuchs, Tobias A.; Stehli, Julia; Pazhenkottil, Aju P.; Gaemperli, Oliver; Kaufmann, Philipp A.; Buechel, Ronny R. [University Hospital Zurich, Cardiac Imaging, Department of Nuclear Medicine, Zurich (Switzerland)

    2017-11-15

    To assess long-term prognosis after low-dose 64-slice coronary computed tomography angiography (CCTA) using prospective electrocardiogram-triggering. We included 434 consecutive patients with suspected or known coronary artery disease referred for low-dose CCTA. Patients were classified as normal, with non-obstructive or obstructive lesions, or previously revascularized. Coronary artery calcium score (CACS) was assessed in 223 patients. Follow-up was obtained regarding major adverse cardiac events (MACE): cardiac death, myocardial infarction and elective revascularization. We performed Kaplan-Meier analysis and Cox regressions. Mean effective radiation dose was 1.7 ± 0.6 mSv. At baseline, 38% of patients had normal arteries, 21% non-obstructive lesions, 32% obstructive stenosis and 8% were revascularized. Twenty-nine patients (7%) were lost to follow-up. After a median follow-up of 6.1 ± 0.6 years, MACE occurred in 0% of patients with normal arteries, 6% with non-obstructive lesions, 30% with obstructive stenosis and 39% of those revascularized. MACE occurrence increased with increasing CACS (P < 0.001), but 4% of patients with CACS = 0 experienced MACE. Multivariate Cox regression identified obstructive stenosis, lesion burden in CCTA and CACS as independent MACE predictors (P ≤ 0.001). Low-dose CCTA with prospective electrocardiogram-triggering has an excellent long-term prognostic performance with a warranty period >6 years for patients with normal coronary arteries. (orig.)
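
    The survival analysis named above (Kaplan-Meier curves and Cox regression) can be illustrated with a small, self-contained Kaplan-Meier estimator; the follow-up data below are invented for the example and are not taken from the study.

```python
def kaplan_meier(durations, events):
    """Return (time, survival) pairs for right-censored follow-up data.

    durations -- follow-up time for each patient (e.g. years)
    events    -- 1 if the event (e.g. MACE) occurred at that time, 0 if censored
    """
    data = sorted(zip(durations, events))  # sort patients by follow-up time
    at_risk = len(data)
    survival = 1.0
    curve = [(0.0, 1.0)]
    i = 0
    while i < len(data):
        t = data[i][0]
        # Events and total removals (events + censorings) at this time point.
        d = sum(e for time, e in data if time == t)
        n_removed = sum(1 for time, _ in data if time == t)
        if d > 0:
            survival *= 1.0 - d / at_risk
            curve.append((t, survival))
        at_risk -= n_removed
        i += n_removed
    return curve

# Invented example: follow-up in years, 1 = event, 0 = censored.
times  = [1.2, 2.5, 3.0, 4.1, 5.0, 6.1, 6.1, 6.3]
events = [0,   1,   0,   1,   0,   0,   1,   0]
for t, s in kaplan_meier(times, events):
    print(f"t = {t:4.1f}  S(t) = {s:.3f}")
```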

  12. Long-term prognostic performance of low-dose coronary computed tomography angiography with prospective electrocardiogram triggering

    International Nuclear Information System (INIS)

    Clerc, Olivier F.; Kaufmann, Basil P.; Possner, Mathias; Liga, Riccardo; Vontobel, Jan; Mikulicic, Fran; Graeni, Christoph; Benz, Dominik C.; Fuchs, Tobias A.; Stehli, Julia; Pazhenkottil, Aju P.; Gaemperli, Oliver; Kaufmann, Philipp A.; Buechel, Ronny R.

    2017-01-01

    To assess long-term prognosis after low-dose 64-slice coronary computed tomography angiography (CCTA) using prospective electrocardiogram-triggering. We included 434 consecutive patients with suspected or known coronary artery disease referred for low-dose CCTA. Patients were classified as normal, with non-obstructive or obstructive lesions, or previously revascularized. Coronary artery calcium score (CACS) was assessed in 223 patients. Follow-up was obtained regarding major adverse cardiac events (MACE): cardiac death, myocardial infarction and elective revascularization. We performed Kaplan-Meier analysis and Cox regressions. Mean effective radiation dose was 1.7 ± 0.6 mSv. At baseline, 38% of patients had normal arteries, 21% non-obstructive lesions, 32% obstructive stenosis and 8% were revascularized. Twenty-nine patients (7%) were lost to follow-up. After a median follow-up of 6.1 ± 0.6 years, MACE occurred in 0% of patients with normal arteries, 6% with non-obstructive lesions, 30% with obstructive stenosis and 39% of those revascularized. MACE occurrence increased with increasing CACS (P < 0.001), but 4% of patients with CACS = 0 experienced MACE. Multivariate Cox regression identified obstructive stenosis, lesion burden in CCTA and CACS as independent MACE predictors (P ≤ 0.001). Low-dose CCTA with prospective electrocardiogram-triggering has an excellent long-term prognostic performance with a warranty period >6 years for patients with normal coronary arteries. (orig.)

  13. Perirolandic hypoperfusion on single-photon emission computed tomography in term infants with perinatal asphyxia: comparison with MRI and clinical findings

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, C.S.; Kim, D.I.; Lee, S.; Yoon, P.H.; Jeon, T.J.; Lee, J.D. [Department of Diagnostic Radiology, Yonsei University College of Medicine, Seoul (Korea); Ryu, Y.H. [Department of Diagnostic Radiology, Yonsei University College of Medicine, Seoul (Korea); Department of Nuclear Medicine, Ghil Medical Center, Gachon Medical School, Inchon (Korea); Park, C.I. [Department of Rehabilitation Medicine, Yonsei University College of Medicine, Seoul (Korea)

    2000-12-01

    We describe the findings on single-photon emission computed tomography (SPECT) in patients with perinatal asphyxia at term, with perirolandic cortico-subcortical changes on MRI, and correlate them with clinical features. SPECT of 7 patients was obtained after injection of 185-370 MBq of Tc-99m-ECD (ethyl cysteinate dimer). The patients had spastic quadriplegia (7/7) with perinatal asphyxia (6/7) at term (7/7). The results were correlated with the MRI findings. Hypoperfusion of the perirolandic cortex was clearly seen on SPECT in all patients, even in two with subtle changes on MRI. SPECT demonstrated a more extensive area of involvement than MRI, notably in the cerebellum (in 4), the thalamus (in 7) and basal ganglia (in 5), where MRI failed to show any abnormalities. (orig.)

  14. Perirolandic hypoperfusion on single-photon emission computed tomography in term infants with perinatal asphyxia: comparison with MRI and clinical findings

    International Nuclear Information System (INIS)

    Yoon, C.S.; Kim, D.I.; Lee, S.; Yoon, P.H.; Jeon, T.J.; Lee, J.D.; Ryu, Y.H.; Park, C.I.

    2000-01-01

    We describe the findings on single-photon emission computed tomography (SPECT) in patients with perinatal asphyxia at term, with perirolandic cortico-subcortical changes on MRI, and correlate them with clinical features. SPECT of 7 patients was obtained after injection of 185-370 MBq of Tc-99m-ECD (ethyl cysteinate dimer). The patients had spastic quadriplegia (7/7) with perinatal asphyxia (6/7) at term (7/7). The results were correlated with the MRI findings. Hypoperfusion of the perirolandic cortex was clearly seen on SPECT in all patients, even in two with subtle changes on MRI. SPECT demonstrated a more extensive area of involvement than MRI, notably in the cerebellum (in 4), the thalamus (in 7) and basal ganglia (in 5), where MRI failed to show any abnormalities. (orig.)

  15. A computational simulation of long-term synaptic potentiation inducing protocol processes with model of CA3 hippocampal microcircuit.

    Science.gov (United States)

    Świetlik, D; Białowąs, J; Kusiak, A; Cichońska, D

    2018-01-01

    An experimental study of a computational model of the CA3 region presents cognitive and behavioural functions of the hippocampus. The main property of the CA3 region is plastic recurrent connectivity, where the connections allow it to behave as an auto-associative memory. The computer simulations showed that the CA3 model performs efficient long-term synaptic potentiation (LTP) induction and a high rate of sub-millisecond coincidence detection. The average frequency of the CA3 pyramidal cell model was substantially higher in simulations with the LTP induction protocol than without it. The entropy of pyramidal cells with LTP appeared to be significantly higher than without the LTP induction protocol (p = 0.0001). There was a depression of entropy, caused by an increase of the forgetting coefficient, in pyramidal cell simulations without LTP (R = -0.88, p = 0.0008), whereas no such correlation appeared in the LTP simulation (p = 0.4458). Our biologically inspired model of the CA3 hippocampal formation microcircuit helps to interpret neurophysiological data. (Folia Morphol 2018; 77, 2: 210-220).

  16. Evaluation of Long-Term Cochlear Implant Use in Subjects With Acquired Unilateral Profound Hearing Loss: Focus on Binaural Auditory Outcomes.

    Science.gov (United States)

    Mertens, Griet; De Bodt, Marc; Van de Heyning, Paul

    Cochlear implantation (CI) in subjects with unilateral profound sensorineural hearing loss was investigated. The authors of the present study demonstrated the binaural auditory outcomes in a 12- and 36-month prospective cohort outcome study. The present study aimed to perform a long-term (LT) evaluation of the auditory outcomes in an analogous study group. LT evaluation was derived from 12 single-sided deaf (SSD) CI recipients and from 11 CI recipients with asymmetric hearing loss (AHL). A structured interview was conducted with each subject. Speech perception in noise and sound localization were assessed in a CIOFF and in a CION condition. Four binaural effects were calculated: summation effect (S0N0), squelch effect (S0NCI), combined head shadow effect (SCIN0), and spatial release from masking (SRM). At the LT evaluation, the contribution of a CI or a bone conduction device to speech perception in noise was investigated in two challenging spatial configurations in the SSD group. All (23/23) subjects wore their CI 7 days a week at the LT follow-up evaluation, which ranged from 3 to 10 years after implantation. In the SSD group, a significant combined head shadow effect of 3.17 dB and an SRM benefit of 4.33 dB were found. In the AHL group, on the other hand, the summation effect (2.00 dB), the squelch effect (2.67 dB), the combined head shadow effect (3.67 dB), and the SRM benefit (2.00 dB) were significant at LT testing. In both spatially challenging configurations, the speech-in-noise results were significantly worse in the condition with the bone conduction device compared with the unaided condition. No negative effect was found for the CION condition. A significant benefit in the CION condition was found for sound localization compared with the CIOFF condition in the SSD group and in the AHL group. All subjects wore their CI 7 days a week at the LT follow-up evaluation. The presence of binaural effects has been demonstrated with speech in noise testing, sound localization

  17. Does early change predict long-term (6 months) improvements in subjects who receive manual therapy for low back pain?

    Science.gov (United States)

    Cook, Chad; Petersen, Shannon; Donaldson, Megan; Wilhelm, Mark; Learman, Ken

    2017-09-01

    Early change is commonly assessed for manual therapy interventions and has been used to determine treatment appropriateness. However, current studies have only explored the relationship of between or within-session changes and short-/medium-term outcomes. The goal of this study was to determine whether pain changes after two weeks of pragmatic manual therapy could predict those participants with chronic low back pain who demonstrate continued improvements at 6-month follow-up. This study was a retrospective observational design. Univariate logistic regression analyses were performed using a 33% and a 50% pain change to predict improvement. Those who experienced a ≥33% pain reduction by 2 weeks had 6.98 (95% CI = 1.29, 37.53) times higher odds of 50% improvement on the GRoC and 4.74 (95% CI = 1.31, 17.17) times higher odds of 50% improvement on the ODI (at 6 months). Subjects who reported a ≥50% pain reduction at 2 weeks had 5.98 (95% CI = 1.56, 22.88) times higher odds of a 50% improvement in the GRoC and 3.99 (95% CI = 1.23, 12.88) times higher odds of a 50% improvement in the ODI (at 6 months). Future studies may investigate whether a change in plan of care is beneficial for patients who are not showing early improvement predictive of a good long-term outcome.
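
    To make the reported odds ratios concrete, here is a minimal sketch of how an odds ratio and its 95% confidence interval are computed from a 2x2 table of "early responder" versus "improved at 6 months"; the counts are invented for illustration and are not taken from the study.

```python
import math

# Invented 2x2 table: rows = early pain reduction (>=33%) yes/no,
# columns = 50% improvement on the outcome at 6 months yes/no.
a, b = 18, 7   # early responders: improved / not improved
c, d = 10, 27  # non-responders:   improved / not improved

odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log odds ratio.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```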

  18. Human subject research for engineers a practical guide

    CERN Document Server

    de Winter, Joost C F

    2017-01-01

    This Brief introduces engineers to the main principles in ethics, research design, statistics, and publishing of human subject research. In recent years, engineering has become strongly connected to disciplines such as biology, medicine, and psychology. Often, engineers (and engineering students) are expected to perform human subject research. Typical human subject research topics conducted by engineers include human-computer interaction (e.g., evaluating the usability of software), exoskeletons, virtual reality, teleoperation, modelling of human behaviour and decision making (often within the framework of ‘big data’ research), product evaluation, biometrics, behavioural tracking (e.g., of work and travel patterns, or mobile phone use), transport and planning (e.g., an analysis of flows or safety issues), etc. Thus, it can be said that knowledge on how to do human subject research is indispensable for a substantial portion of engineers. Engineers are generally well trained in calculus and mechanics, but m...

  19. Fast computation of Krawtchouk moments

    Czech Academy of Sciences Publication Activity Database

    Honarvar Shakibaei Asli, B.; Flusser, Jan

    2014-01-01

    Vol. 288, No. 1 (2014), pp. 73-86. ISSN 0020-0255. R&D Projects: GA ČR GAP103/11/1552. Institutional support: RVO:67985556. Keywords: Krawtchouk polynomial * Krawtchouk moment * Geometric moment * Impulse response * Fast computation * Digital filter. Subject RIV: JD - Computer Applications, Robotics. Impact factor: 4.038, year: 2014. http://library.utia.cas.cz/separaty/2014/ZOI/flusser-0432452.pdf

  20. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  1. Cobit system in the audit processes of the systems of computer systems

    Directory of Open Access Journals (Sweden)

    Julio Jhovany Santacruz Espinoza

    2017-12-01

    The present research work was carried out to show the benefits of using the COBIT system in the auditing of computer systems. The problem addressed is: how does the use of the COBIT system affect the audit process in institutions? The main objective is to identify the impact of the use of the COBIT system on the auditing process applied to computer systems in both public and private organizations. To achieve the stated objectives, the research first develops the conceptualization of key terms for an easy understanding of the subject. In conclusion, the COBIT system makes it possible to identify a methodology that uses information from IT departments to determine the information technology (IT) resources specified in the COBIT system, such as files, programs and computer networks, including the personnel that use or manipulate the information, with the purpose of providing the information that the organization or company requires to achieve its objectives.

  2. Applying a new computer-aided detection scheme generated imaging marker to predict short-term breast cancer risk

    Science.gov (United States)

    Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin

    2018-05-01

    This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four-view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied 'as is' to process each image. From the CAD-generated results, four detection features, including the total number of (1) initial detection seeds and (2) final detected false-positive regions, and the (3) average and (4) sum of detection scores, were computed from each image. Then, by combining the features computed from the two bilateral images of the left and right breasts from either the craniocaudal or the mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded a maximum prediction accuracy with an area under the ROC curve of AUC = 0.65 ± 0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores (p … breast cancer risk.
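
    The modelling step described above, combining per-breast CAD features and evaluating a logistic regression model with leave-one-case-out cross-validation, can be sketched as follows. This is an illustrative reconstruction using scikit-learn, not the authors' code, and the feature array shown is synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in: 8 features per case (4 CAD features x 2 breasts)
# and a binary label (cancer detected at the next screening or not).
X = rng.normal(size=(200, 8))
y = rng.integers(0, 2, size=200)

model = LogisticRegression(max_iter=1000)

# Leave-one-case-out cross-validation: each case is scored by a model
# trained on all remaining cases.
scores = cross_val_predict(model, X, y, cv=LeaveOneOut(),
                           method="predict_proba")[:, 1]

print("AUC:", roc_auc_score(y, scores))
```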

  3. Security in hybrid cloud computing

    OpenAIRE

    Koudelka, Ondřej

    2016-01-01

    This bachelor thesis deals with the area of hybrid cloud computing, specifically with its security. The major aim of the thesis is to analyze and compare the chosen hybrid cloud providers. As a minor aim, the thesis compares the security challenges of the hybrid cloud with those of other deployment models. In order to accomplish these aims, the thesis defines the terms cloud computing and hybrid cloud computing in its theoretical part. Furthermore the security challenges for cloud computing a...

  4. Security Dynamics of Cloud Computing

    OpenAIRE

    Khan, Khaled M.

    2009-01-01

    This paper explores various dimensions of cloud computing security. It argues that security concerns of cloud computing need to be addressed from the perspective of the individual stakeholder. The security focus of cloud computing is essentially different in terms of its characteristics and business model. The conventional way of viewing as well as addressing security, such as 'bolting it on' on top of cloud computing, may not work well. The paper attempts to portray the security spectrum necessary for...

  5. Review on Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    P. Sumithra

    2017-03-01

    Computational electromagnetics (CEM) is applied to model the interaction of electromagnetic fields with objects such as antennas, waveguides, aircraft and their environment using Maxwell's equations. In this paper the strengths and weaknesses of various computational electromagnetic techniques are discussed. The performance of various techniques in terms of accuracy, memory and computational time for application-specific tasks, such as modeling RCS (radar cross section), space applications, thin wires and antenna arrays, is presented in this paper.

  6. Personal values, subjective well-being and destination-loyalty intention of international students

    OpenAIRE

    Jamaludin, N. L.; Sam, D. L.; Sandal, G. M.; Adam, A. A.

    2016-01-01

    What are the factors that predict international students' destination-loyalty intention? This is the main question this paper addresses, using an online survey among 396 international students (short-term, N = 182; long-term, N = 214) at a Norwegian university. Structural equation modelling (AMOS) was conducted to examine relationships among personal values, subjective well-being and destination-loyalty intentions. The results showed that: (1) universalism was positively related to subjective ...

  7. Computer systems: What the future holds

    Science.gov (United States)

    Stone, H. S.

    1976-01-01

    Development of computer architecture is discussed in terms of the proliferation of the microprocessor, the utility of the medium-scale computer, and the sheer computational power of the large-scale machine. Changes in new applications brought about by ever-lowering costs, smaller sizes, and faster switching times are included.

  8. EXAMINATION OF THE COMPUTATIONAL THINKING SKILLS OF STUDENTS

    Directory of Open Access Journals (Sweden)

    Agah Tugrul Korucu

    2017-01-01

    Computational thinking is generally considered a kind of analytical way of thinking. According to Wing (2008), it shares with mathematical thinking, engineering thinking and scientific thinking the general ways in which we may solve a problem, design and evaluate complex systems, or understand computability and intelligence as well as the mind and human behaviour. It is generally accepted that, like higher-order thinking skills, this analytical way of thinking should be taught to children at a very early age. The aim of this study is to investigate the computational thinking skills of secondary school students in terms of different variables. The study group of the research is 160 secondary school students who continue their education at different levels in Konya. The "Computational Thinking Skills Scale", developed by Korkmaz, Çakır and Özden (2015), was used for data collection. The scale includes 22 items and is a 5-point Likert-type scale. The Cronbach alpha reliability of the scale was calculated as 0.80, and as a result of the analysis it was found valid for measuring the computational thinking skill levels of secondary school students. As a result of this research, the computational thinking skill levels of participants differ meaningfully in terms of their class levels, do not differ meaningfully in terms of their genders, do not differ meaningfully in terms of their weekly internet usage durations, do not differ meaningfully in terms of their mobile device usage competence, and differ meaningfully in terms of how long they have possessed mobile technologies.

  9. Determining the frequency of dry eye in computer users and comparing with control group

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Davari

    2017-08-01

    AIM: To determine the frequency of dry eye in computer users and to compare it with a control group. METHODS: This study was case-control research conducted in 2015 in the city of Birjand. The sample size was estimated at 304 subjects (152 subjects in each group, a computer user group and a control group). A non-randomized method of sampling was used in both groups. The Schirmer test was used to evaluate dry eye. Subjects then completed a questionnaire developed on the basis of the study objectives and a review of the literature. After collection, the data were entered into SPSS software and analyzed using the chi-square test or Fisher's test at an alpha level of 0.05. RESULTS: In total, 304 subjects (152 in each group) were included in the study. The frequency of dry eye was 3.3% (5 subjects) in the control group and 61.8% (94 subjects) in the computer user group, a significant difference between the two groups (P …). … (n = 12) … and 34.2% in the computer user group (n = 52), a significant difference between the two groups (P …). … (P = 0.8). The mean daily working time with a computer was 6.65 ± 3.52 h in subjects with dry eye and 1.62 ± 2.54 h in the healthy group (T = 13.25, P …). CONCLUSION: This study showed a significant relationship between computer use and dry eye and ocular symptoms. It is therefore necessary for officials to pay particular attention to employees' working hours with computers and to develop appropriate plans for distributing those hours among computer users. However, due to various confounding factors, it is recommended that these factors be controlled in future studies.

  10. 1994 CERN school of computing. Proceedings

    International Nuclear Information System (INIS)

    Vandoni, C.E.; Verkerk, C.

    1995-01-01

    These Proceedings contain a written account of the majority of the lectures given at the 1994 CERN School of Computing. A number of lectures dealt with general aspects of computing, in particular in the areas of high performance computing in embedded systems, distributed and heterogeneous computing, multimedia information systems and the impact of computing on High Energy Physics. Modelling and Simulation were treated with emphasis on Statistical and High Energy Physics, and a simulation package (GEANT) and its future development were presented in detail. Hardware aspects were presented, in particular in the areas of massively parallel associative string processors and CISC vs RISC processor architectures, and a summary of an analogic supercomputer chip architecture was given. The software development process and associated technologies were the subject of full presentations. Software for Data Acquisition Systems was discussed in a number of lectures. We also reproduce, as an appendix, a set of self-explanatory transparencies used by one lecturer in a particularly detailed presentation of this subject. The H1 trigger system was presented in detail. Finally, lectures were given on a parallel program supervisor and parallel language processing generation. (orig.)

  11. Computer-communication networks

    CERN Document Server

    Meditch, James S

    1983-01-01

    Computer-Communication Networks presents a collection of articles the focus of which is on the field of modeling, analysis, design, and performance optimization. It discusses the problem of modeling the performance of local area networks under file transfer. It addresses the design of multi-hop, mobile-user radio networks. Some of the topics covered in the book are the distributed packet switching queuing network design, some investigations on communication switching techniques in computer networks and the minimum hop flow assignment and routing subject to an average message delay constraint.

  12. Mid-term results of off-pump coronary artery bypass grafting assessed by multi-slice computed tomography

    International Nuclear Information System (INIS)

    Yoshida, Seijiro; Nitta, Yoshio; Oda, Katsuhiko

    2004-01-01

    Off-pump coronary artery bypass (OPCAB) has recently increased in popularity, but the long-term results are still unknown. We evaluated the mid-term results of OPCAB surgery using multi-slice computed tomography (MSCT), which is a non-invasive postoperative evaluation method. Thirty-one consecutive patients who underwent OPCAB surgery at least 2 years prior to the study were selected. Patient age ranged from 50 to 79 years (66.9±6.5) and the ratio of men to women was 26:5. Coronary angiography was performed in all patients at 2 weeks postoperatively. The follow-up was complete, and mean follow-up was 30.9 months. There were no hospital deaths and 1 non-cardiac late death. The graft patency rate in coronary angiography was left internal thoracic artery (LITA) 30/30 (100%), right internal thoracic artery (RITA) 2/2 (100%), radial artery (RA) 14/15 (93%), saphenous vein graft (SVG) 15/17 (88%). No graft became occluded on the MSCT study and all patients have been angina-free during the follow-up period. We suggest that OPCAB is feasible in most patients with good patency and low mortality. MSCT is an effective follow-up method for the morphological findings and noninvasive quantitative evaluation of the bypass grafts. (author)

  13. System reliability analysis with natural language and expert's subjectivity

    International Nuclear Information System (INIS)

    Onisawa, T.

    1996-01-01

    This paper introduces natural language expressions and expert's subjectivity to system reliability analysis. To this end, this paper defines a subjective measure of reliability and presents the method of the system reliability analysis using the measure. The subjective measure of reliability corresponds to natural language expressions of reliability estimation, which is represented by a fuzzy set defined on [0,1]. The presented method deals with the dependence among subsystems and employs parametrized operations of subjective measures of reliability which can reflect expert's subjectivity towards the analyzed system. The analysis results are also expressed by linguistic terms. Finally this paper gives an example of the system reliability analysis by the presented method

  14. Handbook of mechanical engineering terms

    CERN Document Server

    Ramalingam, KK

    2009-01-01

    About the Book: The Handbook of Mechanical Engineering terms contains short, precise definitions of about four thousand terms. These terms have been collected from different sources, edited and grouped under twenty six parts and given alphabetically under each part for easy reference. The book will be a source of guidance and help to the students, staff and practising engineers in understanding and updating the subject matter.

  15. Exact Symbol Error Probability of Square M-QAM Signaling over Generalized Fading Channels subject to Additive Generalized Gaussian Noise

    KAUST Repository

    Soury, Hamza

    2013-07-01

    This paper considers the average symbol error probability of square Quadrature Amplitude Modulation (QAM) coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox H function and the bivariate Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading, Nakagami-m fading, and Rayleigh fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer based simulations for a variety of fading and additive noise parameters.
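
    For context, the underlying computation is the standard averaging of the conditional symbol error probability over the fading distribution; the well-known Gaussian-noise special case for square M-QAM is written out below as a reference point. The paper's generalized Gaussian noise result replaces these Q-function terms with Fox H-function expressions, which are not reproduced here.

```latex
% Average SEP obtained by averaging the conditional SEP over the SNR pdf p_gamma:
\[
  \bar{P}_s \;=\; \int_{0}^{\infty} P_s(\gamma)\, p_{\gamma}(\gamma)\, d\gamma ,
\]
% Classical conditional SEP of square M-QAM over AWGN (Gaussian-noise special case):
\[
  P_s(\gamma) \;=\; 4\Bigl(1-\tfrac{1}{\sqrt{M}}\Bigr) Q\!\Bigl(\sqrt{\tfrac{3\gamma}{M-1}}\Bigr)
  \;-\; 4\Bigl(1-\tfrac{1}{\sqrt{M}}\Bigr)^{2} Q^{2}\!\Bigl(\sqrt{\tfrac{3\gamma}{M-1}}\Bigr).
\]
```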

  16. Zhang neural network for online solution of time-varying convex quadratic program subject to time-varying linear-equality constraints

    International Nuclear Information System (INIS)

    Zhang Yunong; Li Zhan

    2009-01-01

    In this Letter, following Zhang et al.'s method, a recurrent neural network (termed a Zhang neural network, ZNN) is developed and analyzed for online solution of the time-varying convex quadratic-programming problem subject to time-varying linear-equality constraints. Different from conventional gradient-based neural networks (GNN), such a ZNN model makes full use of the time-derivative information of the time-varying coefficients. The resultant ZNN model is theoretically proved to converge globally and exponentially to the time-varying theoretical optimal solution of the investigated time-varying convex quadratic program. Computer-simulation results further substantiate the effectiveness, efficiency and novelty of such a ZNN model and method.
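
    A sketch of the standard ZNN construction for this problem class, written out from the general Zhang-et-al. design recipe rather than taken verbatim from the Letter (the notation below is assumed): the equality-constrained QP is reduced to a time-varying linear system via its KKT conditions, and an exponentially decaying error dynamic is imposed on that system.

```latex
% Time-varying convex QP with linear-equality constraints (assumed notation):
%   minimize  (1/2) x(t)^T P(t) x(t) + q(t)^T x(t)   subject to  A(t) x(t) = b(t).
% Its KKT conditions give a time-varying linear system W(t) y(t) = g(t):
\[
  \underbrace{\begin{bmatrix} P(t) & A^{T}(t) \\ A(t) & 0 \end{bmatrix}}_{W(t)}
  \underbrace{\begin{bmatrix} x(t) \\ \lambda(t) \end{bmatrix}}_{y(t)}
  =
  \underbrace{\begin{bmatrix} -q(t) \\ b(t) \end{bmatrix}}_{g(t)} .
\]
% ZNN design: define the error E(t) = W(t) y(t) - g(t) and impose \dot{E} = -\gamma \Phi(E),
% which yields the implicit neural dynamics
\[
  W(t)\,\dot{y}(t) \;=\; -\dot{W}(t)\,y(t) \;+\; \dot{g}(t)
  \;-\; \gamma\,\Phi\bigl(W(t)\,y(t)-g(t)\bigr), \qquad \gamma > 0 .
\]
```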

  17. Differences in the verbal fluency, working memory and executive functions in alcoholics: Short-term vs. long-term abstainers.

    Science.gov (United States)

    Nowakowska-Domagała, Katarzyna; Jabłkowska-Górecka, Karolina; Mokros, Łukasz; Koprowicz, Jacek; Pietras, Tadeusz

    2017-03-01

    The aim of the study was to assess differences in verbal fluency, working memory and executive functions in two subgroups of alcohol-dependent patients, those undergoing short-term abstinence (STA) and those undergoing long-term abstinence (LTA), and to compare the level of cognitive functions in patients after long-term abstinence with that of healthy subjects. The study group consisted of 106 alcohol-dependent patients (53 shortly after drinking, with at least 3 days of abstinence, and 53 with at least one year of abstinence). The control group comprised 53 subjects, whose age, sex and education levels matched those of the patients in the experimental group. The dependence intensity was assessed using the SADD and MAST scales. The neuropsychological assessment was based on the FAS Test, Stroop Test and TMT A&B Test. The results obtained for alcohol-dependent patients revealed significant disturbances of cognitive functions. Such results indicate the presence of severe frontal cerebral cortex dysfunctions. Frontal cortex dysfunctions affecting the verbal fluency and working memory subsystems and the executive functions also persisted during long-term abstinence periods. No significant correlations between the duration of dependence, quantity of alcohol consumed and efficiency of the working memory and executive functions were observed in alcohol-dependent subjects after short-term or long-term abstinence. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  18. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  19. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Sarah; Devenish, Robin [Nuclear Physics Laboratory, Oxford University (United Kingdom)

    1989-07-15

    Computing in high energy physics has changed over the years from being something one did on a slide-rule, through early computers, then a necessary evil to the position today where computers permeate all aspects of the subject from control of the apparatus to theoretical lattice gauge calculations. The state of the art, as well as new trends and hopes, were reflected in this year's 'Computing In High Energy Physics' conference held in the dreamy setting of Oxford's spires. The conference aimed to give a comprehensive overview, entailing a heavy schedule of 35 plenary talks plus 48 contributed papers in two afternoons of parallel sessions. In addition to high energy physics computing, a number of papers were given by experts in computing science, in line with the conference's aim – 'to bring together high energy physicists and computer scientists'.

  20. The effect of teacher interpersonal behaviour on students' subject-specific motivation

    NARCIS (Netherlands)

    den Brok, P.; Levy, J.; Brekelmans, M.; Wubbels, Th.

    2006-01-01

    This study brings together insights from research on teaching and learning in specific subjects, learning environments research and effectiveness research by linking teacher interpersonal behaviour to students’ subject-related attitudes. Teaching was studied in terms of a model originating from

  1. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  2. MINI-THESAURUS, Energy Data Base Subject Thesaurus Generator

    International Nuclear Information System (INIS)

    Paulk, J.W.

    2003-01-01

    1 - Description of program or function: MINI-THESAURUS allows the user to subset into highly-specialized 'mini-thesauri' the Energy Data Base (EDB) Subject Thesaurus, which contains the standard vocabulary of indexing terms (descriptors) developed and structured by the Office of Scientific and Technical Information (OSTI) for the building and maintenance of the U.S. Department of Energy (DOE) energy information databases. This structured vocabulary reflects the scope of DOE's research, development, and technological programs and encompasses terminology derived not only from the basic sciences but also from the areas of energy, conservation, safety, environmental impact, and regulation. Entire word blocks may be copied from the primary Subject Thesaurus, from another mini-thesaurus, or both, and subsequently modified through the addition of new terms, the deletion of existing terms, and changes to the internal relationships among the word blocks within the mini-thesaurus to create a new, special-purpose thesaurus. MINI-THESAURUS also provides the ability to copy the entire Subject Thesaurus and to treat the copy as a mini-thesaurus, thus allowing one to examine the effects of major changes to the thesaurus structure without having to modify the primary, on-line Thesaurus. The copy operation also optimizes the Subject Thesaurus structure. An interactive user having update privileges for a specific mini-thesaurus and access to the TeX and PostScript proprietary software can produce the mini-thesaurus in printed publication format. Once the mini-thesaurus has been published, periodic supplements may be generated based on date of entry or change maintained by the Thesaurus software. 2 - Restrictions on the complexity of the problem: The system enforces the OSTI rules for Thesaurus development.

  3. QUALITY ASSURANCE FOR CLOUD COMPUTING

    OpenAIRE

    Sumaira Aslam; Hina Shahid

    2016-01-01

    Cloud computing is the greatest and latest thing. Marketers at many big companies use cloud computing terms in their marketing campaigns to make them seem impressive, so that they can attract clients and customers. Cloud computing is, overall, a philosophy and design concept, and it is at once much more complicated and yet much simpler. The basic underlying thing that cloud computing does is to separate the applications from the operating systems, from the software, and from the hardware that runs eve...

  4. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Transport Protocol (Transmission Control Protocol/User Datagram Protocol [TCP/UDP]) Analysis

    Science.gov (United States)

    2015-09-01

    Recoverable fragments of the report's data dictionary: Mac8 - Medium Access Control (MAC) (Ethernet) address observed as destination for outgoing packets; subsessionid8 - zero-based index of ...; (row index field) - integer index of row; cts_deid - device (instrument) identifier where the observation took place; cts_collpt - collection point or logical observation point on the network. Subject terms: tactical networks, data reduction, high-performance computing, data analysis, big data.

  5. Long-term incidence of serious fall-related injuries after bariatric surgery in Swedish obese subjects.

    Science.gov (United States)

    Carlsson, Lena M S; Sjöholm, Kajsa; Ahlin, Sofie; Jacobson, Peter; Andersson-Assarsson, Johanna C; Karlsson Lindahl, Linda; Maglio, Cristina; Karlsson, Cecilia; Hjorth, Stephan; Taube, Magdalena; Carlsson, Björn; Svensson, Per-Arne; Peltonen, Markku

    2018-05-24

    Obesity increases risk of falling, but the effect of bariatric surgery on fall-related injuries is unknown. The aim of this study was therefore to study the association between bariatric surgery and long-term incidence of fall-related injuries in the prospective, controlled Swedish Obese Subjects study. At inclusion, body mass index was ≥34 kg/m2 in men and ≥38 kg/m2 in women. The surgery per-protocol group (n = 2007) underwent gastric bypass (n = 266), banding (n = 376), or vertical banded gastroplasty (n = 1365), and controls (n = 2040) received usual care. At the time of analysis (31 December 2013), median follow-up was 19 years (maximal 26 years). Fall-related injuries requiring hospital treatment were captured using data from the Swedish National Patient Register. During follow-up, there were 617 first-time fall-related injuries in the surgery group and 513 in the control group (adjusted hazard ratio 1.21, 95% CI, 1.07-1.36; P = 0.002). The incidence differed between treatment groups (P < 0.001, log-rank test) and was higher after gastric bypass than after usual care, banding and vertical banded gastroplasty (adjusted hazard ratio 0.50-0.52, P < 0.001 for all three comparisons). In conclusion, gastric bypass surgery was associated with increased risk of serious fall-related injury requiring hospital treatment.

  6. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  7. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

    Science.gov (United States)

    Ehrlich, Heyward

    1991-01-01

    Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

  8. A brain computer interface-based explorer.

    Science.gov (United States)

    Bai, Lijuan; Yu, Tianyou; Li, Yuanqing

    2015-04-15

    In recent years, various applications of brain computer interfaces (BCIs) have been studied. In this paper, we present a hybrid BCI combining P300 and motor imagery to operate an explorer. Our system is mainly composed of a BCI mouse, a BCI speller and an explorer. Through this system, the user can access his computer and manipulate (open, close, copy, paste, and delete) files such as documents, pictures, music, movies and so on. The system has been tested with five subjects, and the experimental results show that the explorer can be successfully operated according to subjects' intentions. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Correction: Cecotti, H. and Rivet, B. Subject Combination and Electrode Selection in Cooperative Brain-Computer Interface Based on Event Related Potentials. Brain Sci. 2014, 4, 335–355

    Directory of Open Access Journals (Sweden)

    Hubert Cecotti

    2014-09-01

    The authors wish to make the following correction to this paper (Cecotti, H.; Rivet, B. Subject Combination and Electrode Selection in Cooperative Brain-Computer Interface Based on Event Related Potentials. Brain Sci. 2014, 4, 335–355): Due to an internal error, the reference numbers in the original published paper were not shown; the error was not due to the authors. The former main text should be replaced as below.

  10. Gender disparities in the association between epicardial adipose tissue volume and coronary atherosclerosis: A 3-dimensional cardiac computed tomography imaging study in Japanese subjects

    Directory of Open Access Journals (Sweden)

    Dagvasumberel Munkhbaatar

    2012-09-01

    Background: Growing evidence suggests that epicardial adipose tissue (EAT) may contribute to the development of coronary artery disease (CAD). In this study, we explored gender disparities in EAT volume (EATV) and its impact on coronary atherosclerosis. Methods: The study population consisted of 90 consecutive subjects (age: 63 ± 12 years; men: 47, women: 43) who underwent 256-slice multi-detector computed tomography (MDCT) coronary angiography. EATV was measured as the sum of the cross-sectional epicardial fat areas on CT images, from the lower surface of the left pulmonary artery origin to the apex. Subjects were segregated into the CAD group (coronary luminal narrowing > 50%) and the non-CAD group. Results: EATV/body surface area (BSA) was higher among men in the CAD group than in the non-CAD group (62 ± 13 vs. 33 ± 10 cm3/m2, p …) … cm3/m2, not significant). Multivariate logistic analysis showed that EATV/BSA was the single predictor for >50% coronary luminal narrowing in men (p …). Conclusions: Increased EATV is strongly associated with coronary atherosclerosis in men.
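
    The volume measurement described (summing cross-sectional fat areas across CT slices) amounts to multiplying each slice's segmented fat area by the slice spacing; a minimal sketch, with invented areas and spacing rather than values from the study, is:

```python
# Hypothetical per-slice epicardial fat areas (cm^2) segmented on contiguous
# axial CT slices between the left pulmonary artery origin and the apex.
fat_areas_cm2 = [4.8, 5.6, 6.1, 7.0, 6.4, 5.2, 3.9]
slice_thickness_cm = 0.3  # assumed reconstruction spacing

# Summing area x thickness over slices approximates the EAT volume (cm^3).
eatv_cm3 = sum(area * slice_thickness_cm for area in fat_areas_cm2)

body_surface_area_m2 = 1.8  # assumed BSA used for indexing
print(f"EATV = {eatv_cm3:.1f} cm^3, "
      f"EATV/BSA = {eatv_cm3 / body_surface_area_m2:.1f} cm^3/m^2")
```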

  11. WHY ADULTS LEARN: INTERPRETING ADULTS’ REASONS TO PARTICIPATE IN EDUCATION IN TERMS OF ECCLES’ SUBJECTIVE TASK VALUE

    Directory of Open Access Journals (Sweden)

    Julia Gorges

    2016-01-01

    Psychological research shows that subjective task value, a basic component of expectancy-value theory as outlined by Eccles, predicts task choice (e.g., going to graduate school). However, Eccles' approach has not been used to investigate adult learning so far. Therefore, the present study investigated a specific form of subjective task value and task choice, namely adults' subjective task value of participation in education. Based on expectancy-value theory, qualitative content analyses of 16 interviews with adult learners (aged between 21 and 67) from varying age groups and educational backgrounds show a differentiation of positive value according to points of reference and a revised conceptualisation of cost as an independent component of subjective task value with four subcomponents. Apparently people estimate positive value and cost separately at first and only later weigh these components against each other to arrive at an overall evaluation of subjective task value, which, in turn, predicts participation in education. Moreover, results suggest a distinction between anticipated subjective task value prior to participation and subjective task value based on experience (i.e., in hindsight). Benefits of using expectancy-value theory for future research on adults' participation in education are discussed.

  12. Digital design and computer architecture

    CERN Document Server

    Harris, David

    2010-01-01

    Digital Design and Computer Architecture is designed for courses that combine digital logic design with computer organization/architecture or that teach these subjects as a two-course sequence. Digital Design and Computer Architecture begins with a modern approach by rigorously covering the fundamentals of digital logic design and then introducing Hardware Description Languages (HDLs). Featuring examples of the two most widely-used HDLs, VHDL and Verilog, the first half of the text prepares the reader for what follows in the second: the design of a MIPS Processor. By the end of D

  13. Optimal Multitrial Prediction Combination and Subject-Specific Adaptation for Minimal Training Brain Switch Designs

    NARCIS (Netherlands)

    Spyrou, L.; Blokland, Y.M.; Farquhar, J.D.R.; Bruhn, J.

    2016-01-01

    Brain-Computer Interface (BCI) systems are traditionally designed by taking into account user-specific data to enable practical use. More recently, subject independent (SI) classification algorithms have been developed which bypass the subject specific adaptation and enable rapid use of the system.

  14. Optimal multitrial prediction combination and subject-specific adaptation for minimal training brain switch designs

    NARCIS (Netherlands)

    Spyrou, L.; Blokland, Y.M.; Farquhar, J.D.R.; Bruhn, J.

    2016-01-01

    Brain-Computer Interface systems are traditionally designed by taking into account user-specific data to enable practical use. More recently, subject independent (SI) classification algorithms have been developed which bypass the subject specific adaptation and enable rapid use of the system. A

  15. Subjective cognitive decline: The first clinical manifestation of Alzheimer's disease?

    Directory of Open Access Journals (Sweden)

    Adalberto Studart Neto

    Background: Mild cognitive impairment is considered the first clinical manifestation of Alzheimer's disease (AD), when the individual exhibits below-normal performance on standardized neuropsychological tests. However, some subjects already have a subjective memory complaint before showing lower performance on cognitive assessments. Objective: A review of subjective cognitive decline, its association with AD biomarkers, and the risk of conversion to dementia. Methods: We performed a comprehensive non-systematic review on PubMed. The keywords used in the search were terms related to subjective cognitive decline. Results: Subjective cognitive decline is characterized by self-experienced deterioration in cognitive performance not detected objectively through formal neuropsychological testing. However, various terms and definitions have been used in the literature, and the lack of a widely accepted concept hampers comparison of studies. Epidemiological data have shown that individuals with subjective cognitive decline are at increased risk of progression to AD dementia. In addition, there is evidence that this group has a higher prevalence of positive biomarkers for amyloidosis and neurodegeneration. However, Alzheimer's disease is not the only cause of subjective cognitive decline, and various other conditions can be associated with subjective memory complaints, such as psychiatric disorders or normal aging. The features suggestive of a neurodegenerative disorder are: onset of decline within the last five years, age at onset above 60 years, associated concerns about decline, and confirmation by an informant. Conclusion: These findings support the idea that subjective cognitive complaints may be an early clinical marker that precedes mild cognitive impairment due to Alzheimer's disease.

  16. SuperB R&D computing program: HTTP direct access to distributed resources

    Science.gov (United States)

    Fella, A.; Bianchi, F.; Ciaschini, V.; Corvo, M.; Delprete, D.; Diacono, D.; Di Simone, A.; Franchini, P.; Donvito, G.; Giacomini, F.; Gianoli, A.; Longo, S.; Luitz, S.; Luppi, E.; Manzali, M.; Pardi, S.; Perez, A.; Rama, M.; Russo, G.; Santeramo, B.; Stroili, R.; Tomassetti, L.

    2012-12-01

    The SuperB asymmetric-energy e+e- collider and detector, to be built at the newly founded Nicola Cabibbo Lab, will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab^-1 and a luminosity target of 10^36 cm^-2 s^-1. Increasing network performance, including in the Wide Area Network (WAN) environment, and the capability to read data remotely with good efficiency are providing new possibilities and opening new scenarios in the data access field. Subjects like data access and data availability in a distributed environment are key points in the definition of the computing model for an HEP experiment like SuperB. R&D efforts in this field have been carried out during the last year in order to release the Computing Technical Design Report by 2013. WAN direct access to data has been identified as one of the most interesting viable options; robust and reliable protocols such as HTTP/WebDAV and xrootd are the subjects of a specific R&D line in a mid-term scenario. In this work we present the R&D results obtained in the study of new data access technologies for typical HEP use cases, focusing on specific protocols such as HTTP and WebDAV in Wide Area Network scenarios. Efficiency, performance and reliability tests performed in a data analysis context are reported. Future R&D plans include comparison tests of the HTTP and xrootd protocols in terms of performance, efficiency, security and available features.
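
    As a purely illustrative sketch of the kind of WAN data access discussed above (not the SuperB tooling itself), the snippet below fetches a byte range of a remote file over HTTP; the URL, byte range and use of the Python requests library are assumptions made for the example.

```python
# Minimal sketch of WAN-style partial data access over HTTP (illustrative only;
# the URL and byte range are placeholders, not SuperB resources).
import requests

def read_remote_chunk(url: str, start: int, length: int) -> bytes:
    """Fetch `length` bytes starting at byte `start` via an HTTP Range request."""
    headers = {"Range": f"bytes={start}-{start + length - 1}"}
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()  # a successful partial read returns 206 Partial Content
    return resp.content

if __name__ == "__main__":
    chunk = read_remote_chunk("https://example.org/data/events.root", 0, 1024)
    print(f"read {len(chunk)} bytes")
```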

  17. Robotics as an integration subject in the computer science university studies. The experience of the University of Almeria

    Directory of Open Access Journals (Sweden)

    Manuela Berenguel Soria

    2012-11-01

    Full Text Available This work presents a global view of the role of robotics in computer science studies, mainly in university degrees. The main motivations for the use of robotics in these studies are the following: robotics makes it possible to put many fundamental computer science topics into practice, it is a multidisciplinary area that rounds out the basic knowledge of any computer science student, it facilitates the practice and learning of basic engineering competences (for instance, teamwork), and there is a wide market looking for people with robotics knowledge. These ideas are discussed from our own experience at the University of Almeria, acquired through the Computer Science Technical Engineering, Computer Science Engineering, Computer Science Degree and Computer Science Postgraduate programmes.

  18. Scientific and General Subject Classifications in the Digital World

    CERN Document Server

    De Robbio, Antonella; Marini, A

    2001-01-01

    In the present work we discuss opportunities, problems, tools and techniques encountered when interconnecting discipline-specific subject classifications, primarily organized as search devices in bibliographic databases, with general classifications originally devised for book shelving in public libraries. We first state the fundamental distinction between topical (or subject) classifications and object classifications. Then we trace the structural limitations that have constrained subject classifications since their library origins, and the devices that were used to overcome the gap with genuine knowledge representation. After recalling some general notions on structure, dynamics and interferences of subject classifications and of the objects they refer to, we sketch a synthetic overview on discipline-specific classifications in Mathematics, Computing and Physics, on one hand, and on general classifications on the other. In this setting we present The Scientific Classifications Page, which collects groups of...

  19. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is the design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasing attention due to their growing complexity and usability. Software...... packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows so does the need for efficient experimental designs...... and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  20. Computer Science and Technology Publications. NBS Publications List 84.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  1. Computers and internet in dental education system of Kerala, South India: A multicentre study

    Directory of Open Access Journals (Sweden)

    Kanakath Harikumar

    2015-01-01

    Full Text Available Computers and the internet have exerted a tremendous effect on dental education programs all over the world. A multicenter study was done to assess trends in computer and internet usage among dental students and faculty members across the South Indian state of Kerala. A total of 347 subjects participated in the study. All participants were highly competent in the use of computers and the internet. 72.3% of the study subjects preferred hard-copy textbooks to PDF-format books. 81.3% of the study subjects thought that the internet was a useful adjunct to dental education. 73.8% of the study subjects opined that computers and the internet could never replace conventional classroom teaching. Efforts should be made to provide greater computer and internet infrastructure, such as Wi-Fi and free, unlimited internet access, for all students and faculty members.

  2. Computing in high energy physics

    International Nuclear Information System (INIS)

    Smith, Sarah; Devenish, Robin

    1989-01-01

    Computing in high energy physics has changed over the years from being something one did on a slide-rule, through early computers, then a necessary evil to the position today where computers permeate all aspects of the subject from control of the apparatus to theoretical lattice gauge calculations. The state of the art, as well as new trends and hopes, were reflected in this year's 'Computing In High Energy Physics' conference held in the dreamy setting of Oxford's spires. The conference aimed to give a comprehensive overview, entailing a heavy schedule of 35 plenary talks plus 48 contributed papers in two afternoons of parallel sessions. In addition to high energy physics computing, a number of papers were given by experts in computing science, in line with the conference's aim – 'to bring together high energy physicists and computer scientists'

  3. Analysis of iterative region-of-interest image reconstruction for x-ray computed tomography.

    Science.gov (United States)

    Sidky, Emil Y; Kraemer, David N; Roth, Erin G; Ullberg, Christer; Reiser, Ingrid S; Pan, Xiaochuan

    2014-10-03

    One of the challenges for iterative image reconstruction (IIR) is that such algorithms solve an imaging model implicitly, requiring a complete representation of the scanned subject within the viewing domain of the scanner. This requirement can place a prohibitively high computational burden for IIR applied to x-ray computed tomography (CT), especially when high-resolution tomographic volumes are required. In this work, we aim to develop an IIR algorithm for direct region-of-interest (ROI) image reconstruction. The proposed class of IIR algorithms is based on an optimization problem that incorporates a data fidelity term, which compares a derivative of the estimated data with the available projection data. In order to characterize this optimization problem, we apply it to computer-simulated two-dimensional fan-beam CT data, using both ideal noiseless data and realistic data containing a level of noise comparable to that of the breast CT application. The proposed method is demonstrated for both complete field-of-view and ROI imaging. To demonstrate the potential utility of the proposed ROI imaging method, it is applied to actual CT scanner data.
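
    As a rough intuition for the data-fidelity-driven iteration described above (not the authors' algorithm, which works on a derivative of the data and supports ROI reconstruction), a generic least-squares iterative reconstruction can be sketched as follows, with a toy random system standing in for the CT forward model.

```python
# Toy iterative image reconstruction: minimize ||A x - b||^2 by gradient descent.
# A stands in for the CT forward projector and b for the measured projections.
import numpy as np

def iterative_reconstruction(A, b, n_iters=200, step=None):
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step size from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        residual = A @ x - b                     # data-fidelity mismatch
        x -= step * (A.T @ residual)             # gradient step on 0.5*||Ax - b||^2
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 40))                    # 60 "measurements" of a 40-pixel object
x_true = rng.normal(size=40)
b = A @ x_true + 0.01 * rng.normal(size=60)
x_hat = iterative_reconstruction(A, b)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```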

  4. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  5. Enhancing multilingual latent semantic analysis with term alignment information.

    Energy Technology Data Exchange (ETDEWEB)

    Chew, Peter A.; Bader, Brett William

    2008-08-01

    Latent Semantic Analysis (LSA) is based on the Singular Value Decomposition (SVD) of a term-by-document matrix for identifying relationships among terms and documents from co-occurrence patterns. Among the multiple ways of computing the SVD of a rectangular matrix X, one approach is to compute the eigenvalue decomposition (EVD) of a square composite matrix with a 2 x 2 block structure: X and X^T in the off-diagonal blocks and zero matrices in the diagonal blocks. We point out that significant value can be added to LSA by filling in some of the values in the diagonal blocks (corresponding to explicit term-to-term or document-to-document associations) and computing a term-by-concept matrix from the EVD. For the case of multilingual LSA, we incorporate information on cross-language term alignments of the same sort used in Statistical Machine Translation (SMT). Since all elements of the proposed EVD-based approach can rely entirely on lexical statistics, hardly any price is paid for the improved empirical results. In particular, the approach, like LSA or SMT, can still be generalized to virtually any language(s); computation of the EVD takes similar resources to that of the SVD since all the blocks are sparse; and the results of EVD are just as economical as those of SVD.
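
    The block construction described above can be sketched in a few lines of NumPy; the toy term-by-document matrix and the zero association blocks below are illustrative assumptions, not the authors' data or code.

```python
# EVD of a 2 x 2 block composite matrix as a route to LSA-style concepts.
# The term-by-document matrix X and the association blocks are toy values.
import numpy as np

X = np.array([[2., 0., 1.],
              [0., 3., 1.],
              [1., 1., 0.]])          # rows = terms, columns = documents
n_terms, n_docs = X.shape

term_assoc = np.zeros((n_terms, n_terms))   # could hold cross-language alignments
doc_assoc = np.zeros((n_docs, n_docs))      # zero blocks recover the plain SVD case

composite = np.block([[term_assoc, X],
                      [X.T,        doc_assoc]])

eigvals, eigvecs = np.linalg.eigh(composite)     # symmetric eigenvalue decomposition
order = np.argsort(-np.abs(eigvals))             # strongest "concepts" first
k = 2
term_by_concept = eigvecs[:n_terms, order[:k]]   # term-by-concept matrix
print(term_by_concept)
```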

  6. Introducing Handheld Computing for Interactive Medical Education

    Directory of Open Access Journals (Sweden)

    Joseph Finkelstein

    2005-04-01

    Full Text Available The goals of this project were: (1) development of an interactive multimedia medical education tool (CO-ED) utilizing modern features of handheld computing (PDA) and major constructs of adult learning theories, and (2) pilot testing of the computer-assisted education in residents and clinicians. Comparison of the knowledge scores using a paired t-test demonstrated a statistically significant increase in subject knowledge (p<0.01) after using CO-ED. Attitudinal surveys were analyzed by calculating a total score (TS) represented as a percentage of the maximal possible score. The mean TS was 74.5±7.1%. None of the subjects (N=10) had a TS less than 65%, and in half of the subjects (N=5) the TS was higher than 75%. Analysis of the semi-structured in-depth interviews showed strong support among the study subjects for using PDAs as an educational tool, and high acceptance of the CO-ED user interface. We concluded that PDAs have significant potential as a tool for clinician education.
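
    A paired t-test of the kind reported above can be reproduced in a few lines; the pre/post scores below are invented for illustration, not the study's data.

```python
# Paired (pre/post) t-test on hypothetical knowledge scores for 10 subjects.
from scipy import stats

pre  = [55, 60, 48, 70, 62, 58, 65, 50, 72, 61]
post = [68, 74, 60, 80, 75, 66, 78, 64, 85, 70]

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```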

  7. Computing in high energy physics

    International Nuclear Information System (INIS)

    Hertzberger, L.O.; Hoogland, W.

    1986-01-01

    This book deals with advanced computing applications in physics, and in particular in high energy physics environments. The main subjects covered are networking; vector and parallel processing; and embedded systems. Also examined are topics such as operating systems, future computer architectures and commercial computer products. The book presents solutions foreseen to cope with future computing problems in experimental and theoretical High Energy Physics. In the experimental environment, the large amounts of data to be processed pose special problems both on-line and off-line. For on-line data reduction, embedded special-purpose computers, which are often used for trigger applications, are applied. For off-line processing, parallel computers such as emulator farms and the cosmic cube may be employed. The analysis of these topics is therefore a main feature of this volume.

  8. Gödel and computations: a 100th anniversary retrospective

    Czech Academy of Sciences Publication Activity Database

    Pudlák, Pavel

    2006-01-01

    Roč. 37, č. 4 (2006), s. 13-21 ISSN 0163-5700 Institutional research plan: CEZ:AV0Z10190503 Keywords : computability * lengths of proofs * computational complexity Subject RIV: BA - General Mathematics

  9. Transforming the Subject Matter: Examining the Intellectual Roots of Pedagogical Content Knowledge

    Science.gov (United States)

    Deng, Zongyi

    2007-01-01

    This article questions the basic assumptions of pedagogical content knowledge by analyzing the ideas of Jerome Bruner, Joseph Schwab, and John Dewey concerning transforming the subject matter. It argues that transforming the subject matter is not only a pedagogical but also a complex curricular task in terms of developing a school subject or a…

  10. Teaching Pervasive Computing to CS Freshmen: A Multidisciplinary Approach

    NARCIS (Netherlands)

    Silvis-Cividjian, Natalia

    2015-01-01

    Pervasive Computing is a growing area in research and commercial reality. Despite this extensive growth, there is no clear consensus on how and when to teach it to students. We report on an innovative attempt to teach this subject to first-year Computer Science students. Our course combines computer…

  11. Effect of Short-Term Fasting on Systemic Cytochrome P450-Mediated Drug Metabolism in Healthy Subjects: A Randomized, Controlled, Crossover Study Using a Cocktail Approach.

    Science.gov (United States)

    Lammers, Laureen A; Achterbergh, Roos; van Schaik, Ron H N; Romijn, Johannes A; Mathôt, Ron A A

    2017-10-01

    Short-term fasting can alter drug exposure but it is unknown whether this is an effect of altered oral bioavailability and/or systemic clearance. Therefore, the aim of our study was to assess the effect of short-term fasting on oral bioavailability and systemic clearance of different drugs. In a randomized, controlled, crossover trial, 12 healthy subjects received a single administration of a cytochrome P450 (CYP) probe cocktail, consisting of caffeine (CYP1A2), metoprolol (CYP2D6), midazolam (CYP3A4), omeprazole (CYP2C19) and warfarin (CYP2C9), on four occasions: an oral (1) and intravenous (2) administration after an overnight fast (control) and an oral (3) and intravenous (4) administration after 36 h of fasting. Pharmacokinetic parameters of the probe drugs were analyzed using the nonlinear mixed-effects modeling software NONMEM. Short-term fasting increased systemic caffeine clearance by 17% (p = 0.04) and metoprolol clearance by 13% (p < 0.01), whereas S-warfarin clearance decreased by 19% (p < 0.01). Fasting did not affect bioavailability. The study demonstrates that short-term fasting alters CYP-mediated drug metabolism in a non-uniform pattern without affecting oral bioavailability.

  12. Fuzzy Neuroidal Nets and Recurrent Fuzzy Computations

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří

    2001-01-01

    Roč. 11, č. 6 (2001), s. 675-686 ISSN 1210-0552. [SOFSEM 2001 Workshop on Soft Computing. Piešťany, 29.11.2001-30.11.2001] R&D Projects: GA ČR GA201/00/1489; GA AV ČR KSK1019101 Institutional research plan: AV0Z1030915 Keywords : fuzzy computing * fuzzy neural nets * fuzzy Turing machines * non-uniform computational complexity Subject RIV: BA - General Mathematics

  13. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Directory of Open Access Journals (Sweden)

    Kevin S Bonham

    2017-10-01

    Full Text Available While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  14. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Science.gov (United States)

    Bonham, Kevin S; Stefan, Melanie I

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.
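
    The core quantity in the analysis above, the proportion of female authors broken down by field and authorship position, can be tabulated as sketched below; the column names and counts are hypothetical placeholders, not the study's dataset.

```python
# Proportion of female authors by field and authorship position (hypothetical counts).
import pandas as pd

papers = pd.DataFrame({
    "field":    ["comp_bio", "comp_bio", "biology", "biology", "cs", "cs"],
    "position": ["first", "last", "first", "last", "first", "last"],
    "n_female": [120, 80, 300, 210, 60, 30],
    "n_total":  [300, 290, 620, 600, 250, 240],
})

papers["prop_female"] = papers["n_female"] / papers["n_total"]
print(papers.pivot(index="field", columns="position", values="prop_female"))
```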

  15. FCJ-131 Pervasive Computing and Prosopopoietic Modelling – Notes on computed function and creative action

    Directory of Open Access Journals (Sweden)

    Anders Michelsen

    2011-12-01

    Full Text Available This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that has spread vertiginously since Mark Weiser coined the term 'pervasive', e.g., digitalised sensing, monitoring, effectuation, intelligence, and display. Whereas Weiser's original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown's (1997) terms, 'invisible', on the horizon, 'calm', it also points to a much more important and slightly different perspective: that of creative action upon novel forms of artifice. Most importantly for this article, ubiquity and pervasive computing are seen to point to the continuous existence throughout the computational heritage since the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications are conceptualised as prosopopoietic modeling, on the basis of Bernward Joerges' introduction of the classical rhetorical term 'prosopopoeia' into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon's notion of a 'margin of indeterminacy' vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic…

  16. Audit and Evaluation of Computer Security. Computer Science and Technology.

    Science.gov (United States)

    Ruthberg, Zella G.

    This is a collection of consensus reports, each produced at a session of an invitational workshop sponsored by the National Bureau of Standards. The purpose of the workshop was to explore the state-of-the-art and define appropriate subjects for future research in the audit and evaluation of computer security. Leading experts in the audit and…

  17. Short term high dose atorvastatin for the prevention of contrast-induced nephropathy in patients undergoing computed tomography angiography

    Directory of Open Access Journals (Sweden)

    Hamid Sanei

    2014-09-01

    Full Text Available BACKGROUND: Some studies have shown statins to be effective in preventing contrast-induced nephropathy (CIN). We evaluated the effectiveness of atorvastatin in the prevention of CIN in computed tomography angiography (CTA) candidates. METHODS: This study was conducted on patients referred for elective CTA with normal renal function. Patients received atorvastatin (80 mg/day) or placebo from 24 h before to 48 h after administration of the contrast material. Serum creatinine was measured before and 48 h after contrast material injection. CIN was defined as an increase in serum creatinine level of ≥ 0.5 mg/dl or ≥ 25% of the baseline creatinine. RESULTS: A total of 236 patients completed the study; 115 atorvastatin, 121 placebo, mean age = 58.40 ± 9.80 years, 68.6% male. Serum creatinine increased after contrast material injection in both the atorvastatin (1.00 ± 0.16 to 1.02 ± 0.15 mg/dl, P = 0.017) and placebo groups (1.03 ± 0.17 to 1.08 ± 0.18 mg/dl, P < 0.001). Controlling for age, gender, comorbidities, drug history, and baseline serum creatinine level, patients who received atorvastatin experienced a smaller increase in serum creatinine after contrast material injection (beta = 0.127, P = 0.034). However, there was no difference between the atorvastatin and placebo groups in the incidence of CIN (4.3 vs. 5.0%, P = 0.535). CONCLUSION: In patients undergoing CTA, short-term treatment with high-dose atorvastatin is effective in preventing contrast-induced renal dysfunction, in terms of a smaller increase in serum creatinine level after contrast material injection. Further trials including larger samples of patients and longer follow-ups are warranted. Keywords: Kidney Diseases, Multidetector Computed Tomography, Contrast Media, Hydroxymethylglutaryl-CoA Reductase Inhibitors, Atorvastatin

  18. Computers in Cardiology.

    Science.gov (United States)

    Feldman, Charles L.

    The utilization of computers in the interpretation of electrocardiograms (EKG's) and vectorcardiograms is the subject of this report. A basic introduction into the operations of the electrocardiograms and vectorcardiograms is provided via an illustrated text. A historical development of the EKG starts with the 1950's with the first attempts to use…

  19. Motor priming in virtual reality can augment motor-imagery training efficacy in restorative brain-computer interaction: a within-subject analysis.

    Science.gov (United States)

    Vourvopoulos, Athanasios; Bermúdez I Badia, Sergi

    2016-08-09

    The use of Brain-Computer Interface (BCI) technology in neurorehabilitation provides new strategies to overcome stroke-related motor limitations. Recent studies demonstrated the brain's capacity for functional and structural plasticity through BCI. However, it is not fully clear how we can take full advantage of the neurobiological mechanisms underlying recovery and how to maximize restoration through BCI. In this study we investigate the role of multimodal virtual reality (VR) simulations and motor priming (MP) in an upper limb motor-imagery BCI task in order to maximize the engagement of sensory-motor networks in a broad range of patients who can benefit from virtual rehabilitation training. In order to investigate how different BCI paradigms impact brain activation, we designed 3 experimental conditions in a within-subject design, including an immersive Multimodal Virtual Reality with Motor Priming (VRMP) condition where users had to perform motor-execution before BCI training, an immersive Multimodal VR condition, and a control condition with standard 2D feedback. Further, these were also compared to overt motor-execution. Finally, a set of questionnaires were used to gather subjective data on Workload, Kinesthetic Imagery and Presence. Our findings show increased capacity to modulate and enhance brain activity patterns in all extracted EEG rhythms matching more closely those present during motor-execution and also a strong relationship between electrophysiological data and subjective experience. Our data suggest that both VR and particularly MP can enhance the activation of brain patterns present during overt motor-execution. Further, we show changes in the interhemispheric EEG balance, which might play an important role in the promotion of neural activation and neuroplastic changes in stroke patients in a motor-imagery neurofeedback paradigm. In addition, electrophysiological correlates of psychophysiological responses provide us with valuable information
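
    Two of the quantities referred to above, the power of an EEG rhythm and an interhemispheric balance index, can be sketched as follows; the channel names, mu-band limits, sampling rate and synthetic signals are assumptions for illustration only, not the study's processing pipeline.

```python
# Band power of an EEG rhythm and a simple interhemispheric balance index.
# Sampling rate, channels (C3/C4), the mu band and the signals are synthetic.
import numpy as np
from scipy.signal import welch

fs = 250                                     # Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
c3 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)        # left hemisphere
c4 = 0.6 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # right hemisphere

def band_power(signal, fs, band=(8.0, 12.0)):
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])   # rectangle-rule integration

p_left, p_right = band_power(c3, fs), band_power(c4, fs)
balance = (p_left - p_right) / (p_left + p_right)      # crude interhemispheric index
print(f"mu power C3={p_left:.3f}, C4={p_right:.3f}, balance={balance:.2f}")
```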

  20. 1984 CERN school of computing

    International Nuclear Information System (INIS)

    1985-01-01

    The eighth CERN School of Computing covered subjects mainly related to computing for elementary-particle physics. These proceedings contain written versions of most of the lectures delivered at the School. Notes on the following topics are included: trigger and data-acquisition plans for the LEP experiments; unfolding methods in high-energy physics experiments; Monte Carlo techniques; relational data bases; data networks and open systems; the Newcastle connection; portable operating systems; expert systems; microprocessors - from basic chips to complete systems; algorithms for parallel computers; trends in supercomputers and computational physics; supercomputing and related national projects in Japan; application of VLSI in high-energy physics, and single-user systems. See hints under the relevant topics. (orig./HSI)

  1. Verbal short-term memory and vocabulary learning in polyglots.

    Science.gov (United States)

    Papagno, C; Vallar, G

    1995-02-01

    Polyglot and non-polyglot Italian subjects were given tests assessing verbal (phonological) and visuo-spatial short-term and long-term memory, general intelligence, and vocabulary knowledge in their native language. Polyglots had a superior level of performance in verbal short-term memory tasks (auditory digit span and nonword repetition) and in a paired-associate learning test, which assessed the subjects' ability to acquire new (Russian) words. By contrast, the two groups had comparable performance levels in tasks assessing general intelligence, visuo-spatial short-term memory and learning, and paired-associate learning of Italian words. These findings, which are in line with neuropsychological and developmental evidence, as well as with data from normal subjects, suggest a close relationship between the capacity of phonological memory and the acquisition of foreign languages.

  2. Improved estimation of subject-level functional connectivity using full and partial correlation with empirical Bayes shrinkage.

    Science.gov (United States)

    Mejia, Amanda F; Nebel, Mary Beth; Barber, Anita D; Choe, Ann S; Pekar, James J; Caffo, Brian S; Lindquist, Martin A

    2018-05-15

    Reliability of subject-level resting-state functional connectivity (FC) is determined in part by the statistical techniques employed in its estimation. Methods that pool information across subjects to inform estimation of subject-level effects (e.g., Bayesian approaches) have been shown to enhance reliability of subject-level FC. However, fully Bayesian approaches are computationally demanding, while empirical Bayesian approaches typically rely on using repeated measures to estimate the variance components in the model. Here, we avoid the need for repeated measures by proposing a novel measurement error model for FC describing the different sources of variance and error, which we use to perform empirical Bayes shrinkage of subject-level FC towards the group average. In addition, since the traditional intra-class correlation coefficient (ICC) is inappropriate for biased estimates, we propose a new reliability measure denoted the mean squared error intra-class correlation coefficient (ICC_MSE) to properly assess the reliability of the resulting (biased) estimates. We apply the proposed techniques to test-retest resting-state fMRI data on 461 subjects from the Human Connectome Project to estimate connectivity between 100 regions identified through independent components analysis (ICA). We consider both correlation and partial correlation as the measure of FC and assess the benefit of shrinkage for each measure, as well as the effects of scan duration. We find that shrinkage estimates of subject-level FC exhibit substantially greater reliability than traditional estimates across various scan durations, even for the most reliable connections and regardless of connectivity measure. Additionally, we find partial correlation reliability to be highly sensitive to the choice of penalty term, and to be generally worse than that of full correlations except for certain connections and a narrow range of penalty values. This suggests that the penalty needs to be chosen carefully.
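
    A minimal sketch of the shrinkage idea described above, pulling noisy subject-level estimates toward the group average, is given below; the variance-ratio weights and synthetic data are simplifying assumptions, not the paper's measurement error model.

```python
# Shrinking noisy subject-level connectivity estimates toward the group average.
# The synthetic data and variance-ratio weights are a simplification of the idea.
import numpy as np

rng = np.random.default_rng(2)
n_subjects, n_edges = 20, 50
group_truth = rng.uniform(-0.5, 0.5, size=n_edges)
subject_fc = group_truth + 0.3 * rng.normal(size=(n_subjects, n_edges))  # noisy estimates

group_mean = subject_fc.mean(axis=0)
total_var = subject_fc.var(axis=0, ddof=1)     # between-subject spread per edge
noise_var = 0.3 ** 2                           # assumed known sampling-noise variance
signal_var = np.clip(total_var - noise_var, 1e-6, None)

weight = noise_var / (signal_var + noise_var)                   # shrinkage weight in [0, 1]
fc_shrunk = weight * group_mean + (1 - weight) * subject_fc     # empirical-Bayes-style estimate

print("raw MSE:   ", np.mean((subject_fc - group_truth) ** 2))
print("shrunk MSE:", np.mean((fc_shrunk - group_truth) ** 2))
```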

  3. Achievement Goal Orientations and Subjective Well-Being: A Person-Centred Analysis

    Science.gov (United States)

    Tuominen-Soini, Heta; Salmela-Aro, Katariina; Niemivirta, Markku

    2008-01-01

    This study examined whether students with different achievement goal orientation profiles differ in terms of subjective well-being (i.e., self-esteem, depressive symptoms, school-related burnout, and educational goal appraisals). Six groups of students with unique motivational profiles were identified. Observed differences in subjective well-being…

  4. [The Psychomat computer complex for psychophysiologic studies].

    Science.gov (United States)

    Matveev, E V; Nadezhdin, D S; Shemsudov, A I; Kalinin, A V

    1991-01-01

    The authors analyze the design principles of a computerized psychophysiological system for universal use. They show the effectiveness of computer technology as a combination of the universal computation and control capabilities of a personal computer equipped with problem-oriented specialized facilities for stimulus presentation and detection of the test subject's reactions. They define the hardware and software configuration of the microcomputer psychophysiological system "Psychomat", describe its functional capabilities and basic medico-technical characteristics, and review organizational issues in the maintenance of its full-scale production.

  5. Capturing the semiotic relationship between terms

    Science.gov (United States)

    Hargood, Charlie; Millard, David E.; Weal, Mark J.

    2010-04-01

    Tags describing objects on the web are often treated as facts about a resource, whereas it is quite possible that they represent more subjective observations. Existing methods of term expansion expand terms based on dictionary definitions or statistical information on term occurrence. Here we propose the use of a thematic model for term expansion based on semiotic relationships between terms; this has been shown to improve a system's thematic understanding of content and tags and to tease out the more subjective implications of those tags. Such a system relies on a thematic model that must be made by hand. In this article, we explore a method to capture a semiotic understanding of particular terms using a rule-based guide to authoring a thematic model. Experimentation shows that it is possible to capture valid definitions that can be used for semiotic term expansion but that the guide itself may not be sufficient to support this on a large scale. We argue that whilst the formation of super definitions will mitigate some of these problems, the development of an authoring support tool may be necessary to solve others.

  6. Amorphous Computing: A Research Agenda for the Near Future

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří

    2012-01-01

    Roč. 11, č. 1 (2012), s. 59-63 ISSN 1567-7818 R&D Projects: GA ČR GAP202/10/1333 Institutional research plan: CEZ:AV0Z10300504 Keywords : amorphous computing * nano-machines * flying amorphous computer Subject RIV: IN - Informatics, Computer Science Impact factor: 0.683, year: 2012

  7. COMPUTER SCIENCE IN THE EDUCATION OF UKRAINE: FORMATION PROSPECTS

    OpenAIRE

    Viktor Shakotko

    2016-01-01

    The article deals with the formation of computer science, both as a science and as a school subject, in the Ukrainian system of education, taking into consideration worldwide development trends in this science. The introduction of the notions «information technology», «computer science» and «informatics» into the field, their correlation, and the peculiarities of defining their subject areas are analyzed from a historical perspective. The author considers the points of view conce...

  8. Computer-Assisted Evaluation of Videokymographic Data

    Czech Academy of Sciences Publication Activity Database

    Novozámský, Adam; Sedlář, Jiří; Zita, A.; Švec, J. G.; Zitová, Barbara; Flusser, Jan; Hauzar, D.

    2013-01-01

    Roč. 1, č. 1 (2013), s. 49-49 ISSN 1805-8698. [EFMI STC Prague Data and Knowledge for Medical Decision Support. 17.04.2013-19.04.2013, Praha] Institutional support: RVO:67985556 Keywords : videokymography * image processing * computer-assisted evaluation Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2013/ZOI/novozamsky-computer-assisted evaluation of videokymographic data.pdf

  9. Principles of Tablet Computing for Educators

    Science.gov (United States)

    Katzan, Harry, Jr.

    2015-01-01

    In the study of modern technology for the 21st century, one of the most popular subjects is tablet computing. Tablet computers are now used in business, government, education, and the personal lives of practically everyone--at least, it seems that way. As of October 2013, Apple has sold 170 million iPads. The success of tablets is enormous and has…

  10. Foraging for brain stimulation: toward a neurobiology of computation.

    Science.gov (United States)

    Gallistel, C R

    1994-01-01

    The self-stimulating rat performs foraging tasks mediated by simple computations that use interreward intervals and subjective reward magnitudes to determine stay durations. This is a simplified preparation in which to study the neurobiology of the elementary computational operations that make cognition possible, because the neural signal specifying the value of a computationally relevant variable is produced by direct electrical stimulation of a neural pathway. Newly developed measurement methods yield functions relating the subjective reward magnitude to the parameters of the neural signal. These measurements also show that the decision process that governs foraging behavior divides the subjective reward magnitude by the most recent interreward interval to determine the preferability of an option (a foraging patch). The decision process sets the parameters that determine stay durations (durations of visits to foraging patches) so that the ratios of the stay durations match the ratios of the preferabilities.
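
    The decision rule summarized above (preferability as subjective reward magnitude divided by the most recent interreward interval, with stay durations matched to preferability ratios) can be written out as a toy computation; the patch values are hypothetical.

```python
# Preferability = subjective reward magnitude / most recent interreward interval;
# stay durations are then allocated so their ratios match preferability ratios.
def preferability(reward_magnitude: float, interreward_interval: float) -> float:
    return reward_magnitude / interreward_interval

patches = {"A": (8.0, 4.0), "B": (5.0, 2.0)}   # (magnitude, interval), hypothetical values
scores = {name: preferability(m, i) for name, (m, i) in patches.items()}
total = sum(scores.values())
stay_fractions = {name: s / total for name, s in scores.items()}
print(stay_fractions)    # e.g. patch B gets the larger share of time
```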

  11. Important prognostic factors for the long-term survival of lung cancer subjects in Taiwan

    International Nuclear Information System (INIS)

    Chiang, Tai-An; Chen, Ping-Ho; Wu, Pei-Fen; Wang, Tsu-Nai; Chang, Po-Ya; Ko, Albert Min-Shan; Huang, Ming-Shyan; Ko, Ying-Chin

    2008-01-01

    This study used a large-scale cancer database to determine prognostic factors for the survival of lung cancer subjects in Taiwan. A total of 24,910 subjects diagnosed with lung cancer were analysed. Survival was estimated by Kaplan-Meier methods. A Cox proportional-hazards model estimated the death risk (hazard ratio (HR)) for various prognostic factors. The prognostic indicators associated with a higher risk of lung cancer death were male gender (males versus females; HR = 1.07, 95% confidence interval (CI): 1.03–1.11), males diagnosed in later periods (1991–1994 versus 1987–1990; HR = 1.13), older age at diagnosis, large cell carcinoma (LCC)/small cell carcinoma (SCC), and supportive care therapy over chemotherapy. The overall 5-year survival rate was significantly poorer for males (21.3%) than females (23.6%). Subjects with squamous cell carcinoma (SQCC) and treatment by surgical resection alone had a better prognosis. We found that surgical resection markedly increased the 5-year survival rate for LCC and decreased the risk of death from LCC, but did not improve survival for SCC. Gender and clinical characteristics (i.e. diagnostic period, age at diagnosis, histological type and treatment modality) play important roles in determining lung cancer survival.
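
    Kaplan-Meier estimation and Cox proportional-hazards modeling of the kind reported above can be sketched with the lifelines package; the small data frame below is hypothetical, not the Taiwanese registry data, so the fitted numbers are meaningful only as a usage illustration.

```python
# Kaplan-Meier and Cox proportional-hazards fits with the lifelines package
# on a small hypothetical data set (months of follow-up, death indicator, covariates).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months": [12, 30, 5, 48, 22, 60, 8, 36],
    "died":   [1, 1, 1, 0, 1, 0, 1, 1],
    "male":   [1, 0, 1, 0, 1, 0, 1, 1],
    "age_dx": [68, 55, 72, 49, 63, 58, 75, 66],
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["died"])
print("median survival (months):", kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()    # hazard ratios for gender and age at diagnosis
```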

  12. Effect of short-term fasting on lipolytic responsiveness in normal and obese human subjects

    International Nuclear Information System (INIS)

    Wolfe, R.R.; Peters, E.J.; Klein, S.; Holland, O.B.; Rosenblatt, J.; Gary, H. Jr.

    1987-01-01

    In this study the rate of lipolysis (fatty acid and glycerol release into blood) has been quantified in both normal-weight and obese volunteers after both 15 and 87 h of fasting. In each study, the basal rate and subsequent response to epinephrine infusion were determined. The rates of appearance (R_a) of free fatty acids (FFA) and glycerol were quantified by infusion of [1-13C]palmitate and D-5-glycerol, respectively. Substrate flux rates per unit of body fat mass and lean body mass were calculated from total body water measurements using H2(18)O dilution. In normal volunteers, the basal R_a FFA and R_a glycerol rose markedly with 87 h of fasting, whereas the increases were more modest in the obese subjects. However, the rate of mobilization of fat, in relation to the lean body mass, was higher in the obese subjects than in the normal subjects after 15 h of fasting, and the values were similar in both groups after 87 h of fasting. There was an increased lipolytic response to epinephrine after fasting in both groups. This increased sensitivity may have resulted from the enhancement of fatty acid-triglyceride substrate cycling that occurred after fasting.

  13. Short term effects of kinesiotaping on acromiohumeral distance in asymptomatic subjects: a randomised controlled trial.

    Science.gov (United States)

    Luque-Suarez, A; Navarro-Ledesma, S; Petocz, P; Hancock, M J; Hush, J

    2013-12-01

    The first aim of this study was to investigate whether kinesiotaping (KT) can increase the acromiohumeral distance (AHD) in asymptomatic subjects in the short term. The second aim was to investigate whether the direction of kinesiotaping application influences AHD. In recent years, the use of KT has become increasingly popular for a range of musculoskeletal conditions and for sport injuries. To date, we are unaware of any research investigating the effect of kinesiotaping on AHD. Moreover, it is unknown whether the direction of kinesiotaping application for the shoulder is important. Forty nine participants were randomly assigned to one of three groups: kinesiotaping group 1 (KT1), kinesiotaping group 2 (KT2) and sham kinesiotaping (KT3). AHD ultrasound measurements at 0° and 60° of shoulder elevation were collected at baseline and immediately after kinesiotape application. The results showed significant improvements in AHD after kinesiotaping, compared with sham taping. The mean difference in AHD between KT1 and KT3 groups was 1.28 mm (95% CI: 0.55, 2.03), and between KT2 and KT3 was 0.98 mm (95% CI: 0.23, 1.74). Comparison of KT1 and KT2 groups, which was performed to identify whether the direction of taping influences the AHD, indicated there were no significant differences. KT increases AHD in healthy individuals immediately following application, compared with sham kinesiotape. No differences were found with respect to the direction in which KT was applied. Copyright © 2013 Elsevier Ltd. All rights reserved.
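
    The between-group mean difference and 95% confidence interval reported above can be computed as sketched here; the AHD change scores are invented for illustration, and a simple pooled degrees-of-freedom t interval is assumed.

```python
# Between-group mean difference in AHD change with a 95% confidence interval.
# The change scores (in mm) are invented; a pooled degrees-of-freedom t interval is used.
import numpy as np
from scipy import stats

kt_group   = np.array([1.9, 1.1, 1.6, 0.8, 1.4, 1.2, 1.7, 1.0])
sham_group = np.array([0.2, 0.5, -0.1, 0.4, 0.1, 0.3, 0.0, 0.2])

diff = kt_group.mean() - sham_group.mean()
se = np.sqrt(kt_group.var(ddof=1) / kt_group.size + sham_group.var(ddof=1) / sham_group.size)
dof = kt_group.size + sham_group.size - 2
t_crit = stats.t.ppf(0.975, dof)
print(f"mean difference = {diff:.2f} mm, 95% CI ({diff - t_crit * se:.2f}, {diff + t_crit * se:.2f})")
```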

  14. A quantum computer only needs one universe

    OpenAIRE

    Steane, A. M.

    2000-01-01

    The nature of quantum computation is discussed. It is argued that, in terms of the amount of information manipulated in a given time, quantum and classical computation are equally efficient. Quantum superposition does not permit quantum computers to ``perform many computations simultaneously'' except in a highly qualified and to some extent misleading sense. Quantum computation is therefore not well described by interpretations of quantum mechanics which invoke the concept of vast numbers of ...

  15. Characterization of the mechanism of drug-drug interactions from PubMed using MeSH terms.

    Directory of Open Access Journals (Sweden)

    Yin Lu

    Full Text Available Identifying drug-drug interactions (DDIs) is an important topic for the development of safe pharmaceutical drugs and for the optimization of multidrug regimens for complex diseases such as cancer and HIV. There have been about 150,000 publications on DDIs in PubMed, which is a great resource for DDI studies. In this paper, we introduced an automatic computational method for the systematic analysis of the mechanism of DDIs using MeSH (Medical Subject Headings) terms from the PubMed literature. MeSH is a controlled vocabulary thesaurus developed by the National Library of Medicine for indexing and annotating articles. Our method can effectively identify DDI-relevant MeSH terms such as drugs, proteins and phenomena with high accuracy. The connections among these MeSH terms were investigated by using co-occurrence heatmaps and social network analysis. Our approach can be used to visualize relationships of DDI terms, which has the potential to help users better understand DDIs. As the volume of PubMed records increases, our method for automatic analysis of DDIs from the PubMed database will become more accurate.
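
    A toy version of the co-occurrence analysis mentioned above, counting how often pairs of MeSH terms are assigned to the same article, is sketched below; the articles and term lists are invented, not drawn from PubMed.

```python
# Counting how often pairs of MeSH terms co-occur on the same article.
from collections import Counter
from itertools import combinations

articles = [
    ["Warfarin", "Drug Interactions", "Cytochrome P-450 CYP2C9"],
    ["Warfarin", "Fluconazole", "Drug Interactions"],
    ["Midazolam", "Cytochrome P-450 CYP3A", "Drug Interactions"],
]

cooccurrence = Counter()
for mesh_terms in articles:
    for a, b in combinations(sorted(set(mesh_terms)), 2):
        cooccurrence[(a, b)] += 1

for (a, b), count in cooccurrence.most_common(5):
    print(f"{a} <-> {b}: {count}")
```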

  16. Symbolic initiative and its application to computers

    Energy Technology Data Exchange (ETDEWEB)

    Hellerman, L

    1982-01-01

    The author reviews the role of symbolic initiative in mathematics and then defines a sense in which computers compute mathematical functions. This allows a clarification of the semantics of computer and communication data. Turing's view of machine intelligence is examined in terms of its use of symbolic initiative. 12 references.

  17. Prediction of short-term and long-term VOC emissions from SBR bitumen-backed carpet under different temperatures

    NARCIS (Netherlands)

    Yang, X.; Chen, Q.; Bluyssen, P.M.

    1998-01-01

    This paper presents two models for volatile organic compound (VOC) emissions from carpet. One is a numerical model using the computational fluid dynamics (CFD) technique for short-term predictions, the other an analytical model for long-term predictions. The numerical model can (1) deal with…

  18. Kinesio taping and manual pressure release: Short-term effects in subjects with myofasical trigger point.

    Science.gov (United States)

    Chao, Yu Wen; Lin, Jiu Jenq; Yang, Jing Lan; Wang, Wendy Tzyy-Jiuan

    2016-01-01

    Randomized controlled trial. Myofascial pain syndrome is characterized by myofascial trigger points (MTrPs) and fascia tenderness. We investigated the effects of manual pressure release (MPR) alone or in combination with kinesio taping (MPR/MKT) in subjects with MTrPs. Fifteen and 16 subjects received MPR and MPR/MKT, respectively. Outcomes, including pressure pain threshold, muscle stiffness and mechanomyography, were assessed at baseline, post-intervention and 7 days later. Pressure pain threshold improved significantly (d = 1.79, p < 0.005) in both groups. There was significant improvement in muscle stiffness in the MPR/MKT group (0.27-0.49 mm) compared with the MPR group (-0.02-0.23 mm). Mechanomyography amplitude in the MPR/MKT group was significantly higher than that of the MPR group (p < 0.05). MPR and MPR/MKT are effective in reducing pain in these subjects. MPR/MKT has a greater effect on muscle stiffness and contraction amplitude. IV. Copyright © 2016 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.

  19. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'Development of a damage evaluation method for brittle structural materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a massively parallel computation system that couples a material strength theory, based on microscopic fracture mechanics for latent cracks, with a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms and formulae, and the parallel computation programming methods related to the principal elements in the basic design of the computational mechanics program. (author)

  20. Basic design of parallel computational program for probabilistic structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kaji, Yoshiyuki; Arai, Taketoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'Development of a damage evaluation method for brittle structural materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a massively parallel computation system that couples a material strength theory, based on microscopic fracture mechanics for latent cracks, with a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms and formulae, and the parallel computation programming methods related to the principal elements in the basic design of the computational mechanics program. (author)

  1. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    Science.gov (United States)

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies using public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) as the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules implement the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, step-by-step analysis and an auto-computing process, for preliminary evaluation and real-time computation respectively. The proposed model was evaluated by recomputing prior research on the epidemiological measurement of diseases caused either by heavy metal exposures in the environment or by clinical complications in hospital. The simulation validity was confirmed with commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Long term stability of power systems

    Energy Technology Data Exchange (ETDEWEB)

    Kundur, P; Gao, B [Powertech Labs. Inc., Surrey, BC (Canada)

    1994-12-31

    Power system long term stability is still a developing subject. In this paper we provide our perspectives and experiences related to long term stability. The paper begins with a description of the nature of the long term stability problem, followed by a discussion of issues related to the modeling and solution techniques of tools for long term stability analysis. Case studies are presented to illustrate the voltage stability aspect and plant dynamics aspect of long term stability. (author) 20 refs., 11 figs.

  3. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  4. The "Subject of Ethics" and Educational Research OR Ethics or Politics? Yes Please!

    Science.gov (United States)

    Bazzul, Jesse

    2017-01-01

    This paper outlines a theoretical context for research into "the subject of ethics" in terms of how students come to see themselves as self-reflective actors. I maintain that the "subject of ethics," or ethical subjectivity, has been overlooked as a necessary aspect of creating politically transformative spaces in education. At…

  5. Discrete and computational geometry

    CERN Document Server

    Devadoss, Satyan L

    2011-01-01

    Discrete geometry is a relatively new development in pure mathematics, while computational geometry is an emerging area in applications-driven computer science. Their intermingling has yielded exciting advances in recent years, yet what has been lacking until now is an undergraduate textbook that bridges the gap between the two. Discrete and Computational Geometry offers a comprehensive yet accessible introduction to this cutting-edge frontier of mathematics and computer science. This book covers traditional topics such as convex hulls, triangulations, and Voronoi diagrams, as well as more recent subjects like pseudotriangulations, curve reconstruction, and locked chains. It also touches on more advanced material, including Dehn invariants, associahedra, quasigeodesics, Morse theory, and the recent resolution of the Poincaré conjecture. Connections to real-world applications are made throughout, and algorithms are presented independently of any programming language. This richly illustrated textbook also fe...

  6. Encyclopedia of cloud computing

    CERN Document Server

    Bojanova, Irena

    2016-01-01

    The Encyclopedia of Cloud Computing provides IT professionals, educators, researchers and students with a compendium of cloud computing knowledge. Authored by a spectrum of subject matter experts in industry and academia, this unique publication, in a single volume, covers a wide range of cloud computing topics, including technological trends and developments, research opportunities, best practices, standards, and cloud adoption. Providing multiple perspectives, it also addresses questions that stakeholders might have in the context of development, operation, management, and use of clouds. Furthermore, it examines cloud computing's impact now and in the future. The encyclopedia presents 56 chapters logically organized into 10 sections. Each chapter covers a major topic/area with cross-references to other chapters and contains tables, illustrations, side-bars as appropriate. Furthermore, each chapter presents its summary at the beginning and backend material, references and additional resources for further i...

  7. 2012 International Conference on Human-centric Computing

    CERN Document Server

    Jin, Qun; Yeo, Martin; Hu, Bin; Human Centric Technology and Service in Smart Space, HumanCom 2012

    2012-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. In addition, the conference will publish high quality papers which are closely related to the various theories and practical applications in human-centric computing. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject.

  8. Ergonomics in the computer workstation | Karoney | East African ...

    African Journals Online (AJOL)

Background: Awareness of the effects of long-term computer use and the application of ergonomics in the computer workstation is important for preventing musculoskeletal disorders, eyestrain and psychosocial effects. Objectives: To determine the awareness of physical and psychological effects of prolonged computer usage ...

  9. TO STUDY THE ROLE OF ERGONOMICS IN THE MANAGEMENT OF COMPUTER VISION SYNDROME

    Directory of Open Access Journals (Sweden)

    Anshu

    2016-03-01

Full Text Available INTRODUCTION Ergonomics is the science of designing the job equipment and workplace to fit the worker by obtaining a correct match between the human body, work related tasks and work tools. By applying the science of ergonomics we can reduce the difficulties faced by computer users. OBJECTIVES To evaluate the efficacy of tear substitutes and the role of ergonomics in the management of Computer Vision Syndrome; to develop a counseling plan and an initial treatment plan, prevent complications, and educate the subjects about the disease process to enhance public awareness. MATERIALS AND METHODS A minimum of 100 subjects were selected randomly irrespective of gender, place and nature of computer work & ethnic differences. The subjects were between the age group of 10-60 years and had been using the computer for a minimum of 2 hours/day for at least 5-6 days a week. The subjects underwent tests such as Schirmer's test, tear film breakup time (TBUT), inter-blink interval and ocular surface staining. A Computer Vision score was computed based on 5 symptoms, each of which was given a score of 2. The symptoms included foreign body sensation, redness, eyestrain, blurring of vision and frequent change in refraction. A score of more than 6 was treated as Computer Vision Syndrome and the subjects underwent synoptophore tests and refraction. RESULT In the present study we divided the 100 subjects into 2 groups of 50 each; one group was given tear substitutes only, while in the other ergonomics was considered along with tear substitutes. We saw that there was more improvement after 4 weeks and 8 weeks in the group taking lubricants and ergonomics into consideration than lubricants alone. More improvement was seen in eyestrain and blurring (P < 0.05). CONCLUSION Advanced training in proper computer usage can decrease discomfort.

  10. Soft Computing. Nové informatické paradigma, nebo módní slogan?

    Czech Academy of Sciences Publication Activity Database

    Hájek, Petr

    2000-01-01

    Roč. 79, č. 12 (2000), s. 683-685 ISSN 0042-4544 Institutional research plan: AV0Z1030915 Keywords : soft computing * fuzzy computing * neural computing * generic computing Subject RIV: BA - General Mathematics

  11. The Societal Nature of Subjectivity: An Interdisciplinary Methodological Challenge

    Directory of Open Access Journals (Sweden)

    Henning Salling Olesen

    2012-09-01

Full Text Available The thematic issue presents a psycho-societal approach to qualitative empirical research in several areas of everyday social life. It is an approach which integrates a theory of subjectivity and an interpretation methodology which integrates hermeneutic experiences from text analysis and psychoanalysis. Its particular focus is on subjectivity—as an aspect of the research object and as an aspect of the research process. The term "approach" indicates the intrinsic connection between the theorizing of an empirical object and the reflection of the research process and the epistemic subject. In terms of methodology it revives the themes originally launched in FQS exactly ten years ago: "Subjectivity and Reflectivity in Qualitative Research" (BREUER, MRUCK & ROTH, 2002; MRUCK & BREUER, 2003). This editorial introduction presents the intellectual background of the psycho-societal methodology, reflects on its relevance and critical perspectives in a contemporary landscape of social science, and comments on the way in which an international and interdisciplinary research group has developed this approach to profane empirical research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120345

  12. Long-term clearance from small airways in subjects with ciliary dysfunction

    OpenAIRE

    Hjelte Lena; Falk Rolf; Lindström Maria; Philipson Klas; Svartengren Magnus

    2006-01-01

Abstract The objective of this study was to investigate if long-term clearance from small airways is dependent on normal ciliary function. Six young adults with primary ciliary dyskinesia (PCD) inhaled indium-111-labelled Teflon particles of 4.2 μm geometric and 6.2 μm aerodynamic diameter with an extremely slow inhalation flow, 0.05 L/s. The inhalation method deposits particles mainly in the small conducting airways. Lung retention was measured immediately after inhalation and at four occasi...

  13. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory...... attacker remain somehow undefined and still under extensive investigation. This Thesis explores the nature of the ubiquitous attacker with a focus on how she interacts with the physical world and it defines a model that captures the abilities of the attacker. Furthermore a quantitative implementation......

  14. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

The soft computing techniques, which are based on the information processing of biological systems, are now massively used in the areas of pattern recognition, prediction and planning, as well as acting on the environment. Ideally speaking, soft computing is not a subject of homogeneous concepts and techniques; rather, it is an amalgamation of distinct methods that conform to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, as well as learning theory. This book covers contributions from various authors to demonstrate the use of soft computing techniques in various applications of engineering.

  15. Performance monitoring for brain-computer-interface actions.

    Science.gov (United States)

    Schurger, Aaron; Gale, Steven; Gozel, Olivia; Blanke, Olaf

    2017-02-01

    When presented with a difficult perceptual decision, human observers are able to make metacognitive judgements of subjective certainty. Such judgements can be made independently of and prior to any overt response to a sensory stimulus, presumably via internal monitoring. Retrospective judgements about one's own task performance, on the other hand, require first that the subject perform a task and thus could potentially be made based on motor processes, proprioceptive, and other sensory feedback rather than internal monitoring. With this dichotomy in mind, we set out to study performance monitoring using a brain-computer interface (BCI), with which subjects could voluntarily perform an action - moving a cursor on a computer screen - without any movement of the body, and thus without somatosensory feedback. Real-time visual feedback was available to subjects during training, but not during the experiment where the true final position of the cursor was only revealed after the subject had estimated where s/he thought it had ended up after 6s of BCI-based cursor control. During the first half of the experiment subjects based their assessments primarily on the prior probability of the end position of the cursor on previous trials. However, during the second half of the experiment subjects' judgements moved significantly closer to the true end position of the cursor, and away from the prior. This suggests that subjects can monitor task performance when the task is performed without overt movement of the body. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Comparing genomes to computer operating systems in terms of the topology and evolution of their regulatory control networks.

    Science.gov (United States)

    Yan, Koon-Kiu; Fang, Gang; Bhardwaj, Nitin; Alexander, Roger P; Gerstein, Mark

    2010-05-18

    The genome has often been called the operating system (OS) for a living organism. A computer OS is described by a regulatory control network termed the call graph, which is analogous to the transcriptional regulatory network in a cell. To apply our firsthand knowledge of the architecture of software systems to understand cellular design principles, we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology and evolution. We show that both networks have a fundamentally hierarchical layout, but there is a key difference: The transcriptional regulatory network possesses a few global regulators at the top and many targets at the bottom; conversely, the call graph has many regulators controlling a small set of generic functions. This top-heavy organization leads to highly overlapping functional modules in the call graph, in contrast to the relatively independent modules in the regulatory network. We further develop a way to measure evolutionary rates comparably between the two networks and explain this difference in terms of network evolution. The process of biological evolution via random mutation and subsequent selection tightly constrains the evolution of regulatory network hubs. The call graph, however, exhibits rapid evolution of its highly connected generic components, made possible by designers' continual fine-tuning. These findings stem from the design principles of the two systems: robustness for biological systems and cost effectiveness (reuse) for software systems.
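
    A rough sense of the "top-heavy versus bottom-heavy" contrast described above can be illustrated with a few lines of Python. The sketch below is purely illustrative: the toy edge lists and the gene and function names are invented and do not come from the E. coli or Linux data analysed in the paper.

        # Illustrative only: contrast a "few regulators, many targets" network with a
        # "many callers, few generic callees" call graph via in-/out-degree counts.
        import networkx as nx

        regulatory_edges = [("crp", f"gene{i}") for i in range(1, 7)] + \
                           [("fnr", f"gene{i}") for i in range(7, 10)]
        call_graph_edges = [(f"func{i}", "memcpy") for i in range(1, 6)] + \
                           [(f"func{i}", "malloc") for i in range(1, 6)]

        def summary(edges):
            g = nx.DiGraph(edges)
            regulators = sum(1 for _, d in g.out_degree() if d > 0)  # nodes that control others
            targets = sum(1 for _, d in g.in_degree() if d > 0)      # nodes that are controlled
            return regulators, targets

        for name, edges in [("regulatory network", regulatory_edges),
                            ("call graph", call_graph_edges)]:
            r, t = summary(edges)
            print(f"{name}: {r} regulators/callers control {t} targets/callees")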

  17. Gut microbiome response to short-term dietary interventions in reactive hypoglycemia subjects.

    Science.gov (United States)

    Quercia, Sara; Turroni, Silvia; Fiori, Jessica; Soverini, Matteo; Rampelli, Simone; Biagi, Elena; Castagnetti, Andrea; Consolandi, Clarissa; Severgnini, Marco; Pianesi, Mario; Fallucca, Francesco; Pozzilli, Paolo; Brigidi, Patrizia; Candela, Marco

    2017-11-01

    Reactive hypoglycemia is a metabolic disorder that provokes severe hypoglycemic episodes after meals. Over recent years, the gut microbiota has been recognized as potential target for the control of metabolic diseases, and the possibility to correct gut microbiota dysbioses through diet, favouring the recovery of metabolic homeostasis, has been considered. We investigate the impact of 2 short-term (3-day) nutritional interventions, based on the macrobiotic Ma-Pi 2 diet and a control Mediterranean diet, on the structure and functionality of the gut microbiota in 12 patients affected by reactive hypoglycemia. The gut microbiota composition was characterized by next-generation sequencing of the V3 to V4 region of the 16S rRNA gene, and the ecosystem functionality was addressed by measuring the faecal concentration of short-chain fatty acids (SCFAs). In order to measure the short-term physiological gut microbiota fluctuation, the microbiomes of 7 healthy people were characterized before and after 3 days of constant diet. While no convergence of the gut microbiota compositional profiles was observed, a significant increase in SCFA faecal levels was induced only in the Ma-Pi 2 diet group, suggesting the potential of this diet to support a short-term functional convergence of the gut microbiota, regardless of the individual compositional layout. The Ma-Pi 2 diet, with its high fibre load, was effective in increasing the production of SCFAs by the gut microbiota. Because these metabolites are known for their ability to counterbalance the metabolic deregulation in persons with glucose impairment disorders, their increased bioavailability could be of some relevance in reactive hypoglycemia. Copyright © 2017 John Wiley & Sons, Ltd.

  18. How accurate are adolescents in portion-size estimation using the computer tool Young Adolescents' Nutrition Assessment on Computer (YANA-C)?

    Science.gov (United States)

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-06-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amounts of ten commonly consumed foods (breakfast cereals, French fries, pasta, rice, apple sauce, carrots and peas, crisps, creamy velouté, red cabbage, and peas). Two procedures were followed: (1) short-term recall: adolescents (n 73) self-served their usual portions of the ten foods and estimated the amounts later the same day; (2) real-time perception: adolescents (n 128) estimated two sets (different portions) of pre-weighed portions displayed near the computer. Self-served portions were, on average, 8 % underestimated; significant underestimates were found for breakfast cereals, French fries, peas, and carrots and peas. Spearman's correlations between the self-served and estimated weights varied between 0.51 and 0.84, with an average of 0.72. The kappa statistics were moderate (>0.4) for all but one item. Pre-weighed portions were, on average, 15 % underestimated, with significant underestimates for fourteen of the twenty portions. Photographs of food items can serve as a good aid in ranking subjects; however, to assess the actual intake at a group level, underestimation must be considered.
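
    The agreement statistics reported above (Spearman correlations and kappa) can be reproduced on made-up data with standard libraries. In the sketch below the gram values and the quartile binning are invented for illustration; they are not the YANA-C data, and the binning is only one plausible way to obtain a weighted kappa.

        # Hypothetical served vs. estimated portion weights (g); not the YANA-C data.
        import numpy as np
        from scipy.stats import spearmanr
        from sklearn.metrics import cohen_kappa_score

        served    = np.array([55, 120, 180, 90, 200, 140, 75, 160], dtype=float)
        estimated = np.array([50, 100, 170, 95, 150, 130, 80, 150], dtype=float)

        rho, p = spearmanr(served, estimated)                       # how well estimates rank subjects
        bias = 100 * (estimated - served).sum() / served.sum()      # overall % misestimation

        # Agreement on quartile categories, as one way to compute a weighted kappa.
        edges = np.percentile(served, [25, 50, 75])
        kappa = cohen_kappa_score(np.digitize(served, edges),
                                  np.digitize(estimated, edges), weights="quadratic")
        print(f"rho={rho:.2f} (p={p:.3f})  kappa={kappa:.2f}  bias={bias:+.1f}%")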

  19. Toward computer-assisted diagnosis and telemedicine in ophthalmology

    Czech Academy of Sciences Publication Activity Database

    Marrugo, A.; Millán, M. S.; Cristóbal, G.; Gabarda, S.; Šorel, Michal; Šroubek, Filip

    2012-01-01

    Roč. 2012, č. 6 (2012), s. 1-3 ISSN 1818-2259 R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : computer-aided diagnosis * medical and retinal image * deconvolution * telemedicine Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2012/ZOI/sorel-toward computer-assisted diagnosis and telemedicine in ophthalmology.pdf

  20. Parallel processing using an optical delay-based reservoir computer

    Science.gov (United States)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, the reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many different stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs). Semiconductor ring lasers are semiconductor lasers in which the laser cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be computed simultaneously using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time series prediction and nonlinear channel equalization classification. We take advantage of the different directional modes to process the individual tasks: each directional mode processes one task, to mitigate possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained even with noise, despite the two tasks being computed simultaneously. We also find that a good performance is obtained for both tasks over a broad range of the parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].
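
    For readers unfamiliar with reservoir computing, the scheme can be mimicked in software with an ordinary echo state network: a fixed random recurrent layer plus a trained linear readout. The sketch below is only a conventional software analogue of the delay-based photonic reservoir described in the record; the sizes, parameters and toy task are invented for illustration.

        # Minimal software echo-state-network sketch (stand-in for a photonic reservoir).
        import numpy as np

        rng = np.random.default_rng(0)
        N, rho, leak = 200, 0.9, 1.0                   # reservoir size, spectral radius, leak rate
        W_in = rng.uniform(-0.5, 0.5, (N, 1))
        W = rng.normal(0, 1, (N, N))
        W *= rho / np.abs(np.linalg.eigvals(W)).max()  # rescale to the desired spectral radius

        def run_reservoir(u):
            """Drive the reservoir with input sequence u and return the state matrix."""
            x = np.zeros(N)
            states = []
            for ut in u:
                x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * ut + W @ x)
                states.append(x.copy())
            return np.array(states)

        # Toy task: one-step-ahead prediction of a noisy sine (stand-in for chaotic data).
        t = np.arange(2000)
        u = np.sin(0.2 * t) + 0.05 * rng.normal(size=t.size)
        X, y = run_reservoir(u[:-1]), u[1:]

        # Ridge-regression readout: the only trained part of a reservoir computer.
        lam = 1e-6
        W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
        print("train NMSE:", np.mean((X @ W_out - y) ** 2) / np.var(y))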

  1. Computational Bench Testing to Evaluate the Short-Term Mechanical Performance of a Polymeric Stent.

    Science.gov (United States)

    Bobel, A C; Petisco, S; Sarasua, J R; Wang, W; McHugh, P E

    2015-12-01

    Over the last decade, there has been a significant volume of research focussed on the utilization of biodegradable polymers such as poly-L-lactide-acid (PLLA) for applications associated with cardiovascular disease. More specifically, there has been an emphasis on upgrading current clinical shortfalls experienced with conventional bare metal stents and drug eluting stents. One such approach, the adaption of fully formed polymeric stents has led to a small number of products being commercialized. Unfortunately, these products are still in their market infancy, meaning there is a clear non-occurrence of long term data which can support their mechanical performance in vivo. Moreover, the load carry capacity and other mechanical properties essential to a fully optimized polymeric stent are difficult, timely and costly to establish. With the aim of compiling rapid and representative performance data for specific stent geometries, materials and designs, in addition to reducing experimental timeframes, Computational bench testing via finite element analysis (FEA) offers itself as a very powerful tool. On this basis, the research presented in this paper is concentrated on the finite element simulation of the mechanical performance of PLLA, which is a fully biodegradable polymer, in the stent application, using a non-linear viscous material model. Three physical stent geometries, typically used for fully polymeric stents, are selected, and a comparative study is performed in relation to their short-term mechanical performance, with the aid of experimental data. From the simulated output results, an informed understanding can be established in relation to radial strength, flexibility and longitudinal resistance, that can be compared with conventional permanent metal stent functionality, and the results show that it is indeed possible to generate a PLLA stent with comparable and sufficient mechanical performance. The paper also demonstrates the attractiveness of FEA as a tool

  2. Plasticity of Neuron-Glial Transmission: Equipping Glia for Long-Term Integration of Network Activity

    Directory of Open Access Journals (Sweden)

    Wayne Croft

    2015-01-01

Full Text Available The capacity of synaptic networks to express activity-dependent changes in strength and connectivity is essential for learning and memory processes. In recent years, glial cells (most notably astrocytes) have been recognized as active participants in the modulation of synaptic transmission and synaptic plasticity, implicating these electrically nonexcitable cells in information processing in the brain. While the concept of bidirectional communication between neurons and glia and the mechanisms by which gliotransmission can modulate neuronal function are well established, less attention has been focussed on the computational potential of neuron-glial transmission itself. In particular, whether neuron-glial transmission is itself subject to activity-dependent plasticity and what the computational properties of such plasticity might be has not been explored in detail. In this review, we summarize current examples of plasticity in neuron-glial transmission, in many brain regions and neurotransmitter pathways. We argue that induction of glial plasticity typically requires repetitive neuronal firing over long time periods (minutes-hours) rather than the short-lived, stereotyped trigger typical of canonical long-term potentiation. We speculate that this equips glia with a mechanism for monitoring average firing rates in the synaptic network, which is suited to the longer term roles proposed for astrocytes in neurophysiology.

  3. 2.1 Man: subject of protection

    International Nuclear Information System (INIS)

    2004-01-01

This second chapter 'Man and environment' of the 7th state of the environment report of Austria describes the current situation of the protection of human health in terms of the European environmental policy and the main subjects of high relevance to it, such as air pollutants, water pollution, noise pollution, dangerous chemicals, food contamination, radiation protection, effects of climate change, plants, animals and habitats. (nevyjel)

  4. Short-Term Creep Behavior of CFRP-Reinforced Wood Composites Subjected to Cyclic Loading at Different Climate Conditions

    OpenAIRE

    Xiaojun Yang; Meng Gong; Ying Hei Chui

    2014-01-01

Carbon fiber reinforced plastic (CFRP) was used to adhesively reinforce Chinese fir (Cunninghamia lanceolata) wood specimens. This study examined the flexural static and creep performances of CFRP-reinforced wood composites that had been subjected to changes in moisture and stress levels. The major findings were as follows: 1) the cyclic creep was slightly lower for those specimens subjected to the cyclic stress condition than for those subjected to a constant stress level due to the deflecti...

  5. Perception of Undergraduates about Computer and Internet Ethics ...

    African Journals Online (AJOL)

Computers and the internet have brought innovative changes in education all over the world. In the universities of Pakistan, computer and IT related courses have recently been included as compulsory subjects in all disciplines at undergraduate level. Therefore, it was important to know the perceptual understanding and ...

  6. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  7. Quantification of the right ventricular wall using stress myocardial emission computed tomography with thallium-201 in normal subjects

    International Nuclear Information System (INIS)

    Akanabe, Hiroshi; Oshima, Motoo; Sakuma, Sadayuki; Yamamoto, Shuhei; Kawai, Naoki; Sotobata, Iwao

    1985-01-01

Although many studies of quantitative analysis of the left ventricular myocardial wall (LVMW) using stress thallium-201 (Tl-201) have been reported, few have estimated the right ventricular myocardial wall (RVMW). In this study we determined whether single photon emission computed tomography (SPECT) with Tl-201 could accurately define the normal range of the RVMW in normal subjects. Twelve persons who had neither valvular disease nor coronary artery disease were included in this study. The stress SPECT study was reconstructed to make short-axis images of the ventricles. The RVMW and LVMW were flagged manually. Each ventricle was divided into 36 segments at every 10 degrees. Relative activity counts in each ventricle were calculated as a percentage of the maximum counts in the left ventricle. The normal range of the RVMW with stress SPECT was as follows: anterior wall (33.2 ± 11.4% - 62.7 ± 18.4%; mean ± 2 standard deviations), free wall (30.1 ± 12.4% - 38.5 ± 8.8%), inferior wall (40.4 ± 7.8% - 60.0 ± 21.4%), septal wall (65.2 ± 17.2% - 71.1 ± 14.2%). Based on the above results, SPECT with Tl-201 can accurately define the normal range of the RVMW, and this method is useful for quantifying the degree of ischemia and hypertrophy in the RVMW. (author)

  8. Blink rate, incomplete blinks and computer vision syndrome.

    Science.gov (United States)

    Portello, Joan K; Rosenfield, Mark; Chu, Christina A

    2013-05-01

    Computer vision syndrome (CVS), a highly prevalent condition, is frequently associated with dry eye disorders. Furthermore, a reduced blink rate has been observed during computer use. The present study examined whether post task ocular and visual symptoms are associated with either a decreased blink rate or a higher prevalence of incomplete blinks. An additional trial tested whether increasing the blink rate would reduce CVS symptoms. Subjects (N = 21) were required to perform a continuous 15-minute reading task on a desktop computer at a viewing distance of 50 cm. Subjects were videotaped during the task to determine their blink rate and amplitude. Immediately after the task, subjects completed a questionnaire regarding ocular symptoms experienced during the trial. In a second session, the blink rate was increased by means of an audible tone that sounded every 4 seconds, with subjects being instructed to blink on hearing the tone. The mean blink rate during the task without the audible tone was 11.6 blinks per minute (SD, 7.84). The percentage of blinks deemed incomplete for each subject ranged from 0.9 to 56.5%, with a mean of 16.1% (SD, 15.7). A significant positive correlation was observed between the total symptom score and the percentage of incomplete blinks during the task (p = 0.002). Furthermore, a significant negative correlation was noted between the blink score and symptoms (p = 0.035). Increasing the mean blink rate to 23.5 blinks per minute by means of the audible tone did not produce a significant change in the symptom score. Whereas CVS symptoms are associated with a reduced blink rate, the completeness of the blink may be equally significant. Because instructing a patient to increase his or her blink rate may be ineffective or impractical, actions to achieve complete corneal coverage during blinking may be more helpful in alleviating symptoms during computer operation.

  9. Green and sustainable computing pt.I

    CERN Document Server

    Hurson, Ali

    2012-01-01

Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m...

  10. Decrements in Intrinsic Motivation among Rewarded and Observer Subjects.

    Science.gov (United States)

    Morgan, Mark

    1983-01-01

    Two studies examined the extent to which overjustification effects in five- and ten-year-old subjects can be explained in terms of expectations deriving from the offer of a reward by the experimenter. (Author/MP)

  11. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo

  12. Biomechanical effects of mobile computer location in a vehicle cab.

    Science.gov (United States)

    Saginus, Kyle A; Marklin, Richard W; Seeley, Patricia; Simoneau, Guy G; Freier, Stephen

    2011-10-01

    The objective of this research is to determine the best location to place a conventional mobile computer supported by a commercially available mount in a light truck cab. U.S. and Canadian electric utility companies are in the process of integrating mobile computers into their fleet vehicle cabs. There are no publications on the effect of mobile computer location in a vehicle cab on biomechanical loading, performance, and subjective assessment. The authors tested four locations of mobile computers in a light truck cab in a laboratory study to determine how location affected muscle activity of the lower back and shoulders; joint angles of the shoulders, elbows, and wrist; user performance; and subjective assessment. A total of 22 participants were tested in this study. Placing the mobile computer closer to the steering wheel reduced low back and shoulder muscle activity. Joint angles of the shoulders, elbows, and wrists were also closer to neutral angle. Biomechanical modeling revealed substantially less spinal compression and trunk muscle force. In general, there were no practical differences in performance between the locations. Subjective assessment indicated that users preferred the mobile computer to be as close as possible to the steering wheel. Locating the mobile computer close to the steering wheel reduces risk of injuries, such as low back pain and shoulder tendonitis. Results from the study can guide electric utility companies in the installation of mobile computers into vehicle cabs. Results may also be generalized to other industries that use trucklike vehicles, such as construction.

  13. The basics of cloud computing understanding the fundamentals of cloud computing in theory and practice

    CERN Document Server

    Rountree, Derrick

    2013-01-01

As part of the Syngress Basics series, The Basics of Cloud Computing provides readers with an overview of the cloud and how to implement cloud computing in their organizations. Cloud computing continues to grow in popularity, and while many people hear the term and use it in conversation, many are confused by it or unaware of what it really means. This book helps readers understand what the cloud is and how to work with it, even if it isn't a part of their day-to-day responsibility. Authors Derrick Rountree and Ileana Castrillo explain the concepts of cloud computing in prac

  14. Subjective health complaints in relation to sickness absence

    NARCIS (Netherlands)

    Roelen, Corne A. M.; Koopmans, Petra C.; Groothoff, Johan W.

    2010-01-01

    Objective: The Dutch population is healthy in terms of living and working conditions, but the levels of subjective health complaints (SHC) and sickness absence are high in the Dutch workforce. Are SHC related to sickness absence? Participants: The study population included the personnel of four

  15. Multiaxis, Lightweight, Computer-Controlled Exercise System

    Science.gov (United States)

    Haynes, Leonard; Bachrach, Benjamin; Harvey, William

    2006-01-01

The multipurpose, multiaxial, isokinetic dynamometer (MMID) is a computer-controlled system of exercise machinery that can serve as a means for quantitatively assessing a subject's muscle coordination, range of motion, strength, and overall physical condition with respect to a wide variety of forces, motions, and exercise regimens. The MMID is easily reconfigurable and compactly stowable and, in comparison with prior computer-controlled exercise systems, it weighs less, costs less, and offers more capabilities. Whereas a typical prior isokinetic exercise machine is limited to operation in only one plane, the MMID can operate along any path. In addition, the MMID is not limited to the isokinetic (constant-speed) mode of operation. The MMID provides for control and/or measurement of position, force, and/or speed of exertion in as many as six degrees of freedom simultaneously; hence, it can accommodate more complex, more nearly natural combinations of motions and, in so doing, offers greater capabilities for physical conditioning and evaluation. The MMID (see figure) includes as many as eight active modules, each of which can be anchored to a floor, wall, ceiling, or other fixed object. A cable is paid out from a reel in each module to a bar or other suitable object that is gripped and manipulated by the subject. The reel is driven by a DC brushless motor or other suitable electric motor via a gear reduction unit. The motor can be made to function as either a driver or an electromagnetic brake, depending on the required nature of the interaction with the subject. The module includes a force and a displacement sensor for real-time monitoring of the tension in and displacement of the cable, respectively. In response to commands from a control computer, the motor can be operated to generate a required tension in the cable, to displace the cable a required distance, or to reel the cable in or out at a required speed. The computer can be programmed, either locally or via

  16. Subjective versus objective risk in genetic counseling for hereditary breast and/or ovarian cancers

    Directory of Open Access Journals (Sweden)

    Sperduti Isabella

    2009-12-01

Full Text Available Abstract Background Despite the fact that genetic counseling in oncology provides information regarding objective risks, a contrast can be found between the subjective and the objective risk. The aims of this study were to evaluate the accuracy of the perceived risk compared to the objective risk estimated by the BRCApro computer model and to evaluate any associations between medical, demographic and psychological variables and the accuracy of risk perception. Methods 130 subjects were given a medical-demographic file, the Cancer and Genetic Risk Perception, and the Hospital Anxiety-Depression Scale. An objective evaluation of the risk was also computed with the BRCApro model. Results The subjective risk was significantly higher than the objective risk. The risk of tumour was overestimated by 56%, and the genetic risk by 67%. The subjects with fewer cancer-affected relatives significantly overestimated their risk of being mutation carriers and made a more inaccurate estimation than high-risk subjects. Conclusion The description of this sample shows: general overestimation of the risk, inaccurate perception compared to the BRCApro calculation, and a more accurate estimation in those subjects with more cancer-affected relatives (high-risk subjects). No correlation was found between the levels of perception of risk and anxiety and depression. Based on our findings, it is worth pursuing improved communication strategies about the actual cancer and genetic risk, especially for subjects at "intermediate and slightly increased risk" of developing a hereditary breast and/or ovarian cancer or of being a mutation carrier.

  17. Portable Brain-Computer Interface for the Intensive Care Unit Patient Communication Using Subject-Dependent SSVEP Identification.

    Science.gov (United States)

    Dehzangi, Omid; Farooq, Muhamed

    2018-01-01

    A major predicament for Intensive Care Unit (ICU) patients is inconsistent and ineffective communication means. Patients rated most communication sessions as difficult and unsuccessful. This, in turn, can cause distress, unrecognized pain, anxiety, and fear. As such, we designed a portable BCI system for ICU communications (BCI4ICU) optimized to operate effectively in an ICU environment. The system utilizes a wearable EEG cap coupled with an Android app designed on a mobile device that serves as visual stimuli and data processing module. Furthermore, to overcome the challenges that BCI systems face today in real-world scenarios, we propose a novel subject-specific Gaussian Mixture Model- (GMM-) based training and adaptation algorithm. First, we incorporate subject-specific information in the training phase of the SSVEP identification model using GMM-based training and adaptation. We evaluate subject-specific models against other subjects. Subsequently, from the GMM discriminative scores, we generate the transformed vectors, which are passed to our predictive model. Finally, the adapted mixture mean scores of the subject-specific GMMs are utilized to generate the high-dimensional supervectors. Our experimental results demonstrate that the proposed system achieved 98.7% average identification accuracy, which is promising in order to provide effective and consistent communication for patients in the intensive care.
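
    The subject-specific GMM scoring step can be pictured with scikit-learn: fit one mixture per SSVEP target on a subject's own feature vectors and assign new epochs to the class with the highest log-likelihood. This is only a simplified stand-in for the paper's training/adaptation and supervector pipeline; the feature dimensions and data below are invented.

        # Hedged sketch: one GMM per SSVEP target, classification by log-likelihood.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        n_classes, n_feat = 4, 8          # e.g. 4 flicker frequencies, 8 spectral features

        # Fake subject-specific training features, one cluster per target frequency.
        train = {c: rng.normal(loc=c, scale=0.5, size=(60, n_feat)) for c in range(n_classes)}

        models = {c: GaussianMixture(n_components=2, covariance_type="diag",
                                     random_state=0).fit(X) for c, X in train.items()}

        def classify(epoch_features):
            scores = {c: gmm.score(epoch_features.reshape(1, -1)) for c, gmm in models.items()}
            return max(scores, key=scores.get)   # class with the highest log-likelihood

        test_epoch = rng.normal(loc=2, scale=0.5, size=n_feat)
        print("predicted target:", classify(test_epoch))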

  18. Make Computer Learning Stick.

    Science.gov (United States)

    Casella, Vicki

    1985-01-01

    Teachers are using computer programs in conjunction with many classroom staples such as art supplies, math manipulatives, and science reference books. Twelve software programs and related activities are described which teach visual and auditory memory and spatial relations, as well as subject areas such as anatomy and geography. (MT)

  19. Practical quantum computing on encrypted data

    OpenAIRE

    Marshall, Kevin; Jacobsen, Christian S.; Schafermeier, Clemens; Gehring, Tobias; Weedbrook, Christian; Andersen, Ulrik L.

    2016-01-01

    The ability to perform computations on encrypted data is a powerful tool for protecting a client's privacy, especially in today's era of cloud and distributed computing. In terms of privacy, the best solutions that classical techniques can achieve are unfortunately not unconditionally secure in the sense that they are dependent on a hacker's computational power. Here we theoretically investigate, and experimentally demonstrate with Gaussian displacement and squeezing operations, a quantum sol...

  20. Computing NLTE Opacities -- Node Level Parallel Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Holladay, Daniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-09-11

Presentation. The goal: to produce a robust library capable of computing reasonably accurate opacities in-line, with the assumption of LTE relaxed (non-LTE). Near term: demonstrate acceleration of non-LTE opacity computation. Far term (if funded): connect to application codes with in-line capability and compute opacities. Study science problems. Use efficient algorithms that expose many levels of parallelism and utilize good memory access patterns for use on advanced architectures. Portability to multiple types of hardware including multicore processors, manycore processors such as KNL, GPUs, etc. Easily coupled to radiation hydrodynamics and thermal radiative transfer codes.

  1. Brain computer interface for operating a robot

    Science.gov (United States)

    Nisar, Humaira; Balasubramaniam, Hari Chand; Malik, Aamir Saeed

    2013-10-01

A Brain-Computer Interface (BCI) is a hardware/software based system that translates the Electroencephalogram (EEG) signals produced by brain activity to control computers and other external devices. In this paper, we will present a non-invasive BCI system that reads the EEG signals from a trained brain activity using a neuro-signal acquisition headset and translates them into computer readable form, to control the motion of a robot. The robot performs the actions that are instructed to it in real time. We have used cognitive states such as Push and Pull to control the motion of the robot. The sensitivity and specificity of the system are above 90 percent. Subjective results show a mixed trend in the difficulty level of the training activities. The quantitative EEG data analysis complements the subjective results. This technology may become very useful for the rehabilitation of disabled and elderly people.

  2. Short term and medium term power distribution load forecasting by neural networks

    International Nuclear Information System (INIS)

    Yalcinoz, T.; Eminoglu, U.

    2005-01-01

Load forecasting is an important subject for power distribution systems and has been studied from different points of view. In general, load forecasts should be performed over a broad spectrum of time intervals, which can be classified into short term, medium term and long term forecasts. Several research groups have proposed various techniques for short term, medium term or long term load forecasting. This paper presents a neural network (NN) model for short term peak load forecasting, short term total load forecasting and medium term monthly load forecasting in power distribution systems. The NN is used to learn the relationships among past, current and future temperatures and loads. The neural network was trained to recognize the peak load of the day, the total load of the day and the monthly electricity consumption. The suitability of the proposed approach is illustrated through an application to real load shapes from the Turkish Electricity Distribution Corporation (TEDAS) in Nigde. The data represent the daily and monthly electricity consumption in Nigde, Turkey.
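
    A minimal version of such a forecaster can be sketched with a small multilayer perceptron that maps the last few daily peak loads plus a temperature value to the next day's peak load. The synthetic series below merely stands in for the TEDAS consumption records, and the feature choice is one plausible setup rather than the paper's exact model.

        # Hedged sketch of a neural-network peak-load forecaster on synthetic data.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(42)
        days = 400
        temp = 20 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
        load = 100 + 3 * np.abs(temp - 18) + rng.normal(0, 4, days)   # toy temperature-driven load

        # Features: last 7 peak loads plus the day's temperature; target: that day's peak load.
        X = np.array([np.r_[load[i - 7:i], temp[i]] for i in range(7, days)])
        y = load[7:days]

        split = 300
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        model.fit(X[:split], y[:split])
        mape = np.mean(np.abs(model.predict(X[split:]) - y[split:]) / y[split:]) * 100
        print(f"test MAPE: {mape:.1f}%")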

  3. Quantum simulations with noisy quantum computers

    Science.gov (United States)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.

  4. 1992 CERN school of computing

    International Nuclear Information System (INIS)

    Verkerk, C.

    1993-01-01

These Proceedings contain written accounts of most of the lectures given at the 1992 CERN School of Computing, covering a variety of topics. A number of aspects of parallel and of distributed computing were treated in five lecture series: 'Status of parallel computing', 'An introduction to the APE100 computer', 'Introduction to distributed systems', 'Interprocess communication' and 'SHIFT, heterogeneous workstation services at CERN'. Triggering and data acquisition for future colliders were covered in: 'Neural networks for trigger' and 'Architecture for future data acquisition systems'. Analysis of experiments was treated in two series of lectures: 'Off-line software in HEP: Experience and trends', and 'Is there a future for event display?'. Design techniques were the subject of lectures on: 'Computer-aided design of electronics', 'CADD, computer-aided detector design' and 'Software design, the methods and the tools'. The other lectures reproduced here treated various fields: 'Second generation expert systems', 'Multidatabase in health care systems', 'Multimedia networks, what is new?', 'Pandora: An experimental distributed multimedia system', 'Benchmarking computers for HEP', 'Experience with some early computers' and 'Turing and ACE; lessons from a 1946 computer design'. (orig.)

  5. Computational and experimental investigation of free vibration and flutter of bridge decks

    Science.gov (United States)

    Helgedagsrud, Tore A.; Bazilevs, Yuri; Mathisen, Kjell M.; Øiseth, Ole A.

    2018-06-01

    A modified rigid-object formulation is developed, and employed as part of the fluid-object interaction modeling framework from Akkerman et al. (J Appl Mech 79(1):010905, 2012. https://doi.org/10.1115/1.4005072) to simulate free vibration and flutter of long-span bridges subjected to strong winds. To validate the numerical methodology, companion wind tunnel experiments have been conducted. The results show that the computational framework captures very precisely the aeroelastic behavior in terms of aerodynamic stiffness, damping and flutter characteristics. Considering its relative simplicity and accuracy, we conclude from our study that the proposed free-vibration simulation technique is a valuable tool in engineering design of long-span bridges.

  6. Timeliness of Creative Subjects in Architecture Education

    Science.gov (United States)

    Vargot, T.

    2017-11-01

The following article is about the problem of an insufficient number of drawing and painting lessons delivered in the process of architectural education. A comparison is made between the education of successful architects of the past and of modern times. The author argues for the importance of creative subjects as an essential part of the development and education of future architects. Skills achieved during the study of creative subjects will be used not only as a means of self-expression but as an instrument in the toolkit of a professional. Sergei Tchoban was taken as an example of a successful architect for whom the knowledge of hand-made drawing is very important. He arranges architectural drawing contests for students, promoting creative development in this way. Nowadays, students tend to use computer programs to make architectural projects, losing their individual approach. The creative process becomes a matter of scissors and paste, being just a copy of something that already exists. The solution to the problem is the reconsideration of the department's curriculum and the addition of extra hours for creative subjects.

  7. Subjective Experiences of Space and Time: Self, Sensation, and Phenomenal Time

    OpenAIRE

    Ram Lakhan Pandey Vimal

    2008-01-01

    The investigation of subjective experiences (SEs) of space and time is at the core of consciousness research. The term ‘space’ includes the subject and objects. The SE of subject, I-ness, is defined as ‘Self’. The SEs of objects, subject’s external body, and subject’s internal states such as feelings, thoughts, and so on can be investigated using the proto-experience (PE)-SE framework. The SE of time is defined as ‘phenomenal time’ (...

  8. On the long-term analysis with finite elements

    International Nuclear Information System (INIS)

    Argyris, J.H.; Szimmat, J.; Willam, K.J.

    1975-01-01

Following a presentation of concrete creep, a brief summary of the direct and incremental calculation methods for long-term behaviour is given. This is followed by a survey of the method of internal state variables, which on the one hand gives a uniform framework for the various formulations of concrete creep, and on the other hand leads to a computer-ready calculation process. Two examples of long-term behaviour illustrate the areas of application of the computer methods. (orig./LH) [de]

  9. Elevated-temperature benchmark tests of simply supported beams and circular plates subjected to time-varying loadings

    International Nuclear Information System (INIS)

    Corum, J.M.; Richardson, M.; Clinard, J.A.

    1977-01-01

This report presents the measured elastic-plastic-creep responses of eight simply supported type 304 stainless steel beams and circular plates that were subjected to time-varying loadings at elevated temperature. The tests were performed to provide experimental benchmark problem data suitable for assessing inelastic analysis methods and for validating computer programs. Beams and plates exhibit the essential features of inelastic structural behavior; yet they are relatively simple and the experimental results are generally easy to interpret. The stress fields are largely uniaxial in beams, while multiaxial effects are introduced in plates. The specimens tested were laterally loaded at the center and subjected to either a prescribed load or a center deflection history. The specimens were machined from a common well-characterized heat of material, and all the tests were performed at a temperature of 593°C (1100°F). Test results are presented in terms of the load and center deflection behaviors, which typify the overall structural behavior. Additional deflection data, as well as strain gage results and mechanical properties data for the beam and plate material, are provided in the appendices

  10. Term Satisfiability in FLew-Algebras

    Czech Academy of Sciences Publication Activity Database

    Haniková, Zuzana; Savický, Petr

    2016-01-01

    Roč. 631, 6 June (2016), s. 1-15 ISSN 0304-3975 R&D Projects: GA ČR GBP202/12/G061 Institutional support: RVO:67985807 Keywords : substructural logic * FLew-algebra * MV-algebra * satisfiability * computational complexity Subject RIV: BA - General Mathematics Impact factor: 0.698, year: 2016

  11. Evaluation of subjective image quality in relation to diagnostic task for cone beam computed tomography with different fields of view.

    Science.gov (United States)

    Lofthag-Hansen, Sara; Thilander-Klang, Anne; Gröndahl, Kerstin

    2011-11-01

To evaluate subjective image quality for two diagnostic tasks, periapical diagnosis and implant planning, for cone beam computed tomography (CBCT) using different exposure parameters and fields of view (FOVs). Examinations were performed in the posterior part of the jaws on a skull phantom with 3D Accuitomo (FOV 3 cm×4 cm) and 3D Accuitomo FPD (FOVs 4 cm×4 cm and 6 cm×6 cm). All combinations of 60, 65, 70, 75, 80 kV and 2, 4, 6, 8, 10 mA with a rotation of 180° and 360° were used. The dose-area product (DAP) value was determined for each combination. The images were presented, displaying the object in axial, cross-sectional and sagittal views, without scanning data, in a random order for each FOV and jaw. Seven observers assessed image quality on a six-point rating scale. Intra-observer agreement was good (κw=0.76) and inter-observer agreement moderate (κw=0.52). Stepwise logistic regression showed kV, mA and diagnostic task to be the most important variables. Periapical diagnosis, regardless of jaw, required higher exposure parameters compared to implant planning. Implant planning in the lower jaw required higher exposure parameters compared to the upper jaw. Overall ranking of FOVs gave 4 cm×4 cm, 6 cm×6 cm followed by 3 cm×4 cm. This study has shown that exposure parameters should be adjusted according to diagnostic task. For this particular CBCT brand a rotation of 180° gave good subjective image quality, hence a substantial dose reduction can be achieved without loss of diagnostic information. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Evaluation of subjective image quality in relation to diagnostic task for cone beam computed tomography with different fields of view

    International Nuclear Information System (INIS)

    Lofthag-Hansen, Sara; Thilander-Klang, Anne; Groendahl, Kerstin

    2011-01-01

Aims: To evaluate subjective image quality for two diagnostic tasks, periapical diagnosis and implant planning, for cone beam computed tomography (CBCT) using different exposure parameters and fields of view (FOVs). Materials and methods: Examinations were performed in the posterior part of the jaws on a skull phantom with 3D Accuitomo (FOV 3 cm x 4 cm) and 3D Accuitomo FPD (FOVs 4 cm x 4 cm and 6 cm x 6 cm). All combinations of 60, 65, 70, 75, 80 kV and 2, 4, 6, 8, 10 mA with a rotation of 180° and 360° were used. The dose-area product (DAP) value was determined for each combination. The images were presented, displaying the object in axial, cross-sectional and sagittal views, without scanning data, in a random order for each FOV and jaw. Seven observers assessed image quality on a six-point rating scale. Results: Intra-observer agreement was good (κw = 0.76) and inter-observer agreement moderate (κw = 0.52). Stepwise logistic regression showed kV, mA and diagnostic task to be the most important variables. Periapical diagnosis, regardless of jaw, required higher exposure parameters compared to implant planning. Implant planning in the lower jaw required higher exposure parameters compared to the upper jaw. Overall ranking of FOVs gave 4 cm x 4 cm, 6 cm x 6 cm followed by 3 cm x 4 cm. Conclusions: This study has shown that exposure parameters should be adjusted according to diagnostic task. For this particular CBCT brand a rotation of 180° gave good subjective image quality, hence a substantial dose reduction can be achieved without loss of diagnostic information.

  13. Short Term Cyber Attacks with Long Term Effects and Degradation of Supply Chain Capability

    Science.gov (United States)

    2016-09-01

...term risks in a network supply chain to establish the existence of black swan events. Subject terms: cybersecurity, supply chain risk...

  14. Variance stabilization for computing and comparing grand mean waveforms in MEG and EEG.

    Science.gov (United States)

    Matysiak, Artur; Kordecki, Wojciech; Sielużycki, Cezary; Zacharias, Norman; Heil, Peter; König, Reinhard

    2013-07-01

    Grand means of time-varying signals (waveforms) across subjects in magnetoencephalography (MEG) and electroencephalography (EEG) are commonly computed as arithmetic averages and compared between conditions, for example, by subtraction. However, the prerequisite for these operations, homogeneity of the variance of the waveforms in time, and for most common parametric statistical tests also between conditions, is rarely met. We suggest that the heteroscedasticity observed instead results because waveforms may differ by factors and additive terms and follow a mixed model. We propose to apply the asinh-transformation to stabilize the variance in such cases. We demonstrate the homogeneous variance and the normal distributions of data achieved by this transformation using simulated waveforms, and we apply it to real MEG data and show its benefits. The asinh-transformation is thus an essential and useful processing step prior to computing and comparing grand mean waveforms in MEG and EEG. Copyright © 2013 Society for Psychophysiological Research.
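
    A minimal sketch of the suggested pipeline, on simulated "mixed model" waveforms (a subject-specific multiplicative factor plus additive noise), shows how applying the asinh transform before averaging makes the across-subject variance far more uniform over time. The amplitudes, gains and noise level below are invented for illustration only.

        # Variance stabilization of simulated across-subject waveforms via asinh.
        import numpy as np

        rng = np.random.default_rng(0)
        n_subjects, n_times = 20, 500
        t = np.linspace(0, 0.5, n_times)
        shape = np.exp(-((t - 0.1) / 0.03) ** 2)                 # evoked component, peak = 1

        # Mixed model: waveforms differ by a multiplicative factor and an additive noise term.
        gain = rng.lognormal(mean=2.0, sigma=0.4, size=(n_subjects, 1))
        waveforms = gain * shape + 0.3 * rng.normal(size=(n_subjects, n_times))

        transformed = np.arcsinh(waveforms)                      # variance-stabilizing transform
        grand_mean = transformed.mean(axis=0)                    # grand mean in transformed units

        # Across-subject variance is much more homogeneous over time after the transform.
        for name, data in [("raw", waveforms), ("asinh", transformed)]:
            v = data.var(axis=0)
            print(f"{name:5s} variance ratio (max/min): {v.max() / v.min():.1f}")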

  15. THEORETICAL AND EXPERIMENTAL STUDY OF STRUCTURES SUBJECTED TO EARTHQUAKES

    Energy Technology Data Exchange (ETDEWEB)

    Soubirou, A.

    1967-12-31

The object of the study was the investigation of the behaviour of structures subject to earthquakes. After describing and analysing seismic movements, useful concepts for earthquake-proofing structures are introduced. Then, the dynamic behaviour of systems with n degrees of freedom was studied in order to develop the theoretical computation of seismic behaviour, a typical application being reticulated structures. The next stage was showing the computational procedure for seismic spectra and the natural frequencies of buildings, an attempt being made to define earthquake-proofing criteria for a special type of reinforced-concrete construction. The last matter dealt with is the elastoplastic behaviour of structures, a study of growing importance.
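
    The computation of natural frequencies for an n-degree-of-freedom structure mentioned above reduces to the generalized eigenvalue problem K·φ = ω²·M·φ. The sketch below solves it for a hypothetical five-storey shear-building model; the storey masses and stiffnesses are invented for illustration.

        # Natural frequencies of a hypothetical 5-storey shear building.
        import numpy as np
        from scipy.linalg import eigh

        n = 5
        m = 2.0e5 * np.ones(n)                 # storey masses (kg)
        k = 1.5e8 * np.ones(n)                 # inter-storey stiffnesses (N/m)

        M = np.diag(m)
        K = np.zeros((n, n))
        for i in range(n):                     # assemble the tridiagonal stiffness matrix
            K[i, i] += k[i]
            if i + 1 < n:
                K[i, i] += k[i + 1]
                K[i, i + 1] = K[i + 1, i] = -k[i + 1]

        w2, modes = eigh(K, M)                 # generalized symmetric eigenproblem K·phi = w^2·M·phi
        freqs_hz = np.sqrt(w2) / (2 * np.pi)
        print("natural frequencies (Hz):", np.round(freqs_hz, 2))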

  16. The Cardiovascular Function Profile and Physical Fitness in Overweight Subjects

    Science.gov (United States)

    Megawati, E. R.; Lubis, L. D.; Harahap, F. Y.

    2017-03-01

Obesity in children and young adults is associated with cardiovascular risk in the short term and long term. The aim of this study was to describe the profile of the cardiovascular function parameters and physical fitness in overweight subjects. This is an analytical observational study with a cross-sectional approach. The samples of this study were 85 randomly selected subjects aged 18 to 24 years, from normoweight up to a body mass index <40. The parameters measured were body mass index (BMI), waist circumference (WC), waist-hip ratio (WHR), cardiovascular function parameters (resting pulse, blood pressure, and peak flow meter) and physical fitness parameters (VO2max with the McArdle step test). The mean BMI was 24.53 ± 4.929. The mean WC and WHR were 86.7 ± 14.10 cm and 0.89 ± 0.073, respectively. The mean resting pulse was higher in normoweight subjects (p = 0.0209). The mean systole was lower in normoweight subjects (p = 0.0026). There were no differences in VO2max between groups (p = 0.3888). The peak flow meter reading was higher in normoweight subjects (p = 0.0274). The results of this study indicate that heart rate, systole and peak flow meter are significantly different between groups. The heart rate and the peak flow meter readings in the overweight subjects were lower, while the systole blood pressure was higher, compared to normoweight subjects.

  17. Ultrasonic nonlinearity of AISI316 austenitic steel subjected to long-term isothermal aging

    Energy Technology Data Exchange (ETDEWEB)

    Gong, Won Sik; Kim, Chung Seok [Dept. of Materials Science and Engineering, Chosun University, Gwangju (Korea, Republic of)

    2014-06-15

This study presents the ultrasonic nonlinearity of AISI316 austenitic stainless steels subjected to long-term isothermal aging. These steels are attractive materials for industrial mechanical structures because of their strength at high temperatures and their chemical stability. The test materials were subjected to accelerated heat treatment in an electrical furnace for predetermined aging durations. The variations in ultrasonic nonlinearity and microstructural damage were carefully evaluated through observation of the microstructure. The ultrasonic nonlinearity dropped steeply after aging for up to 1000 h and then decreased monotonically. The polygonal shape of the initial grain structure changed to circular, in particular as the annealing twins in the grains dissolved and disappeared. The delta ferrite on the grain boundaries could no longer be observed at 1000 h of aging, having continuously transformed into sigma phase. Consequently, in the initial aging period, the rapid decrease in ultrasonic nonlinearity was caused by the annihilation of voids, dislocations, and twins. The continuous monotonic decrease in ultrasonic nonlinearity after the first drop resulted from the generation of Cr{sub 23}C{sub 6} precipitates and σ phases.

  18. Finite element electromagnetic field computation on the Sequent Symmetry 81 parallel computer

    International Nuclear Information System (INIS)

    Ratnajeevan, S.; Hoole, H.

    1990-01-01

Finite element field analysis algorithms lend themselves to parallelization, and this fact is exploited in this paper to implement a finite element analysis program for electromagnetic field computation on the Sequent Symmetry 81 parallel computer with three processors. In terms of waiting time, the maximum gains are to be made in matrix solution, and therefore this paper concentrates on the gains from parallelizing the solution part of finite element analysis. An outline of how parallelization could be exploited in most finite element operations is given, although the actual implementation of parallelism on the Sequent Symmetry 81 parallel computer was in the sparsity computation, matrix assembly, and matrix solution areas. In all cases, the algorithms were modified to suit the parallel programming application rather than allowing the compiler to parallelize the existing algorithms.
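
    The paper's implementation targeted the Sequent Symmetry 81; purely as an illustrative sketch of the general idea of splitting finite element work across a few processors, the toy Python example below (hypothetical 1-D bar elements, not the paper's electromagnetic formulation) computes element matrices in parallel workers, assembles them serially, and solves the reduced system.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        N_ELEM = 100           # hypothetical 1-D bar mesh
        N_NODES = N_ELEM + 1

        def element_matrix(e):
            # 2x2 stiffness matrix of a unit bar element, returned with its element index
            return e, np.array([[1.0, -1.0], [-1.0, 1.0]])

        if __name__ == "__main__":
            # distribute element-matrix computation over three workers,
            # echoing the three processors of the Symmetry 81
            with ProcessPoolExecutor(max_workers=3) as pool:
                element_mats = list(pool.map(element_matrix, range(N_ELEM)))

            K = np.zeros((N_NODES, N_NODES))
            for e, k in element_mats:              # serial assembly of the global matrix
                K[e:e + 2, e:e + 2] += k

            f = np.zeros(N_NODES); f[-1] = 1.0     # unit load at the free end
            u = np.linalg.solve(K[1:, 1:], f[1:])  # fix node 0, solve the reduced system
            print(u[-1])                            # tip displacement of the bar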

  19. Polyakov's quantized string with boundary terms

    International Nuclear Information System (INIS)

    Durhuus, B.; Olesen, P.; Petersen, J.L.

    1981-11-01

    The authors compute the boundary terms needed in Polyakov's method for calculating averages of functionals defined on surfaces. The method used is due to Seeley, who found recursive relations yielding the boundary terms. These relations are solved for a general second order elliptic differential operator. This solution is then applied to Polyakov's problem. (Auth.)

  20. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  1. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  2. 32 CFR 268.9 - Discussion of terms.

    Science.gov (United States)

    2010-07-01

    ... arrearages of principal and interest. They shall be computed at the reporting rate prescribed by Treasury... principal payments or interest on short-term and long-term loans and credits. (b) Arrearage delinquency... or convention between sovereign states. (d) Dollar equivalents of foreign currency. Represents dollar...

  3. Graphic recording of heart sounds in high-altitude native subjects

    OpenAIRE

    Rotta, Andrés; Ascenzo C., Jorge

    2014-01-01

The phonocardiogram series obtained from normal subjects show that it is not always possible to record the atrial and third heart sounds, with different authors reporting diverse registration rates. The reason why the graphic registration of these sounds fails in largely normal individuals has not yet been explained in concrete terms, but various influencing factors have been suggested, such as age, the determinants of the sounds, the transmission conditions of the chest wall, the sensitivity of the recording apparatus, etc.

  4. Marijuana effects on long-term memory assessment and retrieval.

    Science.gov (United States)

    Darley, C F; Tinklenberg, J R; Roth, W T; Vernon, S; Kopell, B S

    1977-05-09

    The ability of 16 college-educated male subjects to recall from long-term memory a series of common facts was tested during intoxication with marijuana extract calibrated to 0.3 mg/kg delta-9-tetrahydrocannabinol and during placebo conditions. The subjects' ability to assess their memory capabilities was then determined by measuring how certain they were about the accuracy of their recall performance and by having them predict their performance on a subsequent recognition test involving the same recall items. Marijuana had no effect on recall or recognition performance. These results do not support the view that marijuana provides access to facts in long-term storage which are inaccessible during non-intoxication. During both marijuana and placebo conditions, subjects could accurately predict their recognition memory performance. Hence, marijuana did not alter the subjects' ability to accurately assess what information resides in long-term memory even though they did not have complete access to that information.

  5. Computational Thinking in K-9 Education

    NARCIS (Netherlands)

    Mannila, Linda; Dagiene, Valentina; Demo, Barbara; Grgurina, Natasa; Mirolo, Claudio; Rolandsson, Lennart; Settle, Amber

    2014-01-01

    In this report we consider the current status of the coverage of computer science in education at the lowest levels of education in multiple countries. Our focus is on computational thinking (CT), a term meant to encompass a set of concepts and thought processes that aid in formulating problems and

  6. Cervical vertebral and dental maturity in Turkish subjects.

    Science.gov (United States)

    Başaran, Güvenç; Ozer, Törün; Hamamci, Nihal

    2007-04-01

    The aim of this study was to investigate the relationships between the stages of calcification of teeth and the cervical vertebral maturity stages in Turkish subjects. A retrospective cross-sectional study was designed. The final study population consisted of 590 Turkish subjects. Statistical analysis of the data was performed with computer software. Spearman rank order correlation coefficients were used to assess the relationship between cervical vertebral and dental maturation. For a better understanding of the relationship between cervical vertebral maturation indexes and dental age, percentage distributions of the studied teeth were also calculated. Strict correlations were found between dental and cervical vertebral maturation of Turkish subjects. For males, the sequence from lowest to the highest was third molar, central incisor, canine, first premolar, second premolar, first molar, and second molar. For females, the sequence from lowest to the highest was third molar, canine, second premolar, first premolar, central incisor, first molar, and second molar. Dental maturation stages can be used as a reliable indicator of facial growth.

  7. Determination of regional cerebral blood flow curves and parameters by computed γ camera

    International Nuclear Information System (INIS)

    Zhu Guohong

    1988-01-01

Regional CBF curves and parameters were determined in 236 subjects by a Sigma 438/MCS 560 computed γ camera. Each subject was given {sup 99m}TcO{sub 4}{sup -} (370 MBq) intravenously. Four CBF curves and three parameters were derived by the computer. The results from 39 normal subjects, 22 patients with cerebral embolism, 53 patients with cerebrovascular sclerosis, 56 patients with diseases of the cervical vertebrae, 10 patients with concussion and 5 patients with cerebral arteritis were analyzed

  8. Short-Term Intra-Subject Variation in Exhaled Volatile Organic Compounds (VOCs) in COPD Patients and Healthy Controls and Its Effect on Disease Classification

    Directory of Open Access Journals (Sweden)

    Christopher Phillips

    2014-05-01

Full Text Available Exhaled volatile organic compounds (VOCs) are of interest for their potential to diagnose disease non-invasively. However, most breath VOC studies have analyzed single breath samples from an individual and assumed them to be wholly representative of the person. This provided the motivation for an investigation of the variability of breath profiles when three breath samples are taken over a short time period (two-minute intervals between samples) for 118 stable patients with Chronic Obstructive Pulmonary Disease (COPD) and 63 healthy controls and analyzed by gas chromatography-mass spectrometry (GC/MS). The extent of the variation in VOC levels differed between COPD and healthy subjects, and the patterns of variation differed for isoprene versus the bulk of other VOCs. In addition, machine learning approaches were applied to the breath data to establish whether these samples differed in their ability to discriminate COPD from healthy states and whether aggregation of multiple samples into single data sets could offer improved discrimination. The three breath samples gave similar classification accuracy to one another when evaluated separately (66.5% to 68.3% of subjects classified correctly, depending on the breath repetition used). Combining multiple breath samples into single data sets gave better discrimination (73.4% of subjects classified correctly). Although this accuracy is not sufficient for COPD diagnosis in a clinical setting, enhanced sampling and analysis may improve it further. Variability in samples, and short-term effects of practice or exertion, need to be considered in any breath testing program to improve reliability and optimize discrimination.
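
    To make the aggregation idea concrete, the sketch below (entirely synthetic data, with a random forest standing in for the unspecified machine learning approaches of the abstract) compares cross-validated classification accuracy using one breath per subject against all three breaths concatenated into a single feature vector.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n_subjects, n_vocs, n_reps = 181, 20, 3      # 118 COPD + 63 controls, 20 hypothetical VOCs
        y = np.array([1] * 118 + [0] * 63)

        # synthetic per-breath VOC levels: three repeats per subject with within-subject noise
        subject_means = rng.normal(0, 1, (n_subjects, n_vocs)) + 0.3 * y[:, None]
        breaths = subject_means[:, None, :] + rng.normal(0, 0.5, (n_subjects, n_reps, n_vocs))

        single = breaths[:, 0, :]                    # one breath per subject
        combined = breaths.reshape(n_subjects, -1)   # all three breaths concatenated

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("single breath :", cross_val_score(clf, single, y, cv=5).mean())
        print("three breaths :", cross_val_score(clf, combined, y, cv=5).mean())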

  9. Methods and apparatuses for information analysis on shared and distributed computing systems

    Science.gov (United States)

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
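
    A minimal single-machine sketch of the pattern described in this abstract, assuming documents are plain strings and using simple word counts as the term statistics: each worker process computes a local set of term statistics for its distinct document set, the local sets are then contributed to a global set, and a crude "major term set" is taken from the global statistics.

        from collections import Counter
        from concurrent.futures import ProcessPoolExecutor

        def local_term_stats(documents):
            # Term statistics for one distinct set of documents (one process's share)
            counts = Counter()
            for doc in documents:
                counts.update(doc.lower().split())
            return counts

        def merge_global(local_sets):
            # Contribute local term statistics to a single global set
            global_counts = Counter()
            for local in local_sets:
                global_counts.update(local)
            return global_counts

        if __name__ == "__main__":
            corpus = [["the cat sat", "the dog ran"], ["a cat ran", "the cat slept"]]  # two document sets
            with ProcessPoolExecutor(max_workers=2) as pool:
                local_sets = list(pool.map(local_term_stats, corpus))
            global_stats = merge_global(local_sets)
            major_terms = [t for t, c in global_stats.most_common(3)]  # crude "major term set"
            print(global_stats, major_terms)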

  10. Meaningless terms in rewriting

    NARCIS (Netherlands)

    Kennaway, R.; Oostrom, V. van; Vries, F.-J. de

    We present an axiomatic approach to the concept of meaninglessness in finite and transfinite term rewriting and lambda calculus. We justify our axioms in several ways. They can be intuitively justified from the viewpoint of rewriting as computation. They are shown to imply important properties

  11. Analysis of iterative region-of-interest image reconstruction for x-ray computed tomography

    Science.gov (United States)

    Sidky, Emil Y.; Kraemer, David N.; Roth, Erin G.; Ullberg, Christer; Reiser, Ingrid S.; Pan, Xiaochuan

    2014-01-01

    Abstract. One of the challenges for iterative image reconstruction (IIR) is that such algorithms solve an imaging model implicitly, requiring a complete representation of the scanned subject within the viewing domain of the scanner. This requirement can place a prohibitively high computational burden for IIR applied to x-ray computed tomography (CT), especially when high-resolution tomographic volumes are required. In this work, we aim to develop an IIR algorithm for direct region-of-interest (ROI) image reconstruction. The proposed class of IIR algorithms is based on an optimization problem that incorporates a data fidelity term, which compares a derivative of the estimated data with the available projection data. In order to characterize this optimization problem, we apply it to computer-simulated two-dimensional fan-beam CT data, using both ideal noiseless data and realistic data containing a level of noise comparable to that of the breast CT application. The proposed method is demonstrated for both complete field-of-view and ROI imaging. To demonstrate the potential utility of the proposed ROI imaging method, it is applied to actual CT scanner data. PMID:25685824
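
    Purely as an illustration of the kind of objective sketched in the abstract (the derivative operator, the non-negativity constraint, and the total-variation penalty below are assumptions, not the authors' exact formulation), such a data-derivative fidelity problem can be written as

        \hat{f} = \operatorname*{arg\,min}_{f \ge 0} \; \tfrac{1}{2} \bigl\lVert \partial_u (X f) - \partial_u\, g \bigr\rVert_2^2 + \lambda \lVert f \rVert_{\mathrm{TV}},

    where X is the scanner's projection operator, g the measured fan-beam projection data, \partial_u a derivative taken along the detector coordinate, and \lambda \ge 0 the weight of an optional regularizer supporting ROI-only reconstruction.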

  12. Statistics of the Von Mises Stress Response For Structures Subjected To Random Excitations

    Directory of Open Access Journals (Sweden)

    Mu-Tsang Chen

    1998-01-01

Full Text Available Finite element-based random vibration analysis is increasingly used in computer aided engineering software for computing statistics (e.g., root-mean-square values) of structural responses such as displacements, stresses and strains. However, these statistics can often be computed only for Cartesian responses. For the design of metal structures, a failure criterion based on an equivalent stress response, commonly known as the von Mises stress, is more appropriate and often used. This paper presents an approach for computing the statistics of the von Mises stress response for structures subjected to random excitations. Random vibration analysis is first performed to compute covariance matrices of Cartesian stress responses. Monte Carlo simulation is then used to perform scatter and failure analyses using the von Mises stress response.
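
    A small NumPy sketch of the second stage described above, with a hypothetical 3x3 covariance matrix standing in for the output of the random vibration analysis: Cartesian stress samples are drawn by Monte Carlo, the plane-stress von Mises stress is evaluated per sample, and scatter and failure statistics are estimated.

        import numpy as np

        # Hypothetical covariance matrix of the zero-mean Cartesian stress responses
        # (sx, sy, txy) at one node, as produced by a random vibration analysis [MPa^2]
        cov = np.array([[400.0, 120.0,  30.0],
                        [120.0, 250.0,  20.0],
                        [ 30.0,  20.0,  90.0]])

        rng = np.random.default_rng(42)
        s = rng.multivariate_normal(np.zeros(3), cov, size=100_000)  # Monte Carlo samples
        sx, sy, txy = s.T

        # Plane-stress von Mises equivalent stress for each sample
        vm = np.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)

        rms_vm = np.sqrt(np.mean(vm**2))    # RMS of the von Mises response
        p_fail = np.mean(vm > 50.0)          # probability of exceeding an assumed 50 MPa limit
        print(rms_vm, p_fail)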

  13. Short-term efficacy of calcium fructoborate on subjects with knee discomfort: a comparative, double-blind, placebo-controlled clinical study

    Directory of Open Access Journals (Sweden)

    Pietrzkowski Z

    2014-06-01

Full Text Available Zbigniew Pietrzkowski,1 Michael J Phelan,2 Robert Keller,3 Cynthia Shu,1 Ruby Argumedo,1 Tania Reyes-Izquierdo1 (1FutureCeuticals, Inc., Applied BioClinical Laboratory; 2Department of Statistics, School of Information and Computer Science, University of California at Irvine; 3NutraClinical Inc., Irvine, CA, USA). Abstract: Calcium fructoborate (CFB) at a dose of 110 mg twice per day was previously reported to improve knee discomfort during the first 14 days of treatment. In this study, 60 participants with self-reported knee discomfort were randomized into two groups receiving CFB or placebo. Initial levels of knee discomfort were evaluated by Western Ontario and McMaster Universities Arthritis Index (WOMAC) and McGill Pain Questionnaire (MPQ) scores at the beginning of the study and also at 7 and 14 days after treatment. Results showed that supplementation with CFB significantly improved knee discomfort in the study subjects; significant reductions of mean within-subject change in WOMAC and MPQ scores were observed for the CFB group compared to the placebo group at both 7 and 14 days after treatment. Estimated treatment differences for the MPQ score were -5.8 (P=0.0009) and -8.9 (P<0.0001) at Days 7 and 14, respectively. Estimated differences for the WOMAC score were -5.3 (P=0.06) and -13.73 (P<0.0001) at Days 7 and 14, respectively. Negative values indicate greater reductions in reported discomfort. On both Day 7 and Day 14, the trend was toward greater improvement in the CFB group. The placebo group did not exhibit any change in the WOMAC and MPQ scores. In conclusion, supplementation with 110 mg CFB twice per day was associated with improved knee discomfort during the 2 weeks of intake. Keywords: CFB, joint discomfort, WOMAC score, McGill pain score

  14. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters involved, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have found several applications, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final-year undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  15. Legal Aspects of Brain-Computer Interfaces

    Czech Academy of Sciences Publication Activity Database

    Krausová, Alžběta

    2014-01-01

Roč. 8, č. 2 (2014). ISSN 1802-5951. Institutional support: RVO:68378122. Keywords: brain-computer interface * human rights * right to privacy. Subject RIV: AG - Legal Sciences. http://mujlt.law.muni.cz/index.php

  16. Isolation of the ocular surface to treat dysfunctional tear syndrome associated with computer use.

    Science.gov (United States)

    Yee, Richard W; Sperling, Harry G; Kattek, Ashballa; Paukert, Martin T; Dawson, Kevin; Garcia, Marcie; Hilsenbeck, Susan

    2007-10-01

Dysfunctional tear syndrome (DTS) associated with computer use is characterized by mild irritation, itching, redness, and intermittent tearing after extended staring. It frequently involves a foreign-body or sandy sensation, blurring of vision, and fatigue, worsening especially at the end of the day. We undertook a study to determine the effectiveness of periocular isolation using microenvironment glasses (MEGS) alone and in combination with artificial tears in alleviating the symptoms and signs of dry eye related to computer use. At the same time, we evaluated the relative ability of a battery of clinical tests for dry eye to distinguish dry eyes from normal eyes in heavy computer users. Forty adult subjects who used computers 3 hours or more per day were divided into dry eye sufferers and controls based on their scores on the Ocular Surface Disease Index (OSDI). Baseline scores were recorded and ocular surface assessments were made. On four subsequent visits, the subjects played a computer game for 30 minutes in a controlled environment, during which one of four treatment conditions was applied, in random order, to each subject: 1) no treatment, 2) artificial tears, 3) MEGS, and 4) artificial tears combined with MEGS. Immediately after each session, subjects were tested on: a subjective comfort questionnaire, tear breakup time (TBUT), fluorescein staining, lissamine green staining, and conjunctival injection. In this study, a significant correlation was found between cumulative lifetime computer use and ocular surface disorder, as measured by the standardized OSDI index. The experimental and control subjects were significantly different (P < 0.05). Isolation of the ocular surface alone produced significant improvements in comfort scores and TBUT and a consistent trend of improvement in fluorescein staining and lissamine green staining. Isolation plus tears produced a significant improvement in lissamine green staining. The subjective comfort inventory and the TBUT

  17. Proceedings of seventh symposium on sharing of computer programs and technology in nuclear medicine, computer assisted data processing

    International Nuclear Information System (INIS)

    Howard, B.Y.; McClain, W.J.; Landay, M.

    1977-01-01

    The Council on Computers (CC) of the Society of Nuclear Medicine (SNM) annually publishes the Proceedings of its Symposium on the Sharing of Computer Programs and Technology in Nuclear Medicine. This is the seventh such volume and has been organized by topic, with the exception of the invited papers and the discussion following them. An index arranged by author and by subject is included

  18. Proceedings of seventh symposium on sharing of computer programs and technology in nuclear medicine, computer assisted data processing

    Energy Technology Data Exchange (ETDEWEB)

    Howard, B.Y.; McClain, W.J.; Landay, M. (comps.)

    1977-01-01

    The Council on Computers (CC) of the Society of Nuclear Medicine (SNM) annually publishes the Proceedings of its Symposium on the Sharing of Computer Programs and Technology in Nuclear Medicine. This is the seventh such volume and has been organized by topic, with the exception of the invited papers and the discussion following them. An index arranged by author and by subject is included.

  19. Sensibility and Subjectivity: Levinas’ Traumatic Subject

    Directory of Open Access Journals (Sweden)

    Rashmika Pandya

    2011-02-01

Full Text Available The importance of Levinas’ notions of sensibility and subjectivity are evident in the revision of phenomenological method by current phenomenologists such as Jean-Luc Marion and Michel Henry. The criticisms of key tenets of classical phenomenology, intentionality and reduction, are of particular note. However, there are problems with Levinas’ characterization of subjectivity as essentially sensible. In “Totality and Infinity” and “Otherwise than Being”, Levinas criticizes and recasts a traditional notion of subjectivity, particularly the notion of the subject as the first and foremost rational subject. The subject in Levinas’ works is characterized more by its sensibility and affectedness than by its capacity to reason or affect its world. Levinas ties rationality to economy and suggests an alternative notion of reason that leads to his analysis of the ethical relation as the face-to-face encounter. The ‘origin’ of the social relation is located not in our capacity to know but rather in a sensibility that is diametrically opposed to reason understood as economy. I argue that the opposition in Levinas’ thought between reason and sensibility is problematic and essentially leads to a self-conflicted subject. In fact, it would seem that violence characterizes the subject’s self-relation and, thus, is also inscribed at the base of the social relation. Rather than overcoming a problematic tendency to dualistic thought in philosophy Levinas merely reverses traditional hierarchies of reason/emotion, subject/object and self/other.

  20. [The current state of the brain-computer interface problem].

    Science.gov (United States)

    Shurkhay, V A; Aleksandrova, E V; Potapov, A A; Goryainov, S A

    2015-01-01

It was only 40 years ago that the first PC appeared. Over this period, rather short in historical terms, we have witnessed revolutionary changes in the lives of individuals and of the entire society. Computer technologies are tightly connected with every field, either directly or indirectly. We can currently claim that computers are vastly superior to the human mind in a number of respects; however, machines lack the key feature: they are incapable of independent thinking (like a human). However, the key to the successful development of humankind is collaboration between the brain and the computer rather than competition. Such collaboration, in which a computer broadens, supplements, or replaces some brain functions, is known as the brain-computer interface. Our review focuses on real-life implementation of this collaboration.

  1. Long-term stability of contour augmentation in the esthetic zone

    DEFF Research Database (Denmark)

    Jensen, Simon S; Bosshardt, Dieter D; Gruber, Reinhard

    2014-01-01

BACKGROUND: Contour augmentation around early-placed implants (Type 2 placement) using autogenous bone chips combined with deproteinized bovine bone mineral (DBBM) and a collagen barrier membrane has been documented to predictably provide esthetically satisfactory clinical outcomes. In addition, recent data from cone beam computed tomography studies have shown the augmented volume to be stable long-term. However, no human histologic data are available to document the tissue reactions to this bone augmentation procedure. METHODS: Over an 8-year period, 12 biopsies were harvested 14 to 80 months after implant placement with simultaneous contour augmentation in 10 patients. The biopsies were subjected to histologic and histomorphometric analysis. RESULTS: The biopsies consisted of 32.0% ± 9.6% DBBM particles and 40.6% ± 14.6% mature bone. 70.3% ± 14.5% of the DBBM particle surfaces were covered...

  2. Computer-Related Task Performance

    DEFF Research Database (Denmark)

    Longstreet, Phil; Xiao, Xiao; Sarker, Saonee

    2016-01-01

    The existing information system (IS) literature has acknowledged computer self-efficacy (CSE) as an important factor contributing to enhancements in computer-related task performance. However, the empirical results of CSE on performance have not always been consistent, and increasing an individual......'s CSE is often a cumbersome process. Thus, we introduce the theoretical concept of self-prophecy (SP) and examine how this social influence strategy can be used to improve computer-related task performance. Two experiments are conducted to examine the influence of SP on task performance. Results show...... that SP and CSE interact to influence performance. Implications are then discussed in terms of organizations’ ability to increase performance....

  3. Computational Methods in Stochastic Dynamics Volume 2

    CERN Document Server

    Stefanou, George; Papadopoulos, Vissarion

    2013-01-01

    The considerable influence of inherent uncertainties on structural behavior has led the engineering community to recognize the importance of a stochastic approach to structural problems. Issues related to uncertainty quantification and its influence on the reliability of the computational models are continuously gaining in significance. In particular, the problems of dynamic response analysis and reliability assessment of structures with uncertain system and excitation parameters have been the subject of continuous research over the last two decades as a result of the increasing availability of powerful computing resources and technology.   This book is a follow up of a previous book with the same subject (ISBN 978-90-481-9986-0) and focuses on advanced computational methods and software tools which can highly assist in tackling complex problems in stochastic dynamic/seismic analysis and design of structures. The selected chapters are authored by some of the most active scholars in their respective areas and...

  4. Computer-aided diagnosis in radiological imaging: current status and future challenges

    Science.gov (United States)

    Doi, Kunio

    2009-10-01

    Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. Many different types of CAD schemes are being developed for detection and/or characterization of various lesions in medical imaging, including conventional projection radiography, CT, MRI, and ultrasound imaging. Commercial systems for detection of breast lesions on mammograms have been developed and have received FDA approval for clinical use. CAD may be defined as a diagnosis made by a physician who takes into account the computer output as a "second opinion". The purpose of CAD is to improve the quality and productivity of physicians in their interpretation of radiologic images. The quality of their work can be improved in terms of the accuracy and consistency of their radiologic diagnoses. In addition, the productivity of radiologists is expected to be improved by a reduction in the time required for their image readings. The computer output is derived from quantitative analysis of radiologic images by use of various methods and techniques in computer vision, artificial intelligence, and artificial neural networks (ANNs). The computer output may indicate a number of important parameters, for example, the locations of potential lesions such as lung cancer and breast cancer, the likelihood of malignancy of detected lesions, and the likelihood of various diseases based on differential diagnosis in a given image and clinical parameters. In this review article, the basic concept of CAD is first defined, and the current status of CAD research is then described. In addition, the potential of CAD in the future is discussed and predicted.

  5. A person is not a number: discourse involvement in subject-verb agreement computation.

    Science.gov (United States)

    Mancini, Simona; Molinaro, Nicola; Rizzi, Luigi; Carreiras, Manuel

    2011-09-02

    Agreement is a very important mechanism for language processing. Mainstream psycholinguistic research on subject-verb agreement processing has emphasized the purely formal and encapsulated nature of this phenomenon, positing an equivalent access to person and number features. However, person and number are intrinsically different, because person conveys extra-syntactic information concerning the participants in the speech act. To test the person-number dissociation hypothesis we investigated the neural correlates of subject-verb agreement in Spanish, using person and number violations. While number agreement violations produced a left-anterior negativity followed by a P600 with a posterior distribution, the negativity elicited by person anomalies had a centro-posterior maximum and was followed by a P600 effect that was frontally distributed in the early phase and posteriorly distributed in the late phase. These data reveal that the parser is differentially sensitive to the two features and that it deals with the two anomalies by adopting different strategies, due to the different levels of analysis affected by the person and number violations. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Mental workload and cognitive task automaticity: an evaluation of subjective and time estimation metrics.

    Science.gov (United States)

    Liu, Y; Wickens, C D

    1994-11-01

    The evaluation of mental workload is becoming increasingly important in system design and analysis. The present study examined the structure and assessment of mental workload in performing decision and monitoring tasks by focusing on two mental workload measurements: subjective assessment and time estimation. The task required the assignment of a series of incoming customers to the shortest of three parallel service lines displayed on a computer monitor. The subject was either in charge of the customer assignment (manual mode) or was monitoring an automated system performing the same task (automatic mode). In both cases, the subjects were required to detect the non-optimal assignments that they or the computer had made. Time pressure was manipulated by the experimenter to create fast and slow conditions. The results revealed a multi-dimensional structure of mental workload and a multi-step process of subjective workload assessment. The results also indicated that subjective workload was more influenced by the subject's participatory mode than by the factor of task speed. The time estimation intervals produced while performing the decision and monitoring tasks had significantly greater length and larger variability than those produced while either performing no other tasks or performing a well practised customer assignment task. This result seemed to indicate that time estimation was sensitive to the presence of perceptual/cognitive demands, but not to response related activities to which behavioural automaticity has developed.

  7. A Combined Experimental and Computational Approach to Subject-Specific Analysis of Knee Joint Laxity

    Science.gov (United States)

    Harris, Michael D.; Cyr, Adam J.; Ali, Azhar A.; Fitzpatrick, Clare K.; Rullkoetter, Paul J.; Maletsky, Lorin P.; Shelburne, Kevin B.

    2016-01-01

Modeling complex knee biomechanics is a continual challenge, which has resulted in many models of varying levels of quality, complexity, and validation. Beyond modeling healthy knees, accurately mimicking pathologic knee mechanics, such as after cruciate rupture or meniscectomy, is difficult. Experimental tests of knee laxity can provide important information about ligament engagement and overall contributions to knee stability for development of subject-specific models to accurately simulate knee motion and loading. Our objective was to provide combined experimental tests and finite-element (FE) models of natural knee laxity that are subject-specific, have one-to-one experiment to model calibration, simulate ligament engagement in agreement with literature, and are adaptable for a variety of biomechanical investigations (e.g., cartilage contact, ligament strain, in vivo kinematics). Calibration involved perturbing ligament stiffness, initial ligament strain, and attachment location until model-predicted kinematics and ligament engagement matched experimental reports. Errors between model-predicted and experimental kinematics were small, and ligament engagement agreed with literature descriptions. These results demonstrate the ability of our constraint models to be customized for multiple individuals and simultaneously call attention to the need to verify that ligament engagement is in good general agreement with literature. To facilitate further investigations of subject-specific or population based knee joint biomechanics, data collected during the experimental and modeling phases of this study are available for download by the research community. PMID:27306137

  8. Short-term effect of topical antiglaucoma medication on tear-film stability, tear secretion, and corneal sensitivity in healthy subjects.

    Science.gov (United States)

    Terai, Naim; Müller-Holz, Matthias; Spoerl, Eberhard; Pillunat, Lutz E

    2011-01-01

    The purpose of this study was to investigate the short-term effect of topical antiglaucoma medication on tear-film stability, tear secretion, and corneal sensitivity in healthy subjects. In this prospective, double-blind crossover trial, break-up time and basal secretion (Jones test) were measured 60 minutes before, and 30, 60, and 90 minutes after topical antiglaucoma drop application in 30 healthy subjects. Corneal sensitivity was measured 60 minutes before, and five, 10, and 15 minutes after drop application using a Cochet-Bonnet esthesiometer. Reduction of break-up time in the latanoprost group was -23.8% after 30 minutes (P = 0.21), -26.7% after 60 minutes (P = 0.03) and -51.4% after 90 minutes (P ≤ 0.003), which was statistically significant. Reduction of break-up time in all other treatment groups was not statistically significant. The Jones test revealed a significant reduction of basal secretion after application of brimonidine (-17.8%, P = 0.002; -22.5%, P < 0.001; -30.5%, P < 0.001), followed by apraclonidine (-10%, P = 0.06; -20.1%, P = 0.02; -22.1%, P = 0.002), latanoprost (-2.4%, P = 0.64; -18.6%, P = 0.001; -20.1%, P = 0.001) and dorzolamide (-0.5%, P = 0.9; 14.3%, P = 0.018; -17.3%, P = 0.004) at 30, 60, and 90 minutes after drop application. Reduction of basal secretion in all other treatment groups was not statistically significant. Latanoprost showed the most statistically significant reduction in break-up time, and brimonidine showed the most significant reduction in basal secretion of all the glaucoma medications used in this study. In conclusion, our data may be helpful for treatment decisions in glaucoma patients who also suffer from ocular surface problems.

  9. Term breech delivery in The Netherlands

    NARCIS (Netherlands)

    Rietberg, C.C.

    2006-01-01

    The management of the term breech delivery has been a subject of discussion for many years. Only a few randomized trials had been performed on outcome in relation to the mode of delivery in case of breech position. In october 2000 the results of the Term Breech Trial (TBT) were published, in which

  10. Computer Game Play as an Imaginary Stage for Reading: Implicit Spatial Effects of Computer Games Embedded in Hard Copy Books

    Science.gov (United States)

    Smith, Glenn Gordon

    2012-01-01

    This study compared books with embedded computer games (via pentop computers with microdot paper and audio feedback) with regular books with maps, in terms of fifth graders' comprehension and retention of spatial details from stories. One group read a story in hard copy with embedded computer games, the other group read it in regular book format…

  11. Touchable Computing: Computing-Inspired Bio-Detection.

    Science.gov (United States)

    Chen, Yifan; Shi, Shaolong; Yao, Xin; Nakano, Tadashi

    2017-12-01

    We propose a new computing-inspired bio-detection framework called touchable computing (TouchComp). Under the rubric of TouchComp, the best solution is the cancer to be detected, the parameter space is the tissue region at high risk of malignancy, and the agents are the nanorobots loaded with contrast medium molecules for tracking purpose. Subsequently, the cancer detection procedure (CDP) can be interpreted from the computational optimization perspective: a population of externally steerable agents (i.e., nanorobots) locate the optimal solution (i.e., cancer) by moving through the parameter space (i.e., tissue under screening), whose landscape (i.e., a prescribed feature of tissue environment) may be altered by these agents but the location of the best solution remains unchanged. One can then infer the landscape by observing the movement of agents by applying the "seeing-is-sensing" principle. The term "touchable" emphasizes the framework's similarity to controlling by touching the screen with a finger, where the external field for controlling and tracking acts as the finger. Given this analogy, we aim to answer the following profound question: can we look to the fertile field of computational optimization algorithms for solutions to achieve effective cancer detection that are fast, accurate, and robust? Along this line of thought, we consider the classical particle swarm optimization (PSO) as an example and propose the PSO-inspired CDP, which differs from the standard PSO by taking into account realistic in vivo propagation and controlling of nanorobots. Finally, we present comprehensive numerical examples to demonstrate the effectiveness of the PSO-inspired CDP for different blood flow velocity profiles caused by tumor-induced angiogenesis. The proposed TouchComp bio-detection framework may be regarded as one form of natural computing that employs natural materials to compute.
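
    For orientation, the following is a sketch of the classical PSO loop the authors take as their starting point (not the modified PSO-inspired CDP, which additionally models in vivo propagation and external steering of the nanorobots); the 2-D landscape and all constants here are assumptions made only for illustration.

        import numpy as np

        def landscape(p):
            # hypothetical tissue feature with its minimum, playing the role of the lesion, at (3, -2)
            return np.sum((p - np.array([3.0, -2.0]))**2, axis=-1)

        rng = np.random.default_rng(7)
        n, dim, w, c1, c2 = 30, 2, 0.7, 1.5, 1.5     # swarm size and standard PSO constants
        x = rng.uniform(-10, 10, (n, dim))            # agent (nanorobot) positions
        v = np.zeros((n, dim))
        pbest, pbest_val = x.copy(), landscape(x)
        gbest = pbest[np.argmin(pbest_val)]

        for _ in range(100):
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
            x = x + v                                                    # position update
            val = landscape(x)
            better = val < pbest_val
            pbest[better], pbest_val[better] = x[better], val[better]    # personal bests
            gbest = pbest[np.argmin(pbest_val)]                          # global best

        print(gbest)   # should approach the lesion location (3, -2)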

  12. Mobile Learning According to Students of Computer Engineering and Computer Education: A Comparison of Attitudes

    Directory of Open Access Journals (Sweden)

    Deniz Mertkan GEZGIN

    2018-01-01

Full Text Available Mobile learning has started to play an increasingly significant role in improving learning outcomes in education. Successful and efficient implementation of m-learning in higher education, as at all educational levels, depends on users’ acceptance of this technology. This study focuses on investigating the attitudes of undergraduate students of the Computer Engineering (CENG) and Computer Education and Instructional Technology (CEIT) departments in a Turkish public university towards m-learning from three perspectives: gender, area of study, and mobile device ownership. Using a correlational survey method, a Mobile Learning Attitude Scale (MLAS) was administered to 531 students, analysis of which revealed a positive attitude to m-learning in general. A further investigation of the aforementioned three variables showed a more positive attitude for female students in terms of usability, for CEIT students in terms of advantages, usability and independence, and for those owning a mobile device in terms of usability. An important implication from the findings, among others, is supplementing the Computer Engineering curriculum with elective courses on the fundamentals of mobile learning, and/or the design and development of m-learning software, so as to create, in the long run, more specialized and complementary teams comprised of trained CENG and CEIT graduates in the m-learning sector.

  13. Energy information data base: subject thesaurus

    International Nuclear Information System (INIS)

    1979-10-01

    The technical staff of the DOE Technical Information Center, during its subject indexing activities, develops and structures a vocabulary that allows consistent machine storage and retrieval of information necessary to the accomplishment of the DOE mission. This thesaurus incorporates that structured vocabulary. The terminology of this thesaurus is used for the subject control of information announced in DOE Energy Research Abstracts, Energy Abstracts for Policy Analysis, Solar Energy Update, Geothermal Energy Update, Fossil Energy Update, Fusion Energy Update, and Energy Conservation Update. This terminology also facilitates subject searching of the DOE energy information data base, a research in progress data base, a general and practical energy information data base, power reactor docket information data base, nuclear science abstracts data base, and the federal energy information data base on the DOE on-line retrieval system, RECON. The rapid expansion of the DOE's activities will result in a concomitant thesaurus expansion as information relating to new activities is indexed. Only the terms used in the indexing of documents at the Technical Information Center to date are included

  14. A Representation-Theoretic Approach to Reversible Computation with Applications

    DEFF Research Database (Denmark)

    Maniotis, Andreas Milton

Reversible computing is a sub-discipline of computer science that helps to understand the foundations of the interplay between physics, algebra, and logic in the context of computation. Its subjects of study are computational devices and abstract models of computation that satisfy the constraint of information conservation. Such machine models, which are known as reversible models of computation, have been examined both from a theoretical perspective and from an engineering perspective. While a bundle of many isolated successful findings and applications concerning reversible computing exists, there is still no uniform and consistent theory that is general in the sense of giving a model-independent account to the field.

  15. Building a profile of subjective well-being for social media users.

    Science.gov (United States)

    Chen, Lushi; Gong, Tao; Kosinski, Michal; Stillwell, David; Davidson, Robert L

    2017-01-01

Subjective well-being includes 'affect' and 'satisfaction with life' (SWL). This study proposes a unified approach to construct a profile of subjective well-being based on social media language in Facebook status updates. We apply sentiment analysis to generate users' affect scores, and train a random forest model to predict SWL using affect scores and other language features of the status updates. Results show that: the computer-selected features resemble the key predictors of SWL identified in earlier studies; the machine-predicted SWL is moderately correlated with the self-reported SWL (r = 0.36); and the resulting subjective well-being profile can also reflect other psychological traits such as depression (r = 0.24), all derived from social media language.
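
    A schematic of the modeling step described above, using synthetic data in place of the Facebook corpus; the split into affect scores and other language features mirrors the abstract, while the feature counts, the synthetic relationship, and the cross-validation setup are assumptions for illustration only.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(3)
        n_users = 500
        affect = rng.normal(0, 1, (n_users, 2))        # e.g. positive / negative affect scores
        language = rng.normal(0, 1, (n_users, 10))     # other status-update language features
        X = np.hstack([affect, language])
        swl = 0.6 * affect[:, 0] - 0.4 * affect[:, 1] + rng.normal(0, 1, n_users)  # synthetic SWL

        model = RandomForestRegressor(n_estimators=300, random_state=0)
        pred = cross_val_predict(model, X, swl, cv=5)   # out-of-fold predictions of SWL
        print(np.corrcoef(pred, swl)[0, 1])             # analogue of the reported r = 0.36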

  16. Combined influence of media use on subjective health in elementary school children in Japan: a population-based study.

    Science.gov (United States)

    Nakamura, Harunobu; Ohara, Kumiko; Kouda, Katsuyasu; Fujita, Yuki; Mase, Tomoki; Miyawaki, Chiemi; Okita, Yoshimitsu; Ishikawa, Tetsuya

    2012-06-13

In recent years in Japan, electronic games, home computers, and the internet have assumed an important place in people's lives, even for elementary school children. Subjective health complaints have also become a problem among children. In the present study, we investigated the relationship between media use and health status in elementary school children in Japan. A cross-sectional school-based population survey was conducted in 2009 with a sample of fourth-, fifth-, and sixth-grade children (age range: 10-12 years old) in elementary schools in Japan (n = 3,464). Self-reported health, lifestyle habits, and time spent using media were assessed. The use of games, television, and personal computers was significantly associated with lifestyle (p < 0.05). The longer the time of media use beyond 1 hour was, the higher the odds ratio of the association of media use with unhealthy lifestyle and subjective health complaints. The plural use of these media had stronger associations with unhealthy lifestyle and subjective health complaints. Game, television, and personal-computer use were mutually associated, and the plural use of these media had stronger associations with unhealthy lifestyle and subjective health complaints. Excessive use of media might be a risk for unhealthy lifestyle and subjective health complaints.

  17. Canadian conference on electrical and computer engineering proceedings. Congres canadien en genie electrique et informatique

    Energy Technology Data Exchange (ETDEWEB)

    Bhargava, V K [ed.

    1993-01-01

    A conference was held on the subject of electrical and computer engineering. Papers were presented on the subjects of artificial intelligence, video, signal processing, radar, power electronics, neural networks, control, computer systems, transportation electronics, software tools, error control coding, electrothermal phenomena, performance evaluation of computer systems, wireless communication, satellite communication, very large scale integration, parallel processing, pattern recognition, telephony, graphs and algorithms, multimedia, broadcast systems, remote sensing, computer networks, modulation and coding, robotics, computer architecture, spread spectrum, image processing, microwave circuits, biomedical engineering, specification and verification, image restoration, communications networks, computer-aided design, drives, energy systems, expert systems, and optics. Separate abstracts have been prepared for 56 papers from the conference.

  18. Analyzing Subject Disciplines of Knowledge Originality and Knowledge Generality for Library & Information Science

    Directory of Open Access Journals (Sweden)

    Mu-Hsuan Huang

    2007-12-01

Full Text Available This study used bibliometric methods to analyze the subject disciplines of knowledge originality and knowledge generality for Library and Information Science (LIS) by using citing and cited documents from 1997 to 2006. We found that the major subject discipline of both knowledge originality and knowledge generality is still LIS, and that computer science and LIS interact and influence each other closely. It is evident that the number of subject disciplines of knowledge originality is higher than that of knowledge generality. The interdisciplinary characteristics of LIS are illustrated by the variety of areas of knowledge originality and knowledge generality. Because the number of received subject disciplines is higher than that of given subject disciplines, it is suggested that LIS is an application-oriented research area. [Article content in Chinese]

  19. 5 CFR 890.101 - Definitions; time computations.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Definitions; time computations. 890.101....101 Definitions; time computations. (a) In this part, the terms annuitant, carrier, employee, employee... in section 8901 of title 5, United States Code, and supplement the following definitions: Appropriate...

  20. Computer Series, 38.

    Science.gov (United States)

    Moore, John W., Ed.

    1983-01-01

Discusses the numerical solution of the one-dimensional Schrodinger equation. A PASCAL computer program for the Apple II which performs the calculations is available from the authors. Also discusses quantization and perturbation theory using microcomputers, indicating the benefits of adding a perturbation term to the harmonic oscillator as an…
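
    The original program was written in PASCAL for the Apple II; as a present-day stand-in (not the authors' code), this short NumPy sketch solves the dimensionless one-dimensional Schrodinger equation by finite-difference diagonalization and shows the effect of adding a quartic perturbation term to the harmonic oscillator. The grid size, box length, and perturbation strength are arbitrary choices.

        import numpy as np

        # Solve -(1/2) psi'' + V(x) psi = E psi on a uniform grid (hbar = m = omega = 1)
        n, L, lam = 1000, 10.0, 0.1
        x = np.linspace(-L, L, n)
        dx = x[1] - x[0]

        def lowest_energies(V, k=4):
            main = 1.0 / dx**2 + V                    # diagonal: kinetic term plus potential
            off = -0.5 / dx**2 * np.ones(n - 1)       # off-diagonal part of the kinetic term
            H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
            return np.linalg.eigvalsh(H)[:k]

        print(lowest_energies(0.5 * x**2))            # ~0.5, 1.5, 2.5, 3.5 (harmonic oscillator)
        print(lowest_energies(0.5 * x**2 + lam * x**4))  # levels shifted by the perturbation term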